Mar 14 08:27:43 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 14 08:27:43 crc restorecon[4679]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 14 08:27:43 crc restorecon[4679]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 14 08:27:43 crc restorecon[4679]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc 
restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:27:43 crc restorecon[4679]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:27:43 crc restorecon[4679]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:27:43 crc restorecon[4679]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:27:43 crc 
restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 
08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 08:27:43 crc restorecon[4679]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:27:43 crc 
restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 14 08:27:43 crc restorecon[4679]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 08:27:43 crc restorecon[4679]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 08:27:43 crc restorecon[4679]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 08:27:43 crc 
restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 14 08:27:43 crc restorecon[4679]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:43 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44
crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 14 08:27:44 crc restorecon[4679]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 
crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc 
restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc 
restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc 
restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc 
restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 14 08:27:44 crc restorecon[4679]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 14 08:27:44 crc restorecon[4679]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 14 08:27:44 crc restorecon[4679]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Mar 14 08:27:45 crc kubenswrapper[4886]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 14 08:27:45 crc kubenswrapper[4886]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 14 08:27:45 crc kubenswrapper[4886]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 14 08:27:45 crc kubenswrapper[4886]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 14 08:27:45 crc kubenswrapper[4886]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 14 08:27:45 crc kubenswrapper[4886]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.150158    4886 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.155998    4886 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156035    4886 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156047    4886 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156057    4886 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156066    4886 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156075    4886 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156091    4886 feature_gate.go:330] unrecognized feature gate: Example
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156101    4886 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156110    4886 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156123    4886 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156162    4886 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156170    4886 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156178    4886 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156186    4886 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156194    4886 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156203    4886 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156211    4886 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156219    4886 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156227    4886 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156234    4886 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156245    4886 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156255    4886 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156265    4886 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156275    4886 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156284    4886 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156292    4886 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156300    4886 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156308    4886 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156316    4886 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156326    4886 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156367    4886 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156376    4886 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156384    4886 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156392    4886 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156400    4886 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156408    4886 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156415    4886 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156424    4886 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156432    4886 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156440    4886 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156449    4886 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156461    4886 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156471    4886 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156481    4886 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156489    4886 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156497    4886 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156505    4886 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156513    4886 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156522    4886 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156530    4886 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156537    4886 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156544    4886 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156552    4886 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156559    4886 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156567    4886 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156579    4886 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true.
It will be removed in a future release. Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156588 4886 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156597 4886 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156606 4886 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156614 4886 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156623 4886 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156631 4886 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156639 4886 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156648 4886 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156658 4886 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156666 4886 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156676 4886 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156686 4886 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156696 4886 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156706 4886 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.156715 4886 
feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.156882 4886 flags.go:64] FLAG: --address="0.0.0.0" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.156900 4886 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.156915 4886 flags.go:64] FLAG: --anonymous-auth="true" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.156927 4886 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.156938 4886 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.156949 4886 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.156961 4886 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.156972 4886 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.156981 4886 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.156990 4886 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157000 4886 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157010 4886 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157019 4886 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157028 4886 flags.go:64] FLAG: --cgroup-root="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157037 4886 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157046 4886 flags.go:64] 
FLAG: --client-ca-file="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157055 4886 flags.go:64] FLAG: --cloud-config="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157063 4886 flags.go:64] FLAG: --cloud-provider="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157072 4886 flags.go:64] FLAG: --cluster-dns="[]" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157083 4886 flags.go:64] FLAG: --cluster-domain="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157091 4886 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157101 4886 flags.go:64] FLAG: --config-dir="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157109 4886 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157126 4886 flags.go:64] FLAG: --container-log-max-files="5" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157171 4886 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157183 4886 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157195 4886 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157207 4886 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157218 4886 flags.go:64] FLAG: --contention-profiling="false" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157229 4886 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157241 4886 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157251 4886 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157265 4886 flags.go:64] 
FLAG: --cpu-manager-policy-options="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157278 4886 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157290 4886 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157301 4886 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157314 4886 flags.go:64] FLAG: --enable-load-reader="false" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157326 4886 flags.go:64] FLAG: --enable-server="true" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157337 4886 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157355 4886 flags.go:64] FLAG: --event-burst="100" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157366 4886 flags.go:64] FLAG: --event-qps="50" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157378 4886 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157388 4886 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157399 4886 flags.go:64] FLAG: --eviction-hard="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157415 4886 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157426 4886 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157438 4886 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157451 4886 flags.go:64] FLAG: --eviction-soft="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157463 4886 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157474 4886 
flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157485 4886 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157494 4886 flags.go:64] FLAG: --experimental-mounter-path="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157504 4886 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157515 4886 flags.go:64] FLAG: --fail-swap-on="true" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157527 4886 flags.go:64] FLAG: --feature-gates="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157541 4886 flags.go:64] FLAG: --file-check-frequency="20s" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157553 4886 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157566 4886 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157579 4886 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157592 4886 flags.go:64] FLAG: --healthz-port="10248" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157604 4886 flags.go:64] FLAG: --help="false" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157617 4886 flags.go:64] FLAG: --hostname-override="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157628 4886 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157641 4886 flags.go:64] FLAG: --http-check-frequency="20s" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157654 4886 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157666 4886 flags.go:64] FLAG: --image-credential-provider-config="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157677 4886 
flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157689 4886 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157700 4886 flags.go:64] FLAG: --image-service-endpoint="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157712 4886 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157722 4886 flags.go:64] FLAG: --kube-api-burst="100" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157734 4886 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157746 4886 flags.go:64] FLAG: --kube-api-qps="50" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157756 4886 flags.go:64] FLAG: --kube-reserved="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157769 4886 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157779 4886 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157790 4886 flags.go:64] FLAG: --kubelet-cgroups="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157804 4886 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157816 4886 flags.go:64] FLAG: --lock-file="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157826 4886 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157838 4886 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157849 4886 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157867 4886 flags.go:64] FLAG: --log-json-split-stream="false" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157879 4886 
flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157890 4886 flags.go:64] FLAG: --log-text-split-stream="false" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157901 4886 flags.go:64] FLAG: --logging-format="text" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157914 4886 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157927 4886 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157939 4886 flags.go:64] FLAG: --manifest-url="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157950 4886 flags.go:64] FLAG: --manifest-url-header="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157965 4886 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157977 4886 flags.go:64] FLAG: --max-open-files="1000000" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.157991 4886 flags.go:64] FLAG: --max-pods="110" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158003 4886 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158014 4886 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158025 4886 flags.go:64] FLAG: --memory-manager-policy="None" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158038 4886 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158050 4886 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158062 4886 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158074 4886 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158101 4886 flags.go:64] FLAG: --node-status-max-images="50" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158112 4886 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158162 4886 flags.go:64] FLAG: --oom-score-adj="-999" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158176 4886 flags.go:64] FLAG: --pod-cidr="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158187 4886 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158207 4886 flags.go:64] FLAG: --pod-manifest-path="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158219 4886 flags.go:64] FLAG: --pod-max-pids="-1" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158231 4886 flags.go:64] FLAG: --pods-per-core="0" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158242 4886 flags.go:64] FLAG: --port="10250" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158254 4886 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158265 4886 flags.go:64] FLAG: --provider-id="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158276 4886 flags.go:64] FLAG: --qos-reserved="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158290 4886 flags.go:64] FLAG: --read-only-port="10255" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158303 4886 flags.go:64] FLAG: --register-node="true" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158315 4886 flags.go:64] FLAG: --register-schedulable="true" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158326 4886 flags.go:64] FLAG: 
--register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158345 4886 flags.go:64] FLAG: --registry-burst="10" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158357 4886 flags.go:64] FLAG: --registry-qps="5" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158369 4886 flags.go:64] FLAG: --reserved-cpus="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158380 4886 flags.go:64] FLAG: --reserved-memory="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158395 4886 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158406 4886 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158418 4886 flags.go:64] FLAG: --rotate-certificates="false" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158429 4886 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158440 4886 flags.go:64] FLAG: --runonce="false" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158451 4886 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158463 4886 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158476 4886 flags.go:64] FLAG: --seccomp-default="false" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158488 4886 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158499 4886 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158512 4886 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158523 4886 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158535 
4886 flags.go:64] FLAG: --storage-driver-password="root" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158547 4886 flags.go:64] FLAG: --storage-driver-secure="false" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158558 4886 flags.go:64] FLAG: --storage-driver-table="stats" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158569 4886 flags.go:64] FLAG: --storage-driver-user="root" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158580 4886 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158592 4886 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158603 4886 flags.go:64] FLAG: --system-cgroups="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158615 4886 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158634 4886 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158645 4886 flags.go:64] FLAG: --tls-cert-file="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158656 4886 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158670 4886 flags.go:64] FLAG: --tls-min-version="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158682 4886 flags.go:64] FLAG: --tls-private-key-file="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158693 4886 flags.go:64] FLAG: --topology-manager-policy="none" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158705 4886 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158716 4886 flags.go:64] FLAG: --topology-manager-scope="container" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158728 4886 flags.go:64] FLAG: --v="2" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158747 4886 
flags.go:64] FLAG: --version="false" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158763 4886 flags.go:64] FLAG: --vmodule="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158778 4886 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.158790 4886 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159070 4886 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159087 4886 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159101 4886 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159114 4886 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159124 4886 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159173 4886 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159184 4886 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159195 4886 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159205 4886 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159215 4886 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159225 4886 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 14 
08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159235 4886 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159246 4886 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159255 4886 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159265 4886 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159274 4886 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159285 4886 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159294 4886 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159304 4886 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159314 4886 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159323 4886 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159333 4886 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159342 4886 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159350 4886 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159357 4886 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159375 4886 
feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159385 4886 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159395 4886 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159403 4886 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159411 4886 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159419 4886 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159427 4886 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159437 4886 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159445 4886 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159452 4886 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159460 4886 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159468 4886 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159476 4886 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159484 4886 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159492 4886 feature_gate.go:330] unrecognized feature gate: 
SetEIPForNLBIngressController
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159499 4886 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159507 4886 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159515 4886 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159522 4886 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159530 4886 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159538 4886 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159545 4886 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159553 4886 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159561 4886 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159569 4886 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159577 4886 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159585 4886 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159593 4886 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159604 4886 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159615 4886 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159625 4886 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159635 4886 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159647 4886 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159656 4886 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159665 4886 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159673 4886 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159681 4886 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159689 4886 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159696 4886 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159704 4886 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159712 4886 feature_gate.go:330] unrecognized feature gate: Example
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159719 4886 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159727 4886 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159736 4886 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159744 4886 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.159752 4886 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.160634 4886 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.177682 4886 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.177797 4886 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.177965 4886 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.177983 4886 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.177993 4886 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178004 4886 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178013 4886 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178023 4886 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178031 4886 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178039 4886 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178047 4886 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178055 4886 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178064 4886 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178072 4886 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178080 4886 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178089 4886 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178098 4886 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178111 4886 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178155 4886 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178167 4886 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178176 4886 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178185 4886 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178194 4886 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178202 4886 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178210 4886 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178218 4886 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178227 4886 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178237 4886 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178249 4886 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178261 4886 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178272 4886 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178279 4886 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178288 4886 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178296 4886 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178303 4886 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178311 4886 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178319 4886 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178328 4886 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178337 4886 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178345 4886 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178353 4886 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178361 4886 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178368 4886 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178376 4886 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178384 4886 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178392 4886 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178400 4886 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178408 4886 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178416 4886 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178425 4886 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178433 4886 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178440 4886 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178448 4886 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178457 4886 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178465 4886 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178472 4886 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178480 4886 feature_gate.go:330] unrecognized feature gate: Example
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178492 4886 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178500 4886 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178508 4886 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178516 4886 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178523 4886 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178534 4886 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178547 4886 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178557 4886 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178566 4886 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178574 4886 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178583 4886 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178596 4886 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178608 4886 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178618 4886 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178628 4886 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178639 4886 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.178657 4886 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178903 4886 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178922 4886 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178931 4886 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178939 4886 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178948 4886 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178956 4886 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178967 4886 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178979 4886 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178988 4886 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.178996 4886 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179005 4886 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179013 4886 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179021 4886 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179028 4886 feature_gate.go:330] unrecognized feature gate: Example
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179036 4886 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179046 4886 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179054 4886 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179062 4886 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179072 4886 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179083 4886 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179093 4886 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179102 4886 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179111 4886 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179125 4886 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179204 4886 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179217 4886 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179226 4886 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179235 4886 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179243 4886 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179253 4886 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179263 4886 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179273 4886 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179283 4886 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179293 4886 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179302 4886 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179311 4886 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179321 4886 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179331 4886 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179342 4886 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179354 4886 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179365 4886 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179376 4886 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179384 4886 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179392 4886 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179399 4886 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179407 4886 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179415 4886 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179423 4886 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179432 4886 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179440 4886 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179448 4886 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179456 4886 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179464 4886 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179472 4886 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179479 4886 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179487 4886 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179495 4886 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179503 4886 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179510 4886 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179518 4886 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179526 4886 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179534 4886 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179542 4886 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179549 4886 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179557 4886 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179565 4886 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179573 4886 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179580 4886 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179588 4886 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179597 4886 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.179607 4886 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.179622 4886 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.181070 4886 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 14 08:27:45 crc kubenswrapper[4886]: E0314 08:27:45.187668 4886 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.193815 4886 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.193960 4886 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.195762 4886 server.go:997] "Starting client certificate rotation"
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.195786 4886 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.196003 4886 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.219668 4886 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 14 08:27:45 crc kubenswrapper[4886]: E0314 08:27:45.221425 4886 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError"
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.224991 4886 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.243894 4886 log.go:25] "Validated CRI v1 runtime API"
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.290511 4886 log.go:25] "Validated CRI v1 image API"
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.292386 4886 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.300580 4886 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-14-08-22-47-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.300619 4886 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}]
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.320088 4886 manager.go:217] Machine: {Timestamp:2026-03-14 08:27:45.315628032 +0000 UTC m=+0.564079679 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:44063244-4752-49bf-ae05-9c5105dcb9bb BootID:5e41e15a-ef8c-4636-88a0-58cc60240a23 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:df:8f:f5 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:df:8f:f5 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:1b:75:a6 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:8c:ca:52 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:13:16:2c Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:56:53:25 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:6e:e6:2d:15:10:70 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:66:10:e7:9b:48:5c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.320431 4886 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.320674 4886 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.324210 4886 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.324419 4886 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.324463 4886 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.324721 4886 topology_manager.go:138] "Creating topology manager with none policy"
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.324733 4886 container_manager_linux.go:303] "Creating device plugin manager"
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.325299 4886 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.325333 4886 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.325484 4886 state_mem.go:36] "Initialized new in-memory state store"
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.325567 4886 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.329924 4886 kubelet.go:418] "Attempting to sync node with API server"
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.329947 4886 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.329970 4886 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.329984 4886 kubelet.go:324] "Adding apiserver pod source"
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.330001 4886 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.333823 4886 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.335559 4886 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.337571 4886 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Mar 14 08:27:45 crc kubenswrapper[4886]: E0314 08:27:45.337636 4886 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.337950 4886 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.337953 4886 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Mar 14 08:27:45 crc kubenswrapper[4886]: E0314 08:27:45.338223 4886 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.339860 4886 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.339885 4886 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.339893 
4886 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.339901 4886 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.339912 4886 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.339920 4886 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.339928 4886 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.339940 4886 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.339957 4886 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.339964 4886 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.339988 4886 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.339995 4886 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.341513 4886 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.342016 4886 server.go:1280] "Started kubelet" Mar 14 08:27:45 crc systemd[1]: Started Kubernetes Kubelet. 
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.344210 4886 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.344328 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.344597 4886 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.345050 4886 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.345933 4886 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.345989 4886 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.346342 4886 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.346381 4886 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 14 08:27:45 crc kubenswrapper[4886]: E0314 08:27:45.346374 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.346693 4886 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.347065 4886 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Mar 14 08:27:45 crc 
kubenswrapper[4886]: E0314 08:27:45.347174 4886 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError" Mar 14 08:27:45 crc kubenswrapper[4886]: E0314 08:27:45.348124 4886 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="200ms" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.350068 4886 factory.go:55] Registering systemd factory Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.350114 4886 factory.go:221] Registration of the systemd container factory successfully Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.351350 4886 factory.go:153] Registering CRI-O factory Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.351387 4886 factory.go:221] Registration of the crio container factory successfully Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.351558 4886 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.351618 4886 factory.go:103] Registering Raw factory Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.351649 4886 manager.go:1196] Started watching for new ooms in manager Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.352524 4886 server.go:460] "Adding debug handlers to kubelet server" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.352940 4886 manager.go:319] Starting recovery of all containers 
Mar 14 08:27:45 crc kubenswrapper[4886]: E0314 08:27:45.352257 4886 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.38:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189ca7d01014f312 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.341977362 +0000 UTC m=+0.590429009,LastTimestamp:2026-03-14 08:27:45.341977362 +0000 UTC m=+0.590429009,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.363853 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.363973 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364006 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364037 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364067 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364092 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364123 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364180 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364214 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364239 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364269 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364298 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364332 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364362 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364388 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364415 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364442 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364470 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364497 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364525 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364555 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364590 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" 
volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364624 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364655 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364686 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364718 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364749 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364776 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364798 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364819 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364842 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364863 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364886 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.364908 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.368600 4886 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369083 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369126 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369176 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369196 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369219 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369242 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369263 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369286 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369312 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369337 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369359 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369384 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369404 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369427 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369450 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369495 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369519 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" 
volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369542 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369570 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369629 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369655 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369677 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369699 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369722 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369744 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369772 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369793 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369815 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369836 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369856 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369879 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369900 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369922 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369943 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369965 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" 
seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.369985 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.370006 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.370027 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.370053 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.370080 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.370111 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.370165 4886 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.370189 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.370211 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.370231 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.370251 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.370274 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.370294 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.370317 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.370338 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.370364 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.370388 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.370849 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.370893 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.370915 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.370934 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.370955 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.370978 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371013 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371041 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" 
seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371068 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371104 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371155 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371175 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371194 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371212 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371232 4886 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371253 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371271 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371289 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371317 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371339 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371363 4886 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371390 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371411 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371434 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371457 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371477 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371498 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371518 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371540 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371560 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371580 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371606 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371626 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371645 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371663 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371684 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371706 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371729 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371749 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" 
seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371769 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371788 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371808 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371828 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371847 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371865 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 14 08:27:45 crc 
kubenswrapper[4886]: I0314 08:27:45.371886 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371903 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371927 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371947 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371967 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.371986 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372007 4886 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372029 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372048 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372068 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372087 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372104 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372155 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372178 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372199 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372218 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372236 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372258 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372276 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372295 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372314 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372334 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372353 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372371 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372392 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" 
seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372413 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372431 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372451 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372471 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372489 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372508 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372527 4886 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372547 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372567 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372590 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372609 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372630 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372649 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372669 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372687 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372709 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372731 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372753 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372772 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372801 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372820 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372840 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372861 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372881 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372901 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372921 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372941 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372960 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.372980 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.373003 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.373022 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" 
seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.373045 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.373066 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.373084 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.373108 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.373158 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.373179 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.373199 4886 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.373222 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.373242 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.373260 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.373282 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.373304 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.373326 4886 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.373348 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.373370 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.373390 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.373410 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.373429 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.373448 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.373467 4886 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.373485 4886 reconstruct.go:97] "Volume reconstruction finished" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.373502 4886 reconciler.go:26] "Reconciler: start to sync state" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.373924 4886 manager.go:324] Recovery completed Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.388349 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.390866 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.390919 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.390932 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.391919 4886 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.391951 4886 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.391994 4886 state_mem.go:36] "Initialized new in-memory state store" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.409848 4886 policy_none.go:49] "None policy: Start" Mar 14 08:27:45 
crc kubenswrapper[4886]: I0314 08:27:45.412693 4886 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.412744 4886 state_mem.go:35] "Initializing new in-memory state store" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.417234 4886 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.419290 4886 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.419385 4886 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.419441 4886 kubelet.go:2335] "Starting kubelet main sync loop" Mar 14 08:27:45 crc kubenswrapper[4886]: E0314 08:27:45.419536 4886 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.420257 4886 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Mar 14 08:27:45 crc kubenswrapper[4886]: E0314 08:27:45.420418 4886 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError" Mar 14 08:27:45 crc kubenswrapper[4886]: E0314 08:27:45.447600 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.463207 4886 manager.go:334] 
"Starting Device Plugin manager" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.463264 4886 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.463278 4886 server.go:79] "Starting device plugin registration server" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.463713 4886 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.463732 4886 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.465507 4886 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.465696 4886 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.465706 4886 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 14 08:27:45 crc kubenswrapper[4886]: E0314 08:27:45.481300 4886 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.520518 4886 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.520695 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.522224 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.522285 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.522300 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.522565 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.522764 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.522804 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.523707 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.523737 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.523752 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.523720 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.523871 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.523887 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.523893 4886 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.524047 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.524085 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.524856 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.524888 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.524900 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.525059 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.525251 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.525282 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.525293 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.525508 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.525596 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.526005 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.526051 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.526069 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.526312 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.526612 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.526722 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.526974 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.527003 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.527014 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.527303 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.527335 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.527351 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.527581 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.527661 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.528242 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.528271 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.528283 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.528847 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.528886 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.528900 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:45 crc kubenswrapper[4886]: E0314 08:27:45.549822 4886 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="400ms" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.564232 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.565664 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 
08:27:45.565699 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.565712 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.565738 4886 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 08:27:45 crc kubenswrapper[4886]: E0314 08:27:45.566225 4886 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.38:6443: connect: connection refused" node="crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.577426 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.577481 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.577518 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.577551 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.577594 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.577635 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.577666 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.577708 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.577771 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.577812 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.577921 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.577983 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.578004 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.578019 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.578036 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.679477 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.679552 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.679577 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.679597 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: 
I0314 08:27:45.679621 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.679644 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.679649 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.679667 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.679689 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.679712 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.679732 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.679754 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.679777 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.679785 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.679763 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.679826 4886 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.679835 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.679751 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.679801 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.679879 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.679883 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.679893 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.679903 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.679911 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.679847 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.679933 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.680006 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.680055 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.680024 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.680009 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.767501 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.769274 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.769329 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:45 crc 
kubenswrapper[4886]: I0314 08:27:45.769340 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.769368 4886 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 08:27:45 crc kubenswrapper[4886]: E0314 08:27:45.770433 4886 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.38:6443: connect: connection refused" node="crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.868077 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.869254 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.878325 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.884938 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: I0314 08:27:45.901794 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.911828 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-54f50a33a17b3fe9445065fd1db5a27816dc84ef005ad6f2b17706d0d50757c9 WatchSource:0}: Error finding container 54f50a33a17b3fe9445065fd1db5a27816dc84ef005ad6f2b17706d0d50757c9: Status 404 returned error can't find the container with id 54f50a33a17b3fe9445065fd1db5a27816dc84ef005ad6f2b17706d0d50757c9 Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.913823 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-dc92ccea222e59e6440701c43ee5ae9754f8e298aafa5dcfc7cf74509393de4f WatchSource:0}: Error finding container dc92ccea222e59e6440701c43ee5ae9754f8e298aafa5dcfc7cf74509393de4f: Status 404 returned error can't find the container with id dc92ccea222e59e6440701c43ee5ae9754f8e298aafa5dcfc7cf74509393de4f Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.917900 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-653937c8465a66bc2719fbdc09a7c740c61f7760013fdd837aa7b7bf59b4d1ef WatchSource:0}: Error finding container 653937c8465a66bc2719fbdc09a7c740c61f7760013fdd837aa7b7bf59b4d1ef: Status 404 returned error can't find the container with id 653937c8465a66bc2719fbdc09a7c740c61f7760013fdd837aa7b7bf59b4d1ef Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.918801 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-e0f9710c5fce08ee202805c14be439a78107c8f2179818fe41a9db5bc2dbf231 
WatchSource:0}: Error finding container e0f9710c5fce08ee202805c14be439a78107c8f2179818fe41a9db5bc2dbf231: Status 404 returned error can't find the container with id e0f9710c5fce08ee202805c14be439a78107c8f2179818fe41a9db5bc2dbf231 Mar 14 08:27:45 crc kubenswrapper[4886]: W0314 08:27:45.923915 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-ec044785930c9834badd5dcf49e678bdb1ddc32749c407393d281084d0316166 WatchSource:0}: Error finding container ec044785930c9834badd5dcf49e678bdb1ddc32749c407393d281084d0316166: Status 404 returned error can't find the container with id ec044785930c9834badd5dcf49e678bdb1ddc32749c407393d281084d0316166 Mar 14 08:27:45 crc kubenswrapper[4886]: E0314 08:27:45.951171 4886 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="800ms" Mar 14 08:27:46 crc kubenswrapper[4886]: I0314 08:27:46.171442 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:46 crc kubenswrapper[4886]: I0314 08:27:46.173157 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:46 crc kubenswrapper[4886]: I0314 08:27:46.173449 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:46 crc kubenswrapper[4886]: I0314 08:27:46.173462 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:46 crc kubenswrapper[4886]: I0314 08:27:46.173491 4886 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 08:27:46 crc kubenswrapper[4886]: E0314 08:27:46.174015 4886 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.38:6443: connect: connection refused" node="crc" Mar 14 08:27:46 crc kubenswrapper[4886]: W0314 08:27:46.221971 4886 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Mar 14 08:27:46 crc kubenswrapper[4886]: E0314 08:27:46.222071 4886 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError" Mar 14 08:27:46 crc kubenswrapper[4886]: W0314 08:27:46.255434 4886 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Mar 14 08:27:46 crc kubenswrapper[4886]: E0314 08:27:46.255585 4886 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError" Mar 14 08:27:46 crc kubenswrapper[4886]: I0314 08:27:46.345943 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Mar 14 08:27:46 crc 
kubenswrapper[4886]: I0314 08:27:46.425666 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"dc92ccea222e59e6440701c43ee5ae9754f8e298aafa5dcfc7cf74509393de4f"} Mar 14 08:27:46 crc kubenswrapper[4886]: I0314 08:27:46.426891 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"54f50a33a17b3fe9445065fd1db5a27816dc84ef005ad6f2b17706d0d50757c9"} Mar 14 08:27:46 crc kubenswrapper[4886]: I0314 08:27:46.428529 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ec044785930c9834badd5dcf49e678bdb1ddc32749c407393d281084d0316166"} Mar 14 08:27:46 crc kubenswrapper[4886]: I0314 08:27:46.429766 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e0f9710c5fce08ee202805c14be439a78107c8f2179818fe41a9db5bc2dbf231"} Mar 14 08:27:46 crc kubenswrapper[4886]: I0314 08:27:46.431098 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"653937c8465a66bc2719fbdc09a7c740c61f7760013fdd837aa7b7bf59b4d1ef"} Mar 14 08:27:46 crc kubenswrapper[4886]: W0314 08:27:46.704736 4886 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Mar 14 08:27:46 crc kubenswrapper[4886]: E0314 08:27:46.704853 4886 reflector.go:158] "Unhandled 
Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError" Mar 14 08:27:46 crc kubenswrapper[4886]: W0314 08:27:46.738877 4886 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Mar 14 08:27:46 crc kubenswrapper[4886]: E0314 08:27:46.739004 4886 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError" Mar 14 08:27:46 crc kubenswrapper[4886]: E0314 08:27:46.752231 4886 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="1.6s" Mar 14 08:27:46 crc kubenswrapper[4886]: I0314 08:27:46.974949 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:46 crc kubenswrapper[4886]: I0314 08:27:46.977268 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:46 crc kubenswrapper[4886]: I0314 08:27:46.977319 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:46 crc kubenswrapper[4886]: I0314 08:27:46.977332 4886 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 14 08:27:46 crc kubenswrapper[4886]: I0314 08:27:46.977364 4886 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 08:27:46 crc kubenswrapper[4886]: E0314 08:27:46.977971 4886 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.38:6443: connect: connection refused" node="crc" Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.345384 4886 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.345888 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Mar 14 08:27:47 crc kubenswrapper[4886]: E0314 08:27:47.346510 4886 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError" Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.436060 4886 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b" exitCode=0 Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.436169 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b"} Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.436367 4886 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.438072 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.438101 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.438111 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.440057 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.441296 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a02f405b2541571b752e1dbe686e3aa7ff79d3e36211cc9a9876a92813737f5e"} Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.441346 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.441365 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f455dac3583e73c1949aacca37b8249ba31fb4c339814ee3f47df6d60118c749"} Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.441391 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6c0cc373bc6498d0116c78eb0a9a7a08a6eca9c7d74f84e596595b9e40e1cde9"} Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 
08:27:47.441411 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"43cd795006e2735c82e25d7f0b8ccfae0749e5d80cc96068969329b677904578"} Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.441953 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.441999 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.442017 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.442182 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.442211 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.442228 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.443329 4886 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="6b87a94d5aca350491d8720123794c053c109e4966447f8a5836667a11c260fd" exitCode=0 Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.443438 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.443783 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"6b87a94d5aca350491d8720123794c053c109e4966447f8a5836667a11c260fd"} Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.444873 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.444898 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.444910 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.446818 4886 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ffd1d57d01e14b536e2f7e73749880714261d9b555a735783fbe83e3c916fa4b" exitCode=0 Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.446874 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ffd1d57d01e14b536e2f7e73749880714261d9b555a735783fbe83e3c916fa4b"} Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.446961 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.447811 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.447831 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.447839 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.474151 4886 generic.go:334] "Generic (PLEG): container 
finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0" exitCode=0 Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.474211 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0"} Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.474314 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.475551 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.475579 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:47 crc kubenswrapper[4886]: I0314 08:27:47.475590 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:47 crc kubenswrapper[4886]: E0314 08:27:47.498954 4886 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.38:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189ca7d01014f312 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.341977362 +0000 UTC m=+0.590429009,LastTimestamp:2026-03-14 08:27:45.341977362 +0000 UTC m=+0.590429009,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.160269 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.345434 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Mar 14 08:27:48 crc kubenswrapper[4886]: E0314 08:27:48.353284 4886 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="3.2s" Mar 14 08:27:48 crc kubenswrapper[4886]: W0314 08:27:48.441975 4886 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Mar 14 08:27:48 crc kubenswrapper[4886]: E0314 08:27:48.442080 4886 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError" Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.479548 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"36c665461cd0fe36545b5dd6fe8c5f7200e45459d7d7b4928bf9f56121934340"} 
Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.479590 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7e89881366a76a4f6a38e496a1a72befde33c6038d07e39140440872840fcf97"} Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.479599 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d3343e31045d55d6c0d95699ceab0e0f8b7ece28ad7d1939eff43bef31470393"} Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.479684 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.480519 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.480544 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.480551 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.484811 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4"} Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.484841 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1"} Mar 14 08:27:48 crc 
kubenswrapper[4886]: I0314 08:27:48.484856 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8"} Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.484867 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c"} Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.486369 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"b07c0b768890c65b6c4e3f1d081c7c76930bc9be3aef51310096035232d1f01e"} Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.486387 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.487286 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.487314 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.487325 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.489525 4886 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="48a0a4e940673ed3e084ff22ae46a7ce47eb7347e46128c22a2cd493f6ed06fe" exitCode=0 Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.489614 4886 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.489626 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"48a0a4e940673ed3e084ff22ae46a7ce47eb7347e46128c22a2cd493f6ed06fe"} Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.489614 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.490290 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.490309 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.490321 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.491077 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.491103 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.491114 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.579044 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.584158 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.584211 4886 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.584227 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:48 crc kubenswrapper[4886]: I0314 08:27:48.584256 4886 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 08:27:48 crc kubenswrapper[4886]: E0314 08:27:48.585013 4886 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.38:6443: connect: connection refused" node="crc" Mar 14 08:27:48 crc kubenswrapper[4886]: W0314 08:27:48.690095 4886 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Mar 14 08:27:48 crc kubenswrapper[4886]: E0314 08:27:48.690199 4886 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.38:6443: connect: connection refused" logger="UnhandledError" Mar 14 08:27:48 crc kubenswrapper[4886]: W0314 08:27:48.838909 4886 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.38:6443: connect: connection refused Mar 14 08:27:48 crc kubenswrapper[4886]: E0314 08:27:48.839018 4886 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 
38.102.83.38:6443: connect: connection refused" logger="UnhandledError" Mar 14 08:27:49 crc kubenswrapper[4886]: I0314 08:27:49.499359 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:49 crc kubenswrapper[4886]: I0314 08:27:49.499443 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"63fd0c517d1855425665715a0bfc99c7e34aa9b8bb9cc61ac8dbe4aa0bb2b27d"} Mar 14 08:27:49 crc kubenswrapper[4886]: I0314 08:27:49.501017 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:49 crc kubenswrapper[4886]: I0314 08:27:49.501051 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:49 crc kubenswrapper[4886]: I0314 08:27:49.501060 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:49 crc kubenswrapper[4886]: I0314 08:27:49.502713 4886 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c376b141f7ed448baed91333cedf7ad2f3f0128ab30b3dc13114b3846fbce3d0" exitCode=0 Mar 14 08:27:49 crc kubenswrapper[4886]: I0314 08:27:49.502887 4886 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 08:27:49 crc kubenswrapper[4886]: I0314 08:27:49.502925 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c376b141f7ed448baed91333cedf7ad2f3f0128ab30b3dc13114b3846fbce3d0"} Mar 14 08:27:49 crc kubenswrapper[4886]: I0314 08:27:49.502961 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:49 crc kubenswrapper[4886]: I0314 08:27:49.503082 4886 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:49 crc kubenswrapper[4886]: I0314 08:27:49.502988 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:49 crc kubenswrapper[4886]: I0314 08:27:49.503084 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:49 crc kubenswrapper[4886]: I0314 08:27:49.504511 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:49 crc kubenswrapper[4886]: I0314 08:27:49.504545 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:49 crc kubenswrapper[4886]: I0314 08:27:49.504552 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:49 crc kubenswrapper[4886]: I0314 08:27:49.504576 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:49 crc kubenswrapper[4886]: I0314 08:27:49.504596 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:49 crc kubenswrapper[4886]: I0314 08:27:49.504577 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:49 crc kubenswrapper[4886]: I0314 08:27:49.504638 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:49 crc kubenswrapper[4886]: I0314 08:27:49.504607 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:49 crc kubenswrapper[4886]: I0314 08:27:49.504557 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:49 crc 
kubenswrapper[4886]: I0314 08:27:49.504590 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:49 crc kubenswrapper[4886]: I0314 08:27:49.504710 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:49 crc kubenswrapper[4886]: I0314 08:27:49.504720 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:50 crc kubenswrapper[4886]: I0314 08:27:50.197993 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:27:50 crc kubenswrapper[4886]: I0314 08:27:50.205189 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:27:50 crc kubenswrapper[4886]: I0314 08:27:50.510526 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"22d9a1f47b0821cb4d452045a619d2fb867c15f70cd2093961a35f8989332473"} Mar 14 08:27:50 crc kubenswrapper[4886]: I0314 08:27:50.510575 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7519ca7b24c6a8c7c3faa6393f0f02572f9b4ac417e262ddddd94f6297bc5f38"} Mar 14 08:27:50 crc kubenswrapper[4886]: I0314 08:27:50.510582 4886 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 08:27:50 crc kubenswrapper[4886]: I0314 08:27:50.510608 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:50 crc kubenswrapper[4886]: I0314 08:27:50.510635 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:50 crc 
kubenswrapper[4886]: I0314 08:27:50.510588 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"25ad62f5c26cca64c63d0955b28f8aa8a76ea569ee64db70bc192aea76b1e871"} Mar 14 08:27:50 crc kubenswrapper[4886]: I0314 08:27:50.510704 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"de39a08e4228994a97cd7fdd116b3c79fe5e7a4adf405cfd8103e7a9c37c1931"} Mar 14 08:27:50 crc kubenswrapper[4886]: I0314 08:27:50.511598 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:50 crc kubenswrapper[4886]: I0314 08:27:50.511627 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:50 crc kubenswrapper[4886]: I0314 08:27:50.511645 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:50 crc kubenswrapper[4886]: I0314 08:27:50.511598 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:50 crc kubenswrapper[4886]: I0314 08:27:50.511687 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:50 crc kubenswrapper[4886]: I0314 08:27:50.511696 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:50 crc kubenswrapper[4886]: I0314 08:27:50.622296 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:27:51 crc kubenswrapper[4886]: I0314 08:27:51.443879 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 
14 08:27:51 crc kubenswrapper[4886]: I0314 08:27:51.522335 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b78f0b24729cadb444ece09a4f5e2bc7cc744cd77c9317e32fc4425848c79d79"} Mar 14 08:27:51 crc kubenswrapper[4886]: I0314 08:27:51.522431 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:51 crc kubenswrapper[4886]: I0314 08:27:51.522547 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:51 crc kubenswrapper[4886]: I0314 08:27:51.522630 4886 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 08:27:51 crc kubenswrapper[4886]: I0314 08:27:51.522757 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:51 crc kubenswrapper[4886]: I0314 08:27:51.524036 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:51 crc kubenswrapper[4886]: I0314 08:27:51.524165 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:51 crc kubenswrapper[4886]: I0314 08:27:51.524190 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:51 crc kubenswrapper[4886]: I0314 08:27:51.524424 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:51 crc kubenswrapper[4886]: I0314 08:27:51.524524 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:51 crc kubenswrapper[4886]: I0314 08:27:51.524546 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:51 crc 
kubenswrapper[4886]: I0314 08:27:51.524795 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:51 crc kubenswrapper[4886]: I0314 08:27:51.524897 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:51 crc kubenswrapper[4886]: I0314 08:27:51.524929 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:51 crc kubenswrapper[4886]: I0314 08:27:51.567685 4886 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 14 08:27:51 crc kubenswrapper[4886]: I0314 08:27:51.576367 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 14 08:27:51 crc kubenswrapper[4886]: I0314 08:27:51.785937 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:51 crc kubenswrapper[4886]: I0314 08:27:51.789819 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:51 crc kubenswrapper[4886]: I0314 08:27:51.789911 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:51 crc kubenswrapper[4886]: I0314 08:27:51.789934 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:51 crc kubenswrapper[4886]: I0314 08:27:51.790004 4886 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 08:27:52 crc kubenswrapper[4886]: I0314 08:27:52.228925 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:27:52 crc kubenswrapper[4886]: I0314 08:27:52.526776 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 
08:27:52 crc kubenswrapper[4886]: I0314 08:27:52.526835 4886 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 08:27:52 crc kubenswrapper[4886]: I0314 08:27:52.526953 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:52 crc kubenswrapper[4886]: I0314 08:27:52.526955 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:52 crc kubenswrapper[4886]: I0314 08:27:52.529836 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:52 crc kubenswrapper[4886]: I0314 08:27:52.529913 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:52 crc kubenswrapper[4886]: I0314 08:27:52.529937 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:52 crc kubenswrapper[4886]: I0314 08:27:52.529949 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:52 crc kubenswrapper[4886]: I0314 08:27:52.530013 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:52 crc kubenswrapper[4886]: I0314 08:27:52.530038 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:52 crc kubenswrapper[4886]: I0314 08:27:52.530369 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:52 crc kubenswrapper[4886]: I0314 08:27:52.530422 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:52 crc kubenswrapper[4886]: I0314 08:27:52.530441 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 14 08:27:53 crc kubenswrapper[4886]: I0314 08:27:53.530565 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:53 crc kubenswrapper[4886]: I0314 08:27:53.531875 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:53 crc kubenswrapper[4886]: I0314 08:27:53.531949 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:53 crc kubenswrapper[4886]: I0314 08:27:53.531970 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:53 crc kubenswrapper[4886]: I0314 08:27:53.622823 4886 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 08:27:53 crc kubenswrapper[4886]: I0314 08:27:53.622974 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 08:27:53 crc kubenswrapper[4886]: I0314 08:27:53.877062 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 08:27:53 crc kubenswrapper[4886]: I0314 08:27:53.877412 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:53 crc kubenswrapper[4886]: I0314 08:27:53.879607 4886 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:53 crc kubenswrapper[4886]: I0314 08:27:53.879672 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:53 crc kubenswrapper[4886]: I0314 08:27:53.879694 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:54 crc kubenswrapper[4886]: I0314 08:27:54.010379 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:27:54 crc kubenswrapper[4886]: I0314 08:27:54.010687 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:54 crc kubenswrapper[4886]: I0314 08:27:54.012991 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:54 crc kubenswrapper[4886]: I0314 08:27:54.013067 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:54 crc kubenswrapper[4886]: I0314 08:27:54.013087 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:54 crc kubenswrapper[4886]: I0314 08:27:54.482334 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:27:54 crc kubenswrapper[4886]: I0314 08:27:54.482674 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:54 crc kubenswrapper[4886]: I0314 08:27:54.484661 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:54 crc kubenswrapper[4886]: I0314 08:27:54.484722 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:54 crc 
kubenswrapper[4886]: I0314 08:27:54.484743 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:54 crc kubenswrapper[4886]: I0314 08:27:54.726422 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 14 08:27:54 crc kubenswrapper[4886]: I0314 08:27:54.726680 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:54 crc kubenswrapper[4886]: I0314 08:27:54.728374 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:54 crc kubenswrapper[4886]: I0314 08:27:54.728444 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:54 crc kubenswrapper[4886]: I0314 08:27:54.728470 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:55 crc kubenswrapper[4886]: E0314 08:27:55.482302 4886 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 08:27:59 crc kubenswrapper[4886]: W0314 08:27:59.295435 4886 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 14 08:27:59 crc kubenswrapper[4886]: I0314 08:27:59.295621 4886 trace.go:236] Trace[1653127497]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (14-Mar-2026 08:27:49.293) (total time: 10002ms): Mar 14 08:27:59 crc kubenswrapper[4886]: Trace[1653127497]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (08:27:59.295) Mar 14 08:27:59 crc 
kubenswrapper[4886]: Trace[1653127497]: [10.00214287s] [10.00214287s] END Mar 14 08:27:59 crc kubenswrapper[4886]: E0314 08:27:59.295664 4886 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 14 08:27:59 crc kubenswrapper[4886]: I0314 08:27:59.346112 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 14 08:27:59 crc kubenswrapper[4886]: W0314 08:27:59.351194 4886 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:27:59Z is after 2026-02-23T05:33:13Z Mar 14 08:27:59 crc kubenswrapper[4886]: E0314 08:27:59.351301 4886 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:27:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 08:27:59 crc kubenswrapper[4886]: E0314 08:27:59.355284 4886 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-14T08:27:59Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 14 08:27:59 crc kubenswrapper[4886]: W0314 08:27:59.355604 4886 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:27:59Z is after 2026-02-23T05:33:13Z Mar 14 08:27:59 crc kubenswrapper[4886]: E0314 08:27:59.355713 4886 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:27:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 08:27:59 crc kubenswrapper[4886]: I0314 08:27:59.356366 4886 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 14 08:27:59 crc kubenswrapper[4886]: I0314 08:27:59.356447 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 14 08:27:59 crc kubenswrapper[4886]: E0314 08:27:59.359059 4886 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot 
create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:27:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 08:27:59 crc kubenswrapper[4886]: E0314 08:27:59.359928 4886 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:27:59Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ca7d01014f312 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.341977362 +0000 UTC m=+0.590429009,LastTimestamp:2026-03-14 08:27:45.341977362 +0000 UTC m=+0.590429009,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:27:59 crc kubenswrapper[4886]: E0314 08:27:59.361099 4886 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:27:59Z is after 2026-02-23T05:33:13Z" node="crc" Mar 14 08:27:59 crc kubenswrapper[4886]: W0314 08:27:59.362573 4886 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-14T08:27:59Z is after 2026-02-23T05:33:13Z Mar 14 08:27:59 crc kubenswrapper[4886]: E0314 08:27:59.362666 4886 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:27:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 08:27:59 crc kubenswrapper[4886]: I0314 08:27:59.362750 4886 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 14 08:27:59 crc kubenswrapper[4886]: I0314 08:27:59.362801 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 14 08:27:59 crc kubenswrapper[4886]: I0314 08:27:59.373539 4886 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 14 08:27:59 crc kubenswrapper[4886]: I0314 08:27:59.373583 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get 
\"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 14 08:27:59 crc kubenswrapper[4886]: I0314 08:27:59.555400 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 14 08:27:59 crc kubenswrapper[4886]: I0314 08:27:59.557840 4886 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="63fd0c517d1855425665715a0bfc99c7e34aa9b8bb9cc61ac8dbe4aa0bb2b27d" exitCode=255 Mar 14 08:27:59 crc kubenswrapper[4886]: I0314 08:27:59.557906 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"63fd0c517d1855425665715a0bfc99c7e34aa9b8bb9cc61ac8dbe4aa0bb2b27d"} Mar 14 08:27:59 crc kubenswrapper[4886]: I0314 08:27:59.558098 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:27:59 crc kubenswrapper[4886]: I0314 08:27:59.558927 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:27:59 crc kubenswrapper[4886]: I0314 08:27:59.558956 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:27:59 crc kubenswrapper[4886]: I0314 08:27:59.558966 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:27:59 crc kubenswrapper[4886]: I0314 08:27:59.559460 4886 scope.go:117] "RemoveContainer" containerID="63fd0c517d1855425665715a0bfc99c7e34aa9b8bb9cc61ac8dbe4aa0bb2b27d" Mar 14 08:28:00 crc kubenswrapper[4886]: I0314 08:28:00.351762 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:28:00Z is after 2026-02-23T05:33:13Z Mar 14 08:28:00 crc kubenswrapper[4886]: I0314 08:28:00.564330 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 14 08:28:00 crc kubenswrapper[4886]: I0314 08:28:00.565453 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 14 08:28:00 crc kubenswrapper[4886]: I0314 08:28:00.567897 4886 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4d5cbc09b3c30be4fa855390ec23df68d7438fd5a91453554f35817cedc0681a" exitCode=255 Mar 14 08:28:00 crc kubenswrapper[4886]: I0314 08:28:00.567942 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4d5cbc09b3c30be4fa855390ec23df68d7438fd5a91453554f35817cedc0681a"} Mar 14 08:28:00 crc kubenswrapper[4886]: I0314 08:28:00.568014 4886 scope.go:117] "RemoveContainer" containerID="63fd0c517d1855425665715a0bfc99c7e34aa9b8bb9cc61ac8dbe4aa0bb2b27d" Mar 14 08:28:00 crc kubenswrapper[4886]: I0314 08:28:00.568241 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:28:00 crc kubenswrapper[4886]: I0314 08:28:00.569735 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:28:00 crc kubenswrapper[4886]: I0314 08:28:00.569771 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:28:00 crc 
kubenswrapper[4886]: I0314 08:28:00.569782 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:28:00 crc kubenswrapper[4886]: I0314 08:28:00.570460 4886 scope.go:117] "RemoveContainer" containerID="4d5cbc09b3c30be4fa855390ec23df68d7438fd5a91453554f35817cedc0681a" Mar 14 08:28:00 crc kubenswrapper[4886]: E0314 08:28:00.570664 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 08:28:01 crc kubenswrapper[4886]: I0314 08:28:01.350268 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:28:01Z is after 2026-02-23T05:33:13Z Mar 14 08:28:01 crc kubenswrapper[4886]: I0314 08:28:01.443981 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:28:01 crc kubenswrapper[4886]: I0314 08:28:01.575243 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 14 08:28:01 crc kubenswrapper[4886]: I0314 08:28:01.579653 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:28:01 crc kubenswrapper[4886]: I0314 08:28:01.581320 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:28:01 crc kubenswrapper[4886]: 
I0314 08:28:01.581363 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:28:01 crc kubenswrapper[4886]: I0314 08:28:01.581377 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:28:01 crc kubenswrapper[4886]: I0314 08:28:01.582066 4886 scope.go:117] "RemoveContainer" containerID="4d5cbc09b3c30be4fa855390ec23df68d7438fd5a91453554f35817cedc0681a" Mar 14 08:28:01 crc kubenswrapper[4886]: E0314 08:28:01.582340 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 08:28:01 crc kubenswrapper[4886]: I0314 08:28:01.613255 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 14 08:28:01 crc kubenswrapper[4886]: I0314 08:28:01.613545 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:28:01 crc kubenswrapper[4886]: I0314 08:28:01.615238 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:28:01 crc kubenswrapper[4886]: I0314 08:28:01.615289 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:28:01 crc kubenswrapper[4886]: I0314 08:28:01.615315 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:28:01 crc kubenswrapper[4886]: I0314 08:28:01.632395 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 14 08:28:02 crc 
kubenswrapper[4886]: I0314 08:28:02.350318 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:28:02Z is after 2026-02-23T05:33:13Z Mar 14 08:28:02 crc kubenswrapper[4886]: I0314 08:28:02.583222 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:28:02 crc kubenswrapper[4886]: I0314 08:28:02.584482 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:28:02 crc kubenswrapper[4886]: I0314 08:28:02.584527 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:28:02 crc kubenswrapper[4886]: I0314 08:28:02.584543 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:28:03 crc kubenswrapper[4886]: I0314 08:28:03.350645 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:28:03Z is after 2026-02-23T05:33:13Z Mar 14 08:28:03 crc kubenswrapper[4886]: W0314 08:28:03.415733 4886 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:28:03Z is after 2026-02-23T05:33:13Z Mar 14 08:28:03 crc kubenswrapper[4886]: E0314 08:28:03.416148 4886 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:28:03Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 14 08:28:03 crc kubenswrapper[4886]: I0314 08:28:03.623529 4886 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 08:28:03 crc kubenswrapper[4886]: I0314 08:28:03.623692 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 08:28:04 crc kubenswrapper[4886]: I0314 08:28:04.020669 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:28:04 crc kubenswrapper[4886]: I0314 08:28:04.020984 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:28:04 crc kubenswrapper[4886]: I0314 08:28:04.023091 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:28:04 crc kubenswrapper[4886]: I0314 08:28:04.023273 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:28:04 crc kubenswrapper[4886]: 
I0314 08:28:04.023303 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:28:04 crc kubenswrapper[4886]: I0314 08:28:04.024532 4886 scope.go:117] "RemoveContainer" containerID="4d5cbc09b3c30be4fa855390ec23df68d7438fd5a91453554f35817cedc0681a" Mar 14 08:28:04 crc kubenswrapper[4886]: E0314 08:28:04.024860 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 08:28:04 crc kubenswrapper[4886]: I0314 08:28:04.029972 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:28:04 crc kubenswrapper[4886]: I0314 08:28:04.351522 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:28:04Z is after 2026-02-23T05:33:13Z Mar 14 08:28:04 crc kubenswrapper[4886]: I0314 08:28:04.490198 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:28:04 crc kubenswrapper[4886]: I0314 08:28:04.490522 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:28:04 crc kubenswrapper[4886]: I0314 08:28:04.492443 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:28:04 crc kubenswrapper[4886]: I0314 08:28:04.492636 4886 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:28:04 crc kubenswrapper[4886]: I0314 08:28:04.492758 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:28:04 crc kubenswrapper[4886]: I0314 08:28:04.589196 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:28:04 crc kubenswrapper[4886]: I0314 08:28:04.590754 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:28:04 crc kubenswrapper[4886]: I0314 08:28:04.590832 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:28:04 crc kubenswrapper[4886]: I0314 08:28:04.590855 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:28:04 crc kubenswrapper[4886]: I0314 08:28:04.592000 4886 scope.go:117] "RemoveContainer" containerID="4d5cbc09b3c30be4fa855390ec23df68d7438fd5a91453554f35817cedc0681a" Mar 14 08:28:04 crc kubenswrapper[4886]: E0314 08:28:04.592391 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 08:28:05 crc kubenswrapper[4886]: I0314 08:28:05.353680 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:28:05 crc kubenswrapper[4886]: E0314 08:28:05.482628 4886 eviction_manager.go:285] "Eviction manager: failed to 
get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 08:28:05 crc kubenswrapper[4886]: I0314 08:28:05.761572 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:28:05 crc kubenswrapper[4886]: E0314 08:28:05.763237 4886 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 14 08:28:05 crc kubenswrapper[4886]: I0314 08:28:05.763593 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:28:05 crc kubenswrapper[4886]: I0314 08:28:05.763790 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:28:05 crc kubenswrapper[4886]: I0314 08:28:05.763957 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:28:05 crc kubenswrapper[4886]: I0314 08:28:05.764144 4886 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 08:28:05 crc kubenswrapper[4886]: E0314 08:28:05.769156 4886 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 14 08:28:06 crc kubenswrapper[4886]: I0314 08:28:06.351426 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:28:07 crc kubenswrapper[4886]: I0314 08:28:07.353212 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:28:08 crc kubenswrapper[4886]: I0314 08:28:08.021424 4886 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 14 08:28:08 crc kubenswrapper[4886]: I0314 08:28:08.037824 4886 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 14 08:28:08 crc kubenswrapper[4886]: I0314 08:28:08.351274 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:28:08 crc kubenswrapper[4886]: W0314 08:28:08.935462 4886 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 14 08:28:08 crc kubenswrapper[4886]: E0314 08:28:08.936199 4886 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 14 08:28:09 crc kubenswrapper[4886]: I0314 08:28:09.353916 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.366635 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca7d01014f312 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.341977362 +0000 UTC m=+0.590429009,LastTimestamp:2026-03-14 08:27:45.341977362 +0000 UTC m=+0.590429009,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: I0314 08:28:09.372440 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:28:09 crc kubenswrapper[4886]: I0314 08:28:09.372779 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:28:09 crc kubenswrapper[4886]: I0314 08:28:09.374626 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:28:09 crc kubenswrapper[4886]: I0314 08:28:09.374674 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:28:09 crc kubenswrapper[4886]: I0314 08:28:09.374694 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.374711 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca7d012ff7130 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.390899504 +0000 UTC m=+0.639351141,LastTimestamp:2026-03-14 08:27:45.390899504 +0000 UTC m=+0.639351141,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: I0314 08:28:09.375782 4886 scope.go:117] "RemoveContainer" containerID="4d5cbc09b3c30be4fa855390ec23df68d7438fd5a91453554f35817cedc0681a" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.376141 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.381402 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca7d012ffdf81 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.390927745 +0000 UTC m=+0.639379382,LastTimestamp:2026-03-14 08:27:45.390927745 +0000 UTC m=+0.639379382,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.389019 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca7d01300076d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.390937965 +0000 UTC m=+0.639389602,LastTimestamp:2026-03-14 08:27:45.390937965 +0000 UTC m=+0.639389602,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.394535 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca7d017bcc5c8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.470416328 +0000 UTC m=+0.718867965,LastTimestamp:2026-03-14 08:27:45.470416328 +0000 UTC m=+0.718867965,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.401331 4886 event.go:359] "Server rejected event (will not retry!)" 
err="events \"crc.189ca7d012ff7130\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca7d012ff7130 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.390899504 +0000 UTC m=+0.639351141,LastTimestamp:2026-03-14 08:27:45.522265634 +0000 UTC m=+0.770717281,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.407985 4886 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca7d012ffdf81\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca7d012ffdf81 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.390927745 +0000 UTC m=+0.639379382,LastTimestamp:2026-03-14 08:27:45.522295265 +0000 UTC m=+0.770746912,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.415599 4886 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca7d01300076d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.189ca7d01300076d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.390937965 +0000 UTC m=+0.639389602,LastTimestamp:2026-03-14 08:27:45.522308455 +0000 UTC m=+0.770760102,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.420412 4886 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca7d012ff7130\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca7d012ff7130 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.390899504 +0000 UTC m=+0.639351141,LastTimestamp:2026-03-14 08:27:45.523730456 +0000 UTC m=+0.772182103,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.425116 4886 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca7d012ffdf81\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca7d012ffdf81 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.390927745 +0000 UTC m=+0.639379382,LastTimestamp:2026-03-14 08:27:45.523745126 +0000 UTC m=+0.772196773,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.429857 4886 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca7d01300076d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca7d01300076d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.390937965 +0000 UTC m=+0.639389602,LastTimestamp:2026-03-14 08:27:45.523760327 +0000 UTC m=+0.772211974,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.438762 4886 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca7d012ff7130\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca7d012ff7130 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.390899504 +0000 UTC m=+0.639351141,LastTimestamp:2026-03-14 08:27:45.523825178 +0000 UTC m=+0.772276825,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.444639 4886 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca7d012ffdf81\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca7d012ffdf81 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.390927745 +0000 UTC m=+0.639379382,LastTimestamp:2026-03-14 08:27:45.52388816 +0000 UTC m=+0.772339807,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.449407 4886 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca7d01300076d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca7d01300076d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.390937965 +0000 UTC 
m=+0.639389602,LastTimestamp:2026-03-14 08:27:45.523901841 +0000 UTC m=+0.772353488,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.455699 4886 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca7d012ff7130\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca7d012ff7130 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.390899504 +0000 UTC m=+0.639351141,LastTimestamp:2026-03-14 08:27:45.524880508 +0000 UTC m=+0.773332155,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.461222 4886 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca7d012ffdf81\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca7d012ffdf81 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.390927745 +0000 UTC m=+0.639379382,LastTimestamp:2026-03-14 08:27:45.524896339 +0000 UTC m=+0.773347986,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.466828 4886 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca7d01300076d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca7d01300076d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.390937965 +0000 UTC m=+0.639389602,LastTimestamp:2026-03-14 08:27:45.524908049 +0000 UTC m=+0.773359696,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.472517 4886 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca7d012ff7130\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca7d012ff7130 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.390899504 +0000 UTC m=+0.639351141,LastTimestamp:2026-03-14 08:27:45.525267359 +0000 UTC m=+0.773718996,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.478885 4886 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca7d012ffdf81\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca7d012ffdf81 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.390927745 +0000 UTC m=+0.639379382,LastTimestamp:2026-03-14 08:27:45.52528894 +0000 UTC m=+0.773740577,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.485447 4886 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca7d01300076d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca7d01300076d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.390937965 +0000 UTC m=+0.639389602,LastTimestamp:2026-03-14 08:27:45.52529836 +0000 UTC m=+0.773749987,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.493108 4886 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca7d012ff7130\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca7d012ff7130 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.390899504 +0000 UTC m=+0.639351141,LastTimestamp:2026-03-14 08:27:45.526032831 +0000 UTC m=+0.774484478,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.499078 4886 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca7d012ffdf81\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca7d012ffdf81 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.390927745 +0000 UTC m=+0.639379382,LastTimestamp:2026-03-14 08:27:45.526062172 +0000 UTC m=+0.774513829,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.505652 4886 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca7d01300076d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca7d01300076d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.390937965 +0000 UTC m=+0.639389602,LastTimestamp:2026-03-14 08:27:45.526078103 +0000 UTC m=+0.774529760,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.510840 4886 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca7d012ff7130\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca7d012ff7130 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.390899504 +0000 UTC m=+0.639351141,LastTimestamp:2026-03-14 08:27:45.526990298 +0000 UTC m=+0.775441935,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.518065 4886 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ca7d012ffdf81\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ca7d012ffdf81 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.390927745 +0000 UTC m=+0.639379382,LastTimestamp:2026-03-14 08:27:45.527009319 +0000 UTC m=+0.775460956,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.527380 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ca7d0326e9c7d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.918278781 +0000 UTC m=+1.166730448,LastTimestamp:2026-03-14 08:27:45.918278781 +0000 UTC m=+1.166730448,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.533929 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca7d032800203 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.919418883 +0000 UTC m=+1.167870520,LastTimestamp:2026-03-14 08:27:45.919418883 +0000 UTC m=+1.167870520,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.540036 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca7d03298c14e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.921040718 +0000 UTC m=+1.169492355,LastTimestamp:2026-03-14 08:27:45.921040718 +0000 UTC m=+1.169492355,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.546165 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca7d032a198a6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.921620134 +0000 UTC m=+1.170071771,LastTimestamp:2026-03-14 08:27:45.921620134 +0000 UTC m=+1.170071771,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.550993 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca7d0332cbb7f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:45.930738559 +0000 UTC 
m=+1.179190196,LastTimestamp:2026-03-14 08:27:45.930738559 +0000 UTC m=+1.179190196,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.557081 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca7d056493fcd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:46.519809997 +0000 UTC m=+1.768261634,LastTimestamp:2026-03-14 08:27:46.519809997 +0000 UTC m=+1.768261634,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.563490 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ca7d0568161f8 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created 
container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:46.52348876 +0000 UTC m=+1.771940397,LastTimestamp:2026-03-14 08:27:46.52348876 +0000 UTC m=+1.771940397,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.570305 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca7d056a0927d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:46.525532797 +0000 UTC m=+1.773984434,LastTimestamp:2026-03-14 08:27:46.525532797 +0000 UTC m=+1.773984434,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.577546 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca7d056a789f0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:46.52598936 +0000 UTC m=+1.774440997,LastTimestamp:2026-03-14 08:27:46.52598936 +0000 UTC m=+1.774440997,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.584047 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca7d056b0e1f5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:46.526601717 +0000 UTC m=+1.775053354,LastTimestamp:2026-03-14 08:27:46.526601717 +0000 UTC m=+1.775053354,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.588438 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca7d056d4b81f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:46.528950303 +0000 UTC m=+1.777401930,LastTimestamp:2026-03-14 08:27:46.528950303 +0000 UTC m=+1.777401930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.593231 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca7d056eccdd7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:46.530528727 +0000 UTC m=+1.778980364,LastTimestamp:2026-03-14 08:27:46.530528727 +0000 UTC m=+1.778980364,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.598545 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ca7d0576ab779 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:46.538780537 +0000 UTC m=+1.787232254,LastTimestamp:2026-03-14 08:27:46.538780537 +0000 UTC m=+1.787232254,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.604138 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca7d057aa8068 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:46.542960744 +0000 UTC m=+1.791412381,LastTimestamp:2026-03-14 08:27:46.542960744 +0000 UTC m=+1.791412381,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.609255 4886 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca7d057be7769 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:46.544269161 +0000 UTC m=+1.792720798,LastTimestamp:2026-03-14 08:27:46.544269161 +0000 UTC m=+1.792720798,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.615067 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca7d057d1925e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:46.545521246 +0000 UTC m=+1.793972873,LastTimestamp:2026-03-14 08:27:46.545521246 +0000 UTC m=+1.793972873,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.621225 4886 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca7d0680e87aa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:46.817951658 +0000 UTC m=+2.066403295,LastTimestamp:2026-03-14 08:27:46.817951658 +0000 UTC m=+2.066403295,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.626700 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca7d068a257a0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:46.827638688 +0000 UTC m=+2.076090325,LastTimestamp:2026-03-14 08:27:46.827638688 +0000 UTC m=+2.076090325,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.631066 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca7d068b0eae7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:46.828593895 +0000 UTC m=+2.077045522,LastTimestamp:2026-03-14 08:27:46.828593895 +0000 UTC m=+2.077045522,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.635560 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca7d075481101 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:47.039826177 +0000 UTC m=+2.288277844,LastTimestamp:2026-03-14 08:27:47.039826177 +0000 UTC m=+2.288277844,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.640971 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca7d07629e782 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:47.05462669 +0000 UTC m=+2.303078367,LastTimestamp:2026-03-14 08:27:47.05462669 +0000 UTC m=+2.303078367,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.646511 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca7d0764c02f7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:47.056861943 +0000 UTC m=+2.305313620,LastTimestamp:2026-03-14 08:27:47.056861943 +0000 UTC m=+2.305313620,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.651369 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca7d08313b809 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:47.271276553 +0000 UTC m=+2.519728200,LastTimestamp:2026-03-14 08:27:47.271276553 +0000 UTC 
m=+2.519728200,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.656272 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca7d083f33afa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:47.285924602 +0000 UTC m=+2.534376239,LastTimestamp:2026-03-14 08:27:47.285924602 +0000 UTC m=+2.534376239,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.661482 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca7d08d207aea openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:47.439885034 +0000 UTC m=+2.688336671,LastTimestamp:2026-03-14 08:27:47.439885034 +0000 UTC m=+2.688336671,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.667391 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ca7d08d7b4c5a openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:47.44583689 +0000 UTC m=+2.694288527,LastTimestamp:2026-03-14 08:27:47.44583689 +0000 UTC m=+2.694288527,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.671758 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca7d08f47e42a 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:47.476022314 +0000 UTC m=+2.724473991,LastTimestamp:2026-03-14 08:27:47.476022314 +0000 UTC m=+2.724473991,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.676725 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca7d08f589824 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:47.477116964 +0000 UTC m=+2.725568601,LastTimestamp:2026-03-14 08:27:47.477116964 +0000 UTC m=+2.725568601,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: 
E0314 08:28:09.680824 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca7d09d2d0650 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:47.709142608 +0000 UTC m=+2.957594255,LastTimestamp:2026-03-14 08:27:47.709142608 +0000 UTC m=+2.957594255,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.685959 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ca7d09d4a0d93 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:47.711045011 +0000 UTC m=+2.959496658,LastTimestamp:2026-03-14 08:27:47.711045011 +0000 UTC m=+2.959496658,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.691751 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca7d09d676954 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:47.712969044 +0000 UTC m=+2.961420681,LastTimestamp:2026-03-14 08:27:47.712969044 +0000 UTC m=+2.961420681,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.697306 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ca7d09e094361 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:47.723576161 +0000 UTC m=+2.972027798,LastTimestamp:2026-03-14 08:27:47.723576161 
+0000 UTC m=+2.972027798,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.702964 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca7d09e8db021 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:47.732254753 +0000 UTC m=+2.980706390,LastTimestamp:2026-03-14 08:27:47.732254753 +0000 UTC m=+2.980706390,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.708337 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca7d09e9cf0bb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" 
already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:47.733254331 +0000 UTC m=+2.981705968,LastTimestamp:2026-03-14 08:27:47.733254331 +0000 UTC m=+2.981705968,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.714412 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca7d09ef3ede9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:47.738955241 +0000 UTC m=+2.987406878,LastTimestamp:2026-03-14 08:27:47.738955241 +0000 UTC m=+2.987406878,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.719152 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca7d0a5d61de3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:47.854441955 +0000 UTC m=+3.102893592,LastTimestamp:2026-03-14 08:27:47.854441955 +0000 UTC m=+3.102893592,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.724073 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca7d0a6a2bc91 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:47.867851921 +0000 UTC m=+3.116303558,LastTimestamp:2026-03-14 08:27:47.867851921 +0000 UTC m=+3.116303558,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.728439 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca7d0a6b46a7b 
openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:47.869010555 +0000 UTC m=+3.117462192,LastTimestamp:2026-03-14 08:27:47.869010555 +0000 UTC m=+3.117462192,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.734421 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca7d0ab49115b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:47.945861467 +0000 UTC m=+3.194313104,LastTimestamp:2026-03-14 08:27:47.945861467 +0000 UTC m=+3.194313104,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.742113 4886 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca7d0ac54f7c6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:47.963418566 +0000 UTC m=+3.211870203,LastTimestamp:2026-03-14 08:27:47.963418566 +0000 UTC m=+3.211870203,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.747902 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca7d0ac6ec37c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:47.965109116 +0000 UTC m=+3.213560753,LastTimestamp:2026-03-14 08:27:47.965109116 +0000 UTC m=+3.213560753,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.753403 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca7d0b17bd0d4 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:48.04985058 +0000 UTC m=+3.298302217,LastTimestamp:2026-03-14 08:27:48.04985058 +0000 UTC m=+3.298302217,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.758341 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca7d0b2604e65 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:48.064824933 +0000 UTC 
m=+3.313276570,LastTimestamp:2026-03-14 08:27:48.064824933 +0000 UTC m=+3.313276570,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.763602 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca7d0b26ff527 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:48.065850663 +0000 UTC m=+3.314302300,LastTimestamp:2026-03-14 08:27:48.065850663 +0000 UTC m=+3.314302300,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.768829 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca7d0b7302b0f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:48.145556239 +0000 UTC m=+3.394007876,LastTimestamp:2026-03-14 08:27:48.145556239 +0000 UTC m=+3.394007876,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.775167 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca7d0b81f0cfa openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:48.161211642 +0000 UTC m=+3.409663279,LastTimestamp:2026-03-14 08:27:48.161211642 +0000 UTC m=+3.409663279,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.781586 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca7d0b84badba openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:48.164136378 +0000 UTC m=+3.412588015,LastTimestamp:2026-03-14 08:27:48.164136378 +0000 UTC m=+3.412588015,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.787889 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca7d0bce4906c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:48.241264748 +0000 UTC m=+3.489716385,LastTimestamp:2026-03-14 08:27:48.241264748 +0000 UTC m=+3.489716385,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.793724 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ca7d0bda6490f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:48.253960463 +0000 UTC m=+3.502412090,LastTimestamp:2026-03-14 08:27:48.253960463 +0000 UTC m=+3.502412090,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.798329 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca7d0c173a41a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:48.317750298 +0000 UTC 
m=+3.566201935,LastTimestamp:2026-03-14 08:27:48.317750298 +0000 UTC m=+3.566201935,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.803090 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca7d0c2134a41 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:48.328213057 +0000 UTC m=+3.576664714,LastTimestamp:2026-03-14 08:27:48.328213057 +0000 UTC m=+3.576664714,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.808376 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca7d0c2288ccd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:48.329606349 +0000 UTC m=+3.578057986,LastTimestamp:2026-03-14 08:27:48.329606349 +0000 UTC m=+3.578057986,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.811726 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca7d0cbe34149 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:48.492837193 +0000 UTC m=+3.741288820,LastTimestamp:2026-03-14 08:27:48.492837193 +0000 UTC m=+3.741288820,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.814183 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca7d0cd5c5493 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:48.517549203 +0000 UTC m=+3.766000830,LastTimestamp:2026-03-14 08:27:48.517549203 +0000 UTC m=+3.766000830,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.818175 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca7d0ce3e7fc0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:48.532371392 +0000 UTC m=+3.780823029,LastTimestamp:2026-03-14 08:27:48.532371392 +0000 UTC m=+3.780823029,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.823429 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189ca7d0d671447f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:48.669916287 +0000 UTC m=+3.918367924,LastTimestamp:2026-03-14 08:27:48.669916287 +0000 UTC m=+3.918367924,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.827948 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca7d0d74938d3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:48.684069075 +0000 UTC m=+3.932520712,LastTimestamp:2026-03-14 08:27:48.684069075 +0000 UTC m=+3.932520712,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.833869 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189ca7d1084c3257 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:49.506347607 +0000 UTC m=+4.754799244,LastTimestamp:2026-03-14 08:27:49.506347607 +0000 UTC m=+4.754799244,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.838440 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca7d111332736 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:49.655701302 +0000 UTC m=+4.904152939,LastTimestamp:2026-03-14 08:27:49.655701302 +0000 UTC m=+4.904152939,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.842881 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca7d11196600a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:49.662203914 +0000 UTC m=+4.910655541,LastTimestamp:2026-03-14 08:27:49.662203914 +0000 UTC m=+4.910655541,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.847469 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca7d111a8a8d7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:49.663402199 +0000 UTC m=+4.911853856,LastTimestamp:2026-03-14 08:27:49.663402199 +0000 UTC m=+4.911853856,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.851319 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca7d11d10aeaf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:49.854768815 +0000 UTC m=+5.103220442,LastTimestamp:2026-03-14 08:27:49.854768815 +0000 UTC m=+5.103220442,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.855694 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca7d11dad404d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:49.865029709 +0000 UTC m=+5.113481356,LastTimestamp:2026-03-14 08:27:49.865029709 +0000 UTC m=+5.113481356,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.859844 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189ca7d11dbbaafe openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:49.865974526 +0000 UTC m=+5.114426173,LastTimestamp:2026-03-14 08:27:49.865974526 +0000 UTC m=+5.114426173,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.863967 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca7d12a5f289c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:50.078015644 +0000 UTC m=+5.326467321,LastTimestamp:2026-03-14 08:27:50.078015644 +0000 UTC m=+5.326467321,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.868432 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" 
in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca7d12b24f27d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:50.090977917 +0000 UTC m=+5.339429554,LastTimestamp:2026-03-14 08:27:50.090977917 +0000 UTC m=+5.339429554,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.873352 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca7d12b373a1e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:50.092175902 +0000 UTC m=+5.340627539,LastTimestamp:2026-03-14 08:27:50.092175902 +0000 UTC m=+5.340627539,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.879186 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca7d1377fcb11 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:50.298258193 +0000 UTC m=+5.546709830,LastTimestamp:2026-03-14 08:27:50.298258193 +0000 UTC m=+5.546709830,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.884819 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca7d1386d3847 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:50.313818183 +0000 UTC m=+5.562269850,LastTimestamp:2026-03-14 08:27:50.313818183 +0000 UTC m=+5.562269850,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.890044 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca7d13884b210 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:50.315356688 +0000 UTC m=+5.563808325,LastTimestamp:2026-03-14 08:27:50.315356688 +0000 UTC m=+5.563808325,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.894352 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca7d143d593a2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:50.50520669 +0000 UTC m=+5.753658327,LastTimestamp:2026-03-14 08:27:50.50520669 +0000 UTC m=+5.753658327,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.905144 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ca7d144fca7f3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:50.524545011 +0000 UTC m=+5.772996648,LastTimestamp:2026-03-14 08:27:50.524545011 +0000 UTC m=+5.772996648,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.912206 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 14 08:28:09 crc kubenswrapper[4886]: &Event{ObjectMeta:{kube-controller-manager-crc.189ca7d1fdaa295f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 14 08:28:09 crc kubenswrapper[4886]: body: Mar 14 08:28:09 crc kubenswrapper[4886]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:53.622923615 +0000 UTC m=+8.871375282,LastTimestamp:2026-03-14 08:27:53.622923615 +0000 UTC m=+8.871375282,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 14 08:28:09 crc kubenswrapper[4886]: > Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.914578 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca7d1fdac2a47 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:53.623054919 +0000 UTC m=+8.871506596,LastTimestamp:2026-03-14 08:27:53.623054919 +0000 UTC m=+8.871506596,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.917199 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 14 08:28:09 crc kubenswrapper[4886]: &Event{ObjectMeta:{kube-apiserver-crc.189ca7d35368788e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 14 08:28:09 crc kubenswrapper[4886]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 14 08:28:09 crc kubenswrapper[4886]: Mar 14 08:28:09 crc kubenswrapper[4886]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:59.356426382 +0000 UTC m=+14.604878039,LastTimestamp:2026-03-14 08:27:59.356426382 +0000 UTC m=+14.604878039,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 14 08:28:09 crc kubenswrapper[4886]: > Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.920605 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca7d353693370 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:59.356474224 +0000 UTC m=+14.604925871,LastTimestamp:2026-03-14 08:27:59.356474224 +0000 UTC m=+14.604925871,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.922656 4886 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ca7d35368788e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 14 08:28:09 crc kubenswrapper[4886]: &Event{ObjectMeta:{kube-apiserver-crc.189ca7d35368788e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 14 08:28:09 crc kubenswrapper[4886]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 14 08:28:09 crc kubenswrapper[4886]: Mar 14 08:28:09 crc kubenswrapper[4886]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:59.356426382 +0000 UTC m=+14.604878039,LastTimestamp:2026-03-14 08:27:59.36278558 +0000 UTC m=+14.611237227,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 14 08:28:09 crc kubenswrapper[4886]: > Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.927380 4886 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ca7d353693370\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca7d353693370 openshift-kube-apiserver 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:59.356474224 +0000 UTC m=+14.604925871,LastTimestamp:2026-03-14 08:27:59.362824392 +0000 UTC m=+14.611276039,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.932079 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 14 08:28:09 crc kubenswrapper[4886]: &Event{ObjectMeta:{kube-apiserver-crc.189ca7d3546e18c1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Mar 14 08:28:09 crc kubenswrapper[4886]: body: Mar 14 08:28:09 crc kubenswrapper[4886]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:59.373572289 +0000 UTC m=+14.622023936,LastTimestamp:2026-03-14 08:27:59.373572289 +0000 UTC m=+14.622023936,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 14 08:28:09 crc kubenswrapper[4886]: > Mar 14 08:28:09 crc 
kubenswrapper[4886]: E0314 08:28:09.936487 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca7d3546eafb2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:59.37361093 +0000 UTC m=+14.622062577,LastTimestamp:2026-03-14 08:27:59.37361093 +0000 UTC m=+14.622062577,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.940580 4886 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ca7d0c2288ccd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ca7d0c2288ccd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:48.329606349 +0000 UTC m=+3.578057986,LastTimestamp:2026-03-14 08:27:59.560855524 +0000 UTC m=+14.809307161,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.949527 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 14 08:28:09 crc kubenswrapper[4886]: &Event{ObjectMeta:{kube-controller-manager-crc.189ca7d451c104aa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 14 08:28:09 crc kubenswrapper[4886]: body:
Mar 14 08:28:09 crc kubenswrapper[4886]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:28:03.623642282 +0000 UTC m=+18.872094019,LastTimestamp:2026-03-14 08:28:03.623642282 +0000 UTC m=+18.872094019,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 14 08:28:09 crc kubenswrapper[4886]: >
Mar 14 08:28:09 crc kubenswrapper[4886]: E0314 08:28:09.955162 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca7d451c29219 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:28:03.623744025 +0000 UTC m=+18.872195702,LastTimestamp:2026-03-14 08:28:03.623744025 +0000 UTC m=+18.872195702,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 14 08:28:10 crc kubenswrapper[4886]: I0314 08:28:10.351433 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 08:28:11 crc kubenswrapper[4886]: I0314 08:28:11.353305 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 08:28:11 crc kubenswrapper[4886]: W0314 08:28:11.437593 4886 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Mar 14 08:28:11 crc kubenswrapper[4886]: E0314 08:28:11.437677 4886 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 14 08:28:11 crc kubenswrapper[4886]: W0314 08:28:11.635329 4886 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 14 08:28:11 crc kubenswrapper[4886]: E0314 08:28:11.635393 4886 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 14 08:28:12 crc kubenswrapper[4886]: I0314 08:28:12.352846 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 08:28:12 crc kubenswrapper[4886]: I0314 08:28:12.769674 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 08:28:12 crc kubenswrapper[4886]: E0314 08:28:12.770062 4886 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 14 08:28:12 crc kubenswrapper[4886]: I0314 08:28:12.771527 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 08:28:12 crc kubenswrapper[4886]: I0314 08:28:12.771576 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 08:28:12 crc kubenswrapper[4886]: I0314 08:28:12.771594 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 08:28:12 crc kubenswrapper[4886]: I0314 08:28:12.771622 4886 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 14 08:28:12 crc kubenswrapper[4886]: E0314 08:28:12.779318 4886 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 14 08:28:13 crc kubenswrapper[4886]: I0314 08:28:13.354497 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 08:28:13 crc kubenswrapper[4886]: I0314 08:28:13.622635 4886 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 08:28:13 crc kubenswrapper[4886]: I0314 08:28:13.622721 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 08:28:13 crc kubenswrapper[4886]: I0314 08:28:13.622790 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 14 08:28:13 crc kubenswrapper[4886]: I0314 08:28:13.622967 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 08:28:13 crc kubenswrapper[4886]: I0314 08:28:13.624250 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 08:28:13 crc kubenswrapper[4886]: I0314 08:28:13.624279 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 08:28:13 crc kubenswrapper[4886]: I0314 08:28:13.624288 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 08:28:13 crc kubenswrapper[4886]: I0314 08:28:13.624730 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"6c0cc373bc6498d0116c78eb0a9a7a08a6eca9c7d74f84e596595b9e40e1cde9"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 14 08:28:13 crc kubenswrapper[4886]: I0314 08:28:13.624888 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://6c0cc373bc6498d0116c78eb0a9a7a08a6eca9c7d74f84e596595b9e40e1cde9" gracePeriod=30
Mar 14 08:28:13 crc kubenswrapper[4886]: E0314 08:28:13.630550 4886 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ca7d451c104aa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 14 08:28:13 crc kubenswrapper[4886]: &Event{ObjectMeta:{kube-controller-manager-crc.189ca7d451c104aa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 14 08:28:13 crc kubenswrapper[4886]: body:
Mar 14 08:28:13 crc kubenswrapper[4886]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:28:03.623642282 +0000 UTC m=+18.872094019,LastTimestamp:2026-03-14 08:28:13.622700113 +0000 UTC m=+28.871151760,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 14 08:28:13 crc kubenswrapper[4886]: >
Mar 14 08:28:13 crc kubenswrapper[4886]: E0314 08:28:13.637202 4886 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ca7d451c29219\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca7d451c29219 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:28:03.623744025 +0000 UTC m=+18.872195702,LastTimestamp:2026-03-14 08:28:13.622755055 +0000 UTC m=+28.871206712,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 14 08:28:13 crc kubenswrapper[4886]: E0314 08:28:13.643045 4886 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca7d6a5dfbaec openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:28:13.624875756 +0000 UTC m=+28.873327393,LastTimestamp:2026-03-14 08:28:13.624875756 +0000 UTC m=+28.873327393,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 14 08:28:13 crc kubenswrapper[4886]: E0314 08:28:13.757654 4886 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ca7d056eccdd7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca7d056eccdd7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:46.530528727 +0000 UTC m=+1.778980364,LastTimestamp:2026-03-14 08:28:13.748905869 +0000 UTC m=+28.997357536,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 14 08:28:13 crc kubenswrapper[4886]: E0314 08:28:13.986105 4886 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ca7d0680e87aa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca7d0680e87aa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:46.817951658 +0000 UTC m=+2.066403295,LastTimestamp:2026-03-14 08:28:13.977893219 +0000 UTC m=+29.226344896,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 14 08:28:13 crc kubenswrapper[4886]: E0314 08:28:13.997270 4886 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ca7d068a257a0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca7d068a257a0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:27:46.827638688 +0000 UTC m=+2.076090325,LastTimestamp:2026-03-14 08:28:13.989363127 +0000 UTC m=+29.237814774,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 14 08:28:14 crc kubenswrapper[4886]: I0314 08:28:14.349336 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 08:28:14 crc kubenswrapper[4886]: I0314 08:28:14.622418 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 14 08:28:14 crc kubenswrapper[4886]: I0314 08:28:14.623046 4886 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="6c0cc373bc6498d0116c78eb0a9a7a08a6eca9c7d74f84e596595b9e40e1cde9" exitCode=255
Mar 14 08:28:14 crc kubenswrapper[4886]: I0314 08:28:14.623182 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"6c0cc373bc6498d0116c78eb0a9a7a08a6eca9c7d74f84e596595b9e40e1cde9"}
Mar 14 08:28:14 crc kubenswrapper[4886]: I0314 08:28:14.623265 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cef593fe562e593507911df078ec0ac9712485cacadd01fb9025daf79980bccf"}
Mar 14 08:28:14 crc kubenswrapper[4886]: I0314 08:28:14.623440 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 08:28:14 crc kubenswrapper[4886]: I0314 08:28:14.625037 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 08:28:14 crc kubenswrapper[4886]: I0314 08:28:14.625086 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 08:28:14 crc kubenswrapper[4886]: I0314 08:28:14.625114 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 08:28:15 crc kubenswrapper[4886]: W0314 08:28:15.181209 4886 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 14 08:28:15 crc kubenswrapper[4886]: E0314 08:28:15.181293 4886 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 14 08:28:15 crc kubenswrapper[4886]: I0314 08:28:15.353200 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 08:28:15 crc kubenswrapper[4886]: E0314 08:28:15.483399 4886 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 14 08:28:16 crc kubenswrapper[4886]: I0314 08:28:16.352987 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 08:28:17 crc kubenswrapper[4886]: I0314 08:28:17.352462 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 08:28:18 crc kubenswrapper[4886]: I0314 08:28:18.161294 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 14 08:28:18 crc kubenswrapper[4886]: I0314 08:28:18.161579 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 08:28:18 crc kubenswrapper[4886]: I0314 08:28:18.163948 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 08:28:18 crc kubenswrapper[4886]: I0314 08:28:18.164024 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 08:28:18 crc kubenswrapper[4886]: I0314 08:28:18.164042 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 08:28:18 crc kubenswrapper[4886]: I0314 08:28:18.353618 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 08:28:19 crc kubenswrapper[4886]: I0314 08:28:19.352963 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 08:28:19 crc kubenswrapper[4886]: E0314 08:28:19.778505 4886 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 14 08:28:19 crc kubenswrapper[4886]: I0314 08:28:19.779504 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 08:28:19 crc kubenswrapper[4886]: I0314 08:28:19.780927 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 08:28:19 crc kubenswrapper[4886]: I0314 08:28:19.781007 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 08:28:19 crc kubenswrapper[4886]: I0314 08:28:19.781028 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 08:28:19 crc kubenswrapper[4886]: I0314 08:28:19.781076 4886 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 14 08:28:19 crc kubenswrapper[4886]: E0314 08:28:19.786013 4886 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 14 08:28:20 crc kubenswrapper[4886]: I0314 08:28:20.351525 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 08:28:20 crc kubenswrapper[4886]: I0314 08:28:20.623441 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 14 08:28:20 crc kubenswrapper[4886]: I0314 08:28:20.623778 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 08:28:20 crc kubenswrapper[4886]: I0314 08:28:20.625577 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 08:28:20 crc kubenswrapper[4886]: I0314 08:28:20.625628 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 08:28:20 crc kubenswrapper[4886]: I0314 08:28:20.625640 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 08:28:21 crc kubenswrapper[4886]: I0314 08:28:21.355061 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 08:28:22 crc kubenswrapper[4886]: I0314 08:28:22.350085 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 08:28:22 crc kubenswrapper[4886]: W0314 08:28:22.406508 4886 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Mar 14 08:28:22 crc kubenswrapper[4886]: E0314 08:28:22.406581 4886 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 14 08:28:23 crc kubenswrapper[4886]: I0314 08:28:23.350693 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 08:28:23 crc kubenswrapper[4886]: I0314 08:28:23.420725 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 08:28:23 crc kubenswrapper[4886]: I0314 08:28:23.422456 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 08:28:23 crc kubenswrapper[4886]: I0314 08:28:23.422521 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 08:28:23 crc kubenswrapper[4886]: I0314 08:28:23.422540 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 08:28:23 crc kubenswrapper[4886]: I0314 08:28:23.423396 4886 scope.go:117] "RemoveContainer" containerID="4d5cbc09b3c30be4fa855390ec23df68d7438fd5a91453554f35817cedc0681a"
Mar 14 08:28:23 crc kubenswrapper[4886]: I0314 08:28:23.623459 4886 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 08:28:23 crc kubenswrapper[4886]: I0314 08:28:23.623541 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 14 08:28:23 crc kubenswrapper[4886]: E0314 08:28:23.631197 4886 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ca7d451c104aa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 14 08:28:23 crc kubenswrapper[4886]: &Event{ObjectMeta:{kube-controller-manager-crc.189ca7d451c104aa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 14 08:28:23 crc kubenswrapper[4886]: body:
Mar 14 08:28:23 crc kubenswrapper[4886]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:28:03.623642282 +0000 UTC m=+18.872094019,LastTimestamp:2026-03-14 08:28:23.623514508 +0000 UTC m=+38.871966145,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 14 08:28:23 crc kubenswrapper[4886]: >
Mar 14 08:28:23 crc kubenswrapper[4886]: E0314 08:28:23.638314 4886 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ca7d451c29219\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ca7d451c29219 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:28:03.623744025 +0000 UTC m=+18.872195702,LastTimestamp:2026-03-14 08:28:23.62356661 +0000 UTC m=+38.872018237,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 14 08:28:24 crc kubenswrapper[4886]: I0314 08:28:24.349829 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 08:28:24 crc kubenswrapper[4886]: I0314 08:28:24.658016 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 14 08:28:24 crc kubenswrapper[4886]: I0314 08:28:24.660763 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"944d29328dbd563a2eeef19be74858652a0a2bd63a3878354cc8e70448d57bfd"}
Mar 14 08:28:24 crc kubenswrapper[4886]: I0314 08:28:24.660948 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 08:28:24 crc kubenswrapper[4886]: I0314 08:28:24.662094 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 08:28:24 crc kubenswrapper[4886]: I0314 08:28:24.662155 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 08:28:24 crc kubenswrapper[4886]: I0314 08:28:24.662171 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 08:28:25 crc kubenswrapper[4886]: I0314 08:28:25.353217 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 08:28:25 crc kubenswrapper[4886]: E0314 08:28:25.483574 4886 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 14 08:28:25 crc kubenswrapper[4886]: I0314 08:28:25.668957 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 14 08:28:25 crc kubenswrapper[4886]: I0314 08:28:25.669623 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 14 08:28:25 crc kubenswrapper[4886]: I0314 08:28:25.673535 4886 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="944d29328dbd563a2eeef19be74858652a0a2bd63a3878354cc8e70448d57bfd" exitCode=255
Mar 14 08:28:25 crc kubenswrapper[4886]: I0314 08:28:25.673583 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"944d29328dbd563a2eeef19be74858652a0a2bd63a3878354cc8e70448d57bfd"}
Mar 14 08:28:25 crc kubenswrapper[4886]: I0314 08:28:25.673628 4886 scope.go:117] "RemoveContainer" containerID="4d5cbc09b3c30be4fa855390ec23df68d7438fd5a91453554f35817cedc0681a"
Mar 14 08:28:25 crc kubenswrapper[4886]: I0314 08:28:25.674022 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 08:28:25 crc kubenswrapper[4886]: I0314 08:28:25.675655 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 08:28:25 crc kubenswrapper[4886]: I0314 08:28:25.675691 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 08:28:25 crc kubenswrapper[4886]: I0314 08:28:25.675708 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 08:28:25 crc kubenswrapper[4886]: I0314 08:28:25.676571 4886 scope.go:117] "RemoveContainer" containerID="944d29328dbd563a2eeef19be74858652a0a2bd63a3878354cc8e70448d57bfd"
Mar 14 08:28:25 crc kubenswrapper[4886]: E0314 08:28:25.676863 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 14 08:28:26 crc kubenswrapper[4886]: I0314 08:28:26.351443 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 08:28:26 crc kubenswrapper[4886]: W0314 08:28:26.480024 4886 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 14 08:28:26 crc kubenswrapper[4886]: E0314 08:28:26.480319 4886 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 14 08:28:26 crc kubenswrapper[4886]: I0314 08:28:26.677608 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 14 08:28:26 crc kubenswrapper[4886]: I0314 08:28:26.787115 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 08:28:26 crc kubenswrapper[4886]: E0314 08:28:26.787678 4886 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 14 08:28:26 crc kubenswrapper[4886]: I0314 08:28:26.788295 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 08:28:26 crc kubenswrapper[4886]: I0314 08:28:26.788369 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 08:28:26 crc kubenswrapper[4886]: I0314 08:28:26.788394 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 08:28:26 crc kubenswrapper[4886]: I0314 08:28:26.788435 4886 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 14 08:28:26 crc kubenswrapper[4886]: E0314 08:28:26.795324 4886 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 14 08:28:27 crc kubenswrapper[4886]: I0314 08:28:27.350543 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 08:28:28 crc kubenswrapper[4886]: I0314 08:28:28.351446 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 08:28:29 crc kubenswrapper[4886]: I0314 08:28:29.353079 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 14 08:28:29 crc kubenswrapper[4886]: I0314 08:28:29.373034 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 14 08:28:29 crc kubenswrapper[4886]: I0314 08:28:29.373276 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 14 08:28:29 crc kubenswrapper[4886]: I0314 08:28:29.374881 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 08:28:29 crc kubenswrapper[4886]: I0314 08:28:29.374935 4886 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:28:29 crc kubenswrapper[4886]: I0314 08:28:29.374946 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:28:29 crc kubenswrapper[4886]: I0314 08:28:29.375544 4886 scope.go:117] "RemoveContainer" containerID="944d29328dbd563a2eeef19be74858652a0a2bd63a3878354cc8e70448d57bfd" Mar 14 08:28:29 crc kubenswrapper[4886]: E0314 08:28:29.375725 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 08:28:29 crc kubenswrapper[4886]: W0314 08:28:29.736961 4886 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 14 08:28:29 crc kubenswrapper[4886]: E0314 08:28:29.737606 4886 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 14 08:28:30 crc kubenswrapper[4886]: I0314 08:28:30.349495 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:28:30 crc kubenswrapper[4886]: W0314 08:28:30.893507 4886 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" 
cannot list resource "nodes" in API group "" at the cluster scope Mar 14 08:28:30 crc kubenswrapper[4886]: E0314 08:28:30.893597 4886 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 14 08:28:31 crc kubenswrapper[4886]: I0314 08:28:31.352494 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:28:31 crc kubenswrapper[4886]: I0314 08:28:31.444955 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:28:31 crc kubenswrapper[4886]: I0314 08:28:31.445314 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:28:31 crc kubenswrapper[4886]: I0314 08:28:31.447073 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:28:31 crc kubenswrapper[4886]: I0314 08:28:31.447166 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:28:31 crc kubenswrapper[4886]: I0314 08:28:31.447188 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:28:31 crc kubenswrapper[4886]: I0314 08:28:31.447996 4886 scope.go:117] "RemoveContainer" containerID="944d29328dbd563a2eeef19be74858652a0a2bd63a3878354cc8e70448d57bfd" Mar 14 08:28:31 crc kubenswrapper[4886]: E0314 08:28:31.448384 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s 
restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 08:28:32 crc kubenswrapper[4886]: I0314 08:28:32.353193 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:28:33 crc kubenswrapper[4886]: I0314 08:28:33.353693 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:28:33 crc kubenswrapper[4886]: I0314 08:28:33.622926 4886 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 08:28:33 crc kubenswrapper[4886]: I0314 08:28:33.623057 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 14 08:28:33 crc kubenswrapper[4886]: E0314 08:28:33.627776 4886 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ca7d451c104aa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event=< Mar 14 08:28:33 crc kubenswrapper[4886]: &Event{ObjectMeta:{kube-controller-manager-crc.189ca7d451c104aa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 14 08:28:33 crc kubenswrapper[4886]: body: Mar 14 08:28:33 crc kubenswrapper[4886]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:28:03.623642282 +0000 UTC m=+18.872094019,LastTimestamp:2026-03-14 08:28:33.623021882 +0000 UTC m=+48.871473569,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 14 08:28:33 crc kubenswrapper[4886]: > Mar 14 08:28:33 crc kubenswrapper[4886]: I0314 08:28:33.795480 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:28:33 crc kubenswrapper[4886]: E0314 08:28:33.796436 4886 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 14 08:28:33 crc kubenswrapper[4886]: I0314 08:28:33.796835 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:28:33 crc kubenswrapper[4886]: I0314 08:28:33.796890 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 
08:28:33 crc kubenswrapper[4886]: I0314 08:28:33.796901 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:28:33 crc kubenswrapper[4886]: I0314 08:28:33.796949 4886 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 08:28:33 crc kubenswrapper[4886]: E0314 08:28:33.804043 4886 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 14 08:28:33 crc kubenswrapper[4886]: I0314 08:28:33.884854 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 14 08:28:33 crc kubenswrapper[4886]: I0314 08:28:33.885092 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:28:33 crc kubenswrapper[4886]: I0314 08:28:33.886712 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:28:33 crc kubenswrapper[4886]: I0314 08:28:33.886763 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:28:33 crc kubenswrapper[4886]: I0314 08:28:33.886781 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:28:34 crc kubenswrapper[4886]: I0314 08:28:34.351624 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:28:35 crc kubenswrapper[4886]: I0314 08:28:35.353092 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource 
"csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:28:35 crc kubenswrapper[4886]: E0314 08:28:35.483806 4886 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 08:28:36 crc kubenswrapper[4886]: I0314 08:28:36.353823 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:28:37 crc kubenswrapper[4886]: I0314 08:28:37.350607 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:28:38 crc kubenswrapper[4886]: I0314 08:28:38.352158 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:28:39 crc kubenswrapper[4886]: I0314 08:28:39.351066 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:28:40 crc kubenswrapper[4886]: I0314 08:28:40.349704 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:28:40 crc kubenswrapper[4886]: I0314 08:28:40.631042 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:28:40 crc 
kubenswrapper[4886]: I0314 08:28:40.631314 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:28:40 crc kubenswrapper[4886]: I0314 08:28:40.632805 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:28:40 crc kubenswrapper[4886]: I0314 08:28:40.632868 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:28:40 crc kubenswrapper[4886]: I0314 08:28:40.632889 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:28:40 crc kubenswrapper[4886]: I0314 08:28:40.635803 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:28:40 crc kubenswrapper[4886]: I0314 08:28:40.716326 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:28:40 crc kubenswrapper[4886]: I0314 08:28:40.717335 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:28:40 crc kubenswrapper[4886]: I0314 08:28:40.717414 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:28:40 crc kubenswrapper[4886]: I0314 08:28:40.717443 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:28:40 crc kubenswrapper[4886]: E0314 08:28:40.803811 4886 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 14 08:28:40 crc kubenswrapper[4886]: I0314 08:28:40.804180 4886 kubelet_node_status.go:401] "Setting node annotation 
to enable volume controller attach/detach" Mar 14 08:28:40 crc kubenswrapper[4886]: I0314 08:28:40.805816 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:28:40 crc kubenswrapper[4886]: I0314 08:28:40.805886 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:28:40 crc kubenswrapper[4886]: I0314 08:28:40.805911 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:28:40 crc kubenswrapper[4886]: I0314 08:28:40.805963 4886 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 08:28:40 crc kubenswrapper[4886]: E0314 08:28:40.810660 4886 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 14 08:28:41 crc kubenswrapper[4886]: I0314 08:28:41.352605 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:28:42 crc kubenswrapper[4886]: I0314 08:28:42.350335 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:28:42 crc kubenswrapper[4886]: I0314 08:28:42.420190 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:28:42 crc kubenswrapper[4886]: I0314 08:28:42.421399 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:28:42 crc kubenswrapper[4886]: I0314 08:28:42.421434 4886 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:28:42 crc kubenswrapper[4886]: I0314 08:28:42.421444 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:28:42 crc kubenswrapper[4886]: I0314 08:28:42.421996 4886 scope.go:117] "RemoveContainer" containerID="944d29328dbd563a2eeef19be74858652a0a2bd63a3878354cc8e70448d57bfd" Mar 14 08:28:42 crc kubenswrapper[4886]: E0314 08:28:42.422178 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 08:28:43 crc kubenswrapper[4886]: I0314 08:28:43.349264 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:28:44 crc kubenswrapper[4886]: I0314 08:28:44.351781 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:28:45 crc kubenswrapper[4886]: I0314 08:28:45.351863 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:28:45 crc kubenswrapper[4886]: E0314 08:28:45.483900 4886 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: 
node \"crc\" not found" Mar 14 08:28:46 crc kubenswrapper[4886]: I0314 08:28:46.349424 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:28:47 crc kubenswrapper[4886]: I0314 08:28:47.351857 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:28:47 crc kubenswrapper[4886]: E0314 08:28:47.808801 4886 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 14 08:28:47 crc kubenswrapper[4886]: I0314 08:28:47.811075 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:28:47 crc kubenswrapper[4886]: I0314 08:28:47.813980 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:28:47 crc kubenswrapper[4886]: I0314 08:28:47.814023 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:28:47 crc kubenswrapper[4886]: I0314 08:28:47.814035 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:28:47 crc kubenswrapper[4886]: I0314 08:28:47.814067 4886 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 08:28:47 crc kubenswrapper[4886]: E0314 08:28:47.817507 4886 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" 
in API group \"\" at the cluster scope" node="crc" Mar 14 08:28:48 crc kubenswrapper[4886]: I0314 08:28:48.351225 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:28:49 crc kubenswrapper[4886]: I0314 08:28:49.349987 4886 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 14 08:28:49 crc kubenswrapper[4886]: I0314 08:28:49.855899 4886 csr.go:261] certificate signing request csr-klt54 is approved, waiting to be issued Mar 14 08:28:49 crc kubenswrapper[4886]: I0314 08:28:49.867417 4886 csr.go:257] certificate signing request csr-klt54 is issued Mar 14 08:28:49 crc kubenswrapper[4886]: I0314 08:28:49.955400 4886 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 14 08:28:50 crc kubenswrapper[4886]: I0314 08:28:50.196929 4886 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 14 08:28:50 crc kubenswrapper[4886]: I0314 08:28:50.869244 4886 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-17 21:57:05.307784783 +0000 UTC Mar 14 08:28:50 crc kubenswrapper[4886]: I0314 08:28:50.869293 4886 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7429h28m14.438493875s for next certificate rotation Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.420660 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.421860 4886 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.421909 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.421926 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.422736 4886 scope.go:117] "RemoveContainer" containerID="944d29328dbd563a2eeef19be74858652a0a2bd63a3878354cc8e70448d57bfd" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.754064 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.756443 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e"} Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.756629 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.757595 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.757643 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.757660 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.818186 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:28:54 crc 
kubenswrapper[4886]: I0314 08:28:54.819563 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.819598 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.819608 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.819716 4886 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.825877 4886 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.826232 4886 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 14 08:28:54 crc kubenswrapper[4886]: E0314 08:28:54.826272 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.829322 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.829358 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.829370 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.829388 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.829401 4886 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:28:54Z","lastTransitionTime":"2026-03-14T08:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:28:54 crc kubenswrapper[4886]: E0314 08:28:54.839957 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.846645 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.846688 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.846700 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.846715 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.846728 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:28:54Z","lastTransitionTime":"2026-03-14T08:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:28:54 crc kubenswrapper[4886]: E0314 08:28:54.854946 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.861044 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.861083 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.861095 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.861138 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.861151 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:28:54Z","lastTransitionTime":"2026-03-14T08:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:28:54 crc kubenswrapper[4886]: E0314 08:28:54.869978 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.876507 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.876547 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.876559 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.876571 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:28:54 crc kubenswrapper[4886]: I0314 08:28:54.876579 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:28:54Z","lastTransitionTime":"2026-03-14T08:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:28:54 crc kubenswrapper[4886]: E0314 08:28:54.886849 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:28:54 crc kubenswrapper[4886]: E0314 08:28:54.887027 4886 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 08:28:54 crc kubenswrapper[4886]: E0314 08:28:54.887066 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:54 crc kubenswrapper[4886]: E0314 08:28:54.987294 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:55 crc kubenswrapper[4886]: E0314 08:28:55.088207 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:55 crc kubenswrapper[4886]: E0314 08:28:55.189195 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:55 crc kubenswrapper[4886]: E0314 08:28:55.289435 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:55 crc kubenswrapper[4886]: E0314 08:28:55.390182 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:55 crc kubenswrapper[4886]: E0314 08:28:55.484061 4886 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 08:28:55 crc kubenswrapper[4886]: E0314 08:28:55.490866 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:55 crc kubenswrapper[4886]: E0314 08:28:55.591932 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:55 crc kubenswrapper[4886]: 
E0314 08:28:55.692269 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:55 crc kubenswrapper[4886]: I0314 08:28:55.765001 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 14 08:28:55 crc kubenswrapper[4886]: I0314 08:28:55.765441 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 14 08:28:55 crc kubenswrapper[4886]: I0314 08:28:55.767047 4886 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e" exitCode=255 Mar 14 08:28:55 crc kubenswrapper[4886]: I0314 08:28:55.767093 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e"} Mar 14 08:28:55 crc kubenswrapper[4886]: I0314 08:28:55.767153 4886 scope.go:117] "RemoveContainer" containerID="944d29328dbd563a2eeef19be74858652a0a2bd63a3878354cc8e70448d57bfd" Mar 14 08:28:55 crc kubenswrapper[4886]: I0314 08:28:55.767311 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:28:55 crc kubenswrapper[4886]: I0314 08:28:55.768225 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:28:55 crc kubenswrapper[4886]: I0314 08:28:55.768244 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:28:55 crc kubenswrapper[4886]: I0314 08:28:55.768254 4886 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 14 08:28:55 crc kubenswrapper[4886]: I0314 08:28:55.768798 4886 scope.go:117] "RemoveContainer" containerID="a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e" Mar 14 08:28:55 crc kubenswrapper[4886]: E0314 08:28:55.768956 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 08:28:55 crc kubenswrapper[4886]: E0314 08:28:55.792837 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:55 crc kubenswrapper[4886]: E0314 08:28:55.893247 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:55 crc kubenswrapper[4886]: E0314 08:28:55.994107 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:56 crc kubenswrapper[4886]: E0314 08:28:56.094840 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:56 crc kubenswrapper[4886]: E0314 08:28:56.195531 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:56 crc kubenswrapper[4886]: E0314 08:28:56.296670 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:56 crc kubenswrapper[4886]: I0314 08:28:56.300615 4886 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 14 08:28:56 crc kubenswrapper[4886]: E0314 08:28:56.397734 4886 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 14 08:28:56 crc kubenswrapper[4886]: E0314 08:28:56.498811 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:56 crc kubenswrapper[4886]: E0314 08:28:56.599746 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:56 crc kubenswrapper[4886]: E0314 08:28:56.700591 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:56 crc kubenswrapper[4886]: I0314 08:28:56.771239 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 14 08:28:56 crc kubenswrapper[4886]: E0314 08:28:56.801721 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:56 crc kubenswrapper[4886]: E0314 08:28:56.902763 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:57 crc kubenswrapper[4886]: E0314 08:28:57.035918 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:57 crc kubenswrapper[4886]: E0314 08:28:57.136924 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:57 crc kubenswrapper[4886]: E0314 08:28:57.237051 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:57 crc kubenswrapper[4886]: E0314 08:28:57.338031 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:57 crc kubenswrapper[4886]: E0314 08:28:57.438716 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 
08:28:57 crc kubenswrapper[4886]: E0314 08:28:57.539273 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:57 crc kubenswrapper[4886]: E0314 08:28:57.639941 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:57 crc kubenswrapper[4886]: E0314 08:28:57.740797 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:57 crc kubenswrapper[4886]: E0314 08:28:57.840908 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:57 crc kubenswrapper[4886]: E0314 08:28:57.941694 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:58 crc kubenswrapper[4886]: E0314 08:28:58.042795 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:58 crc kubenswrapper[4886]: E0314 08:28:58.143523 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:58 crc kubenswrapper[4886]: E0314 08:28:58.243895 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:58 crc kubenswrapper[4886]: E0314 08:28:58.344014 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:58 crc kubenswrapper[4886]: E0314 08:28:58.444998 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:58 crc kubenswrapper[4886]: E0314 08:28:58.545520 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:58 crc kubenswrapper[4886]: E0314 08:28:58.646376 4886 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 14 08:28:58 crc kubenswrapper[4886]: E0314 08:28:58.747138 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:58 crc kubenswrapper[4886]: E0314 08:28:58.847717 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:58 crc kubenswrapper[4886]: E0314 08:28:58.948384 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:59 crc kubenswrapper[4886]: E0314 08:28:59.049337 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:59 crc kubenswrapper[4886]: E0314 08:28:59.149918 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:59 crc kubenswrapper[4886]: E0314 08:28:59.250899 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:59 crc kubenswrapper[4886]: E0314 08:28:59.351256 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:59 crc kubenswrapper[4886]: I0314 08:28:59.373311 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:28:59 crc kubenswrapper[4886]: I0314 08:28:59.373611 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:28:59 crc kubenswrapper[4886]: I0314 08:28:59.375072 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:28:59 crc kubenswrapper[4886]: I0314 08:28:59.375112 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:28:59 crc kubenswrapper[4886]: I0314 08:28:59.375150 4886 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:28:59 crc kubenswrapper[4886]: I0314 08:28:59.375748 4886 scope.go:117] "RemoveContainer" containerID="a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e" Mar 14 08:28:59 crc kubenswrapper[4886]: E0314 08:28:59.375896 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 08:28:59 crc kubenswrapper[4886]: E0314 08:28:59.451556 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:59 crc kubenswrapper[4886]: E0314 08:28:59.552033 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:59 crc kubenswrapper[4886]: E0314 08:28:59.653067 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:59 crc kubenswrapper[4886]: E0314 08:28:59.754254 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:59 crc kubenswrapper[4886]: E0314 08:28:59.854753 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:28:59 crc kubenswrapper[4886]: E0314 08:28:59.954895 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:00 crc kubenswrapper[4886]: E0314 08:29:00.055832 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:00 crc kubenswrapper[4886]: E0314 
08:29:00.156696 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:00 crc kubenswrapper[4886]: E0314 08:29:00.257452 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:00 crc kubenswrapper[4886]: E0314 08:29:00.357877 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:00 crc kubenswrapper[4886]: E0314 08:29:00.458933 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:00 crc kubenswrapper[4886]: E0314 08:29:00.559581 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:00 crc kubenswrapper[4886]: E0314 08:29:00.659704 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:00 crc kubenswrapper[4886]: E0314 08:29:00.759862 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:00 crc kubenswrapper[4886]: E0314 08:29:00.860260 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:00 crc kubenswrapper[4886]: E0314 08:29:00.960358 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:01 crc kubenswrapper[4886]: E0314 08:29:01.061284 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:01 crc kubenswrapper[4886]: E0314 08:29:01.161602 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:01 crc kubenswrapper[4886]: E0314 08:29:01.262471 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 
08:29:01 crc kubenswrapper[4886]: E0314 08:29:01.362919 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:01 crc kubenswrapper[4886]: I0314 08:29:01.444145 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:29:01 crc kubenswrapper[4886]: I0314 08:29:01.444319 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:29:01 crc kubenswrapper[4886]: I0314 08:29:01.445369 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:01 crc kubenswrapper[4886]: I0314 08:29:01.445417 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:01 crc kubenswrapper[4886]: I0314 08:29:01.445429 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:01 crc kubenswrapper[4886]: I0314 08:29:01.446058 4886 scope.go:117] "RemoveContainer" containerID="a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e" Mar 14 08:29:01 crc kubenswrapper[4886]: E0314 08:29:01.446248 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 08:29:01 crc kubenswrapper[4886]: E0314 08:29:01.463856 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:01 crc kubenswrapper[4886]: E0314 08:29:01.564666 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" 
not found" Mar 14 08:29:01 crc kubenswrapper[4886]: E0314 08:29:01.665370 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:01 crc kubenswrapper[4886]: E0314 08:29:01.766259 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:01 crc kubenswrapper[4886]: E0314 08:29:01.866606 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:01 crc kubenswrapper[4886]: E0314 08:29:01.967545 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:02 crc kubenswrapper[4886]: E0314 08:29:02.068685 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:02 crc kubenswrapper[4886]: E0314 08:29:02.169454 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:02 crc kubenswrapper[4886]: E0314 08:29:02.270170 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:02 crc kubenswrapper[4886]: E0314 08:29:02.370533 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:02 crc kubenswrapper[4886]: I0314 08:29:02.420699 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:29:02 crc kubenswrapper[4886]: I0314 08:29:02.422548 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:02 crc kubenswrapper[4886]: I0314 08:29:02.422618 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:02 crc kubenswrapper[4886]: I0314 08:29:02.422638 4886 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:02 crc kubenswrapper[4886]: E0314 08:29:02.471319 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:02 crc kubenswrapper[4886]: E0314 08:29:02.571725 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:02 crc kubenswrapper[4886]: E0314 08:29:02.672557 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:02 crc kubenswrapper[4886]: E0314 08:29:02.773092 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:02 crc kubenswrapper[4886]: E0314 08:29:02.873415 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:02 crc kubenswrapper[4886]: E0314 08:29:02.974353 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:03 crc kubenswrapper[4886]: E0314 08:29:03.075338 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:03 crc kubenswrapper[4886]: E0314 08:29:03.175540 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:03 crc kubenswrapper[4886]: E0314 08:29:03.276277 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:03 crc kubenswrapper[4886]: E0314 08:29:03.376842 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:03 crc kubenswrapper[4886]: E0314 08:29:03.477467 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:03 crc kubenswrapper[4886]: E0314 08:29:03.577956 4886 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:03 crc kubenswrapper[4886]: E0314 08:29:03.679088 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:03 crc kubenswrapper[4886]: E0314 08:29:03.780180 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:03 crc kubenswrapper[4886]: E0314 08:29:03.880313 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:03 crc kubenswrapper[4886]: E0314 08:29:03.980521 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:04 crc kubenswrapper[4886]: E0314 08:29:04.081650 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:04 crc kubenswrapper[4886]: E0314 08:29:04.182163 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:04 crc kubenswrapper[4886]: E0314 08:29:04.282966 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:04 crc kubenswrapper[4886]: E0314 08:29:04.383098 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:04 crc kubenswrapper[4886]: E0314 08:29:04.483250 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:04 crc kubenswrapper[4886]: E0314 08:29:04.583784 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:04 crc kubenswrapper[4886]: E0314 08:29:04.685015 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:04 crc 
kubenswrapper[4886]: E0314 08:29:04.785897 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:04 crc kubenswrapper[4886]: E0314 08:29:04.887027 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:04 crc kubenswrapper[4886]: E0314 08:29:04.988084 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:05 crc kubenswrapper[4886]: E0314 08:29:05.083346 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 14 08:29:05 crc kubenswrapper[4886]: I0314 08:29:05.086743 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:05 crc kubenswrapper[4886]: I0314 08:29:05.086817 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:05 crc kubenswrapper[4886]: I0314 08:29:05.086834 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:05 crc kubenswrapper[4886]: I0314 08:29:05.086867 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:05 crc kubenswrapper[4886]: I0314 08:29:05.086886 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:05Z","lastTransitionTime":"2026-03-14T08:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:05 crc kubenswrapper[4886]: E0314 08:29:05.096723 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:05 crc kubenswrapper[4886]: I0314 08:29:05.100450 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:05 crc kubenswrapper[4886]: I0314 08:29:05.100483 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:05 crc kubenswrapper[4886]: I0314 08:29:05.100496 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:05 crc kubenswrapper[4886]: I0314 08:29:05.100510 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:05 crc kubenswrapper[4886]: I0314 08:29:05.100521 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:05Z","lastTransitionTime":"2026-03-14T08:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:05 crc kubenswrapper[4886]: E0314 08:29:05.110178 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:05 crc kubenswrapper[4886]: I0314 08:29:05.113177 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:05 crc kubenswrapper[4886]: I0314 08:29:05.113209 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:05 crc kubenswrapper[4886]: I0314 08:29:05.113221 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:05 crc kubenswrapper[4886]: I0314 08:29:05.113236 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:05 crc kubenswrapper[4886]: I0314 08:29:05.113246 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:05Z","lastTransitionTime":"2026-03-14T08:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:05 crc kubenswrapper[4886]: E0314 08:29:05.122582 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:05 crc kubenswrapper[4886]: I0314 08:29:05.126425 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:05 crc kubenswrapper[4886]: I0314 08:29:05.126463 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:05 crc kubenswrapper[4886]: I0314 08:29:05.126472 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:05 crc kubenswrapper[4886]: I0314 08:29:05.126497 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:05 crc kubenswrapper[4886]: I0314 08:29:05.126507 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:05Z","lastTransitionTime":"2026-03-14T08:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:05 crc kubenswrapper[4886]: E0314 08:29:05.136612 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:05 crc kubenswrapper[4886]: E0314 08:29:05.136772 4886 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 08:29:05 crc kubenswrapper[4886]: E0314 08:29:05.136804 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:05 crc kubenswrapper[4886]: E0314 08:29:05.237866 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:05 crc kubenswrapper[4886]: E0314 08:29:05.338339 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:05 crc kubenswrapper[4886]: E0314 08:29:05.439290 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:05 crc kubenswrapper[4886]: E0314 08:29:05.484193 4886 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 08:29:05 crc kubenswrapper[4886]: E0314 08:29:05.539898 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:05 crc kubenswrapper[4886]: E0314 08:29:05.640890 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:05 crc kubenswrapper[4886]: E0314 08:29:05.741325 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:05 crc kubenswrapper[4886]: E0314 08:29:05.842354 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:05 crc kubenswrapper[4886]: 
E0314 08:29:05.942848 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:06 crc kubenswrapper[4886]: E0314 08:29:06.043780 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:06 crc kubenswrapper[4886]: E0314 08:29:06.144787 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:06 crc kubenswrapper[4886]: E0314 08:29:06.245480 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:06 crc kubenswrapper[4886]: E0314 08:29:06.346160 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:06 crc kubenswrapper[4886]: E0314 08:29:06.447038 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:06 crc kubenswrapper[4886]: E0314 08:29:06.547563 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:06 crc kubenswrapper[4886]: E0314 08:29:06.648453 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:06 crc kubenswrapper[4886]: E0314 08:29:06.748978 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:06 crc kubenswrapper[4886]: E0314 08:29:06.849697 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:06 crc kubenswrapper[4886]: E0314 08:29:06.950782 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:07 crc kubenswrapper[4886]: E0314 08:29:07.051678 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 14 08:29:07 crc kubenswrapper[4886]: E0314 08:29:07.151867 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:07 crc kubenswrapper[4886]: E0314 08:29:07.252404 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:07 crc kubenswrapper[4886]: E0314 08:29:07.353402 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:07 crc kubenswrapper[4886]: E0314 08:29:07.454540 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:07 crc kubenswrapper[4886]: E0314 08:29:07.554902 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:07 crc kubenswrapper[4886]: E0314 08:29:07.655910 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:07 crc kubenswrapper[4886]: E0314 08:29:07.756367 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:07 crc kubenswrapper[4886]: E0314 08:29:07.856697 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:07 crc kubenswrapper[4886]: E0314 08:29:07.957691 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:08 crc kubenswrapper[4886]: E0314 08:29:08.058060 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:08 crc kubenswrapper[4886]: E0314 08:29:08.158847 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:08 crc kubenswrapper[4886]: E0314 08:29:08.259713 4886 kubelet_node_status.go:503] "Error getting the current node 
from lister" err="node \"crc\" not found" Mar 14 08:29:08 crc kubenswrapper[4886]: E0314 08:29:08.360693 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:08 crc kubenswrapper[4886]: E0314 08:29:08.461773 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:08 crc kubenswrapper[4886]: E0314 08:29:08.561901 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:08 crc kubenswrapper[4886]: E0314 08:29:08.662415 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:08 crc kubenswrapper[4886]: E0314 08:29:08.762887 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:08 crc kubenswrapper[4886]: E0314 08:29:08.863602 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:08 crc kubenswrapper[4886]: E0314 08:29:08.964719 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:09 crc kubenswrapper[4886]: E0314 08:29:09.065671 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:09 crc kubenswrapper[4886]: E0314 08:29:09.166729 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:09 crc kubenswrapper[4886]: E0314 08:29:09.267271 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:09 crc kubenswrapper[4886]: E0314 08:29:09.367895 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:09 crc kubenswrapper[4886]: E0314 08:29:09.468749 4886 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:09 crc kubenswrapper[4886]: E0314 08:29:09.568961 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:09 crc kubenswrapper[4886]: E0314 08:29:09.669704 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:09 crc kubenswrapper[4886]: E0314 08:29:09.770646 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:09 crc kubenswrapper[4886]: E0314 08:29:09.871078 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:09 crc kubenswrapper[4886]: E0314 08:29:09.972207 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:10 crc kubenswrapper[4886]: E0314 08:29:10.072712 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:10 crc kubenswrapper[4886]: E0314 08:29:10.173364 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:10 crc kubenswrapper[4886]: E0314 08:29:10.273913 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:10 crc kubenswrapper[4886]: E0314 08:29:10.375022 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:10 crc kubenswrapper[4886]: E0314 08:29:10.475605 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:10 crc kubenswrapper[4886]: E0314 08:29:10.576614 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:10 crc 
kubenswrapper[4886]: E0314 08:29:10.676817 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:10 crc kubenswrapper[4886]: E0314 08:29:10.776987 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:10 crc kubenswrapper[4886]: E0314 08:29:10.877890 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:10 crc kubenswrapper[4886]: E0314 08:29:10.978327 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:10 crc kubenswrapper[4886]: I0314 08:29:10.979163 4886 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 14 08:29:11 crc kubenswrapper[4886]: E0314 08:29:11.078918 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:11 crc kubenswrapper[4886]: E0314 08:29:11.179868 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:11 crc kubenswrapper[4886]: E0314 08:29:11.281343 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:11 crc kubenswrapper[4886]: E0314 08:29:11.382530 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:11 crc kubenswrapper[4886]: E0314 08:29:11.484115 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:11 crc kubenswrapper[4886]: E0314 08:29:11.585000 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:11 crc kubenswrapper[4886]: E0314 08:29:11.685890 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found" Mar 14 08:29:11 crc kubenswrapper[4886]: E0314 08:29:11.786587 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:11 crc kubenswrapper[4886]: E0314 08:29:11.886744 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:11 crc kubenswrapper[4886]: E0314 08:29:11.987689 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:12 crc kubenswrapper[4886]: E0314 08:29:12.088625 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:12 crc kubenswrapper[4886]: E0314 08:29:12.189774 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:12 crc kubenswrapper[4886]: E0314 08:29:12.290155 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:12 crc kubenswrapper[4886]: E0314 08:29:12.390696 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:12 crc kubenswrapper[4886]: I0314 08:29:12.420854 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:29:12 crc kubenswrapper[4886]: I0314 08:29:12.422870 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:12 crc kubenswrapper[4886]: I0314 08:29:12.422922 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:12 crc kubenswrapper[4886]: I0314 08:29:12.422940 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:12 crc kubenswrapper[4886]: I0314 08:29:12.423960 4886 scope.go:117] "RemoveContainer" 
containerID="a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e" Mar 14 08:29:12 crc kubenswrapper[4886]: E0314 08:29:12.424385 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 08:29:12 crc kubenswrapper[4886]: E0314 08:29:12.491377 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:12 crc kubenswrapper[4886]: E0314 08:29:12.591869 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:12 crc kubenswrapper[4886]: E0314 08:29:12.692671 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:12 crc kubenswrapper[4886]: E0314 08:29:12.793482 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:12 crc kubenswrapper[4886]: E0314 08:29:12.894177 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:12 crc kubenswrapper[4886]: E0314 08:29:12.995370 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:13 crc kubenswrapper[4886]: E0314 08:29:13.096495 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:13 crc kubenswrapper[4886]: E0314 08:29:13.197029 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:13 crc kubenswrapper[4886]: E0314 08:29:13.297775 4886 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:13 crc kubenswrapper[4886]: E0314 08:29:13.397919 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:13 crc kubenswrapper[4886]: I0314 08:29:13.420522 4886 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 14 08:29:13 crc kubenswrapper[4886]: I0314 08:29:13.422687 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:13 crc kubenswrapper[4886]: I0314 08:29:13.422746 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:13 crc kubenswrapper[4886]: I0314 08:29:13.422759 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:13 crc kubenswrapper[4886]: E0314 08:29:13.498789 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:13 crc kubenswrapper[4886]: E0314 08:29:13.599830 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:13 crc kubenswrapper[4886]: E0314 08:29:13.700653 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:13 crc kubenswrapper[4886]: E0314 08:29:13.801664 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:13 crc kubenswrapper[4886]: E0314 08:29:13.902373 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:14 crc kubenswrapper[4886]: E0314 08:29:14.003371 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:14 crc 
kubenswrapper[4886]: E0314 08:29:14.104355 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:14 crc kubenswrapper[4886]: E0314 08:29:14.205373 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:14 crc kubenswrapper[4886]: E0314 08:29:14.306638 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:14 crc kubenswrapper[4886]: E0314 08:29:14.407688 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:14 crc kubenswrapper[4886]: E0314 08:29:14.508196 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:14 crc kubenswrapper[4886]: E0314 08:29:14.608741 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:14 crc kubenswrapper[4886]: E0314 08:29:14.709041 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:14 crc kubenswrapper[4886]: E0314 08:29:14.809632 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:14 crc kubenswrapper[4886]: E0314 08:29:14.910518 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:15 crc kubenswrapper[4886]: E0314 08:29:15.010813 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:15 crc kubenswrapper[4886]: E0314 08:29:15.111807 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:15 crc kubenswrapper[4886]: E0314 08:29:15.212550 4886 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 14 08:29:15 crc kubenswrapper[4886]: E0314 08:29:15.312765 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:15 crc kubenswrapper[4886]: E0314 08:29:15.336387 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 14 08:29:15 crc kubenswrapper[4886]: I0314 08:29:15.343011 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:15 crc kubenswrapper[4886]: I0314 08:29:15.343071 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:15 crc kubenswrapper[4886]: I0314 08:29:15.343093 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:15 crc kubenswrapper[4886]: I0314 08:29:15.343160 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:15 crc kubenswrapper[4886]: I0314 08:29:15.343189 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:15Z","lastTransitionTime":"2026-03-14T08:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:15 crc kubenswrapper[4886]: E0314 08:29:15.360674 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:15 crc kubenswrapper[4886]: I0314 08:29:15.367272 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:15 crc kubenswrapper[4886]: I0314 08:29:15.367323 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:15 crc kubenswrapper[4886]: I0314 08:29:15.367336 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:15 crc kubenswrapper[4886]: I0314 08:29:15.367357 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:15 crc kubenswrapper[4886]: I0314 08:29:15.367392 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:15Z","lastTransitionTime":"2026-03-14T08:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:15 crc kubenswrapper[4886]: E0314 08:29:15.389029 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:15 crc kubenswrapper[4886]: I0314 08:29:15.398732 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:15 crc kubenswrapper[4886]: I0314 08:29:15.398788 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:15 crc kubenswrapper[4886]: I0314 08:29:15.398810 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:15 crc kubenswrapper[4886]: I0314 08:29:15.398840 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:15 crc kubenswrapper[4886]: I0314 08:29:15.398863 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:15Z","lastTransitionTime":"2026-03-14T08:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:15 crc kubenswrapper[4886]: E0314 08:29:15.416432 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:15 crc kubenswrapper[4886]: I0314 08:29:15.422198 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:15 crc kubenswrapper[4886]: I0314 08:29:15.422269 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:15 crc kubenswrapper[4886]: I0314 08:29:15.422291 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:15 crc kubenswrapper[4886]: I0314 08:29:15.422316 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:15 crc kubenswrapper[4886]: I0314 08:29:15.422335 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:15Z","lastTransitionTime":"2026-03-14T08:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:15 crc kubenswrapper[4886]: E0314 08:29:15.439444 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:15 crc kubenswrapper[4886]: E0314 08:29:15.439697 4886 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 08:29:15 crc kubenswrapper[4886]: E0314 08:29:15.439744 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:15 crc kubenswrapper[4886]: E0314 08:29:15.484318 4886 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 14 08:29:15 crc kubenswrapper[4886]: E0314 08:29:15.540920 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:15 crc kubenswrapper[4886]: E0314 08:29:15.641182 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:15 crc kubenswrapper[4886]: E0314 08:29:15.742086 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:15 crc kubenswrapper[4886]: E0314 08:29:15.842660 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:15 crc kubenswrapper[4886]: E0314 08:29:15.942990 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:16 crc kubenswrapper[4886]: E0314 08:29:16.043786 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:16 crc kubenswrapper[4886]: E0314 08:29:16.144899 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:16 crc kubenswrapper[4886]: 
E0314 08:29:16.245368 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:16 crc kubenswrapper[4886]: E0314 08:29:16.346383 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:16 crc kubenswrapper[4886]: E0314 08:29:16.447101 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:16 crc kubenswrapper[4886]: E0314 08:29:16.547481 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:16 crc kubenswrapper[4886]: E0314 08:29:16.647698 4886 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 14 08:29:16 crc kubenswrapper[4886]: I0314 08:29:16.653829 4886 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 14 08:29:16 crc kubenswrapper[4886]: I0314 08:29:16.750156 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:16 crc kubenswrapper[4886]: I0314 08:29:16.750235 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:16 crc kubenswrapper[4886]: I0314 08:29:16.750262 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:16 crc kubenswrapper[4886]: I0314 08:29:16.750290 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:16 crc kubenswrapper[4886]: I0314 08:29:16.750309 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:16Z","lastTransitionTime":"2026-03-14T08:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:16 crc kubenswrapper[4886]: I0314 08:29:16.853771 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:16 crc kubenswrapper[4886]: I0314 08:29:16.853827 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:16 crc kubenswrapper[4886]: I0314 08:29:16.853849 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:16 crc kubenswrapper[4886]: I0314 08:29:16.853881 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:16 crc kubenswrapper[4886]: I0314 08:29:16.853903 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:16Z","lastTransitionTime":"2026-03-14T08:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:16 crc kubenswrapper[4886]: I0314 08:29:16.957230 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:16 crc kubenswrapper[4886]: I0314 08:29:16.957309 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:16 crc kubenswrapper[4886]: I0314 08:29:16.957330 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:16 crc kubenswrapper[4886]: I0314 08:29:16.957359 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:16 crc kubenswrapper[4886]: I0314 08:29:16.957393 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:16Z","lastTransitionTime":"2026-03-14T08:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.060305 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.060367 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.060386 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.060409 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.060428 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:17Z","lastTransitionTime":"2026-03-14T08:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.163486 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.163543 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.163560 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.163584 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.163601 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:17Z","lastTransitionTime":"2026-03-14T08:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.266540 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.266612 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.266629 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.266654 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.266672 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:17Z","lastTransitionTime":"2026-03-14T08:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.369912 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.369964 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.369981 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.370004 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.370022 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:17Z","lastTransitionTime":"2026-03-14T08:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.388462 4886 apiserver.go:52] "Watching apiserver" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.397283 4886 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.397669 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.398113 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.398276 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.398436 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.398515 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.398656 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 08:29:17 crc kubenswrapper[4886]: E0314 08:29:17.399190 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:29:17 crc kubenswrapper[4886]: E0314 08:29:17.398952 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.399285 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:17 crc kubenswrapper[4886]: E0314 08:29:17.399416 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.402659 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.402706 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.402963 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.403004 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.403545 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.403618 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.404072 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.404235 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.406703 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.457802 4886 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 14 08:29:17 crc kubenswrapper[4886]: 
I0314 08:29:17.458329 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.477552 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.477609 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.477624 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.477652 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.477670 4886 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:17Z","lastTransitionTime":"2026-03-14T08:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.498673 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.515742 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.525589 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.535519 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.545196 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.556518 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.557047 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.557089 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.557549 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" 
(OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.557145 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.557661 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.557654 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.557689 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.557781 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.557811 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.557835 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.557863 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.557891 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.557922 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.557950 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.557969 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.557994 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558024 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558044 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558065 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558086 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558106 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558147 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558171 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558196 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558169 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558217 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558258 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558292 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558323 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558343 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558363 4886 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558383 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558380 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558401 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558475 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558498 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558500 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558519 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558561 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558560 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558579 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558584 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558674 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558695 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558718 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558790 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558805 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558804 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558925 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.559161 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.559360 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.559360 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.559418 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.559585 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.559617 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.559675 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.559774 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.559819 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.559859 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.559879 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.559899 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.558805 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.559941 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.559961 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.559978 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560000 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560020 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" 
(UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560027 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560038 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560036 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560090 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560114 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560148 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560165 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560182 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560199 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560215 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560232 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560219 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560250 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560256 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560269 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560288 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560305 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560322 4886 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560340 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560359 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560378 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560395 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560414 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" 
(UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560433 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560450 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560461 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560467 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560494 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560513 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560530 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560540 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560548 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560564 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560582 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560599 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560615 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560631 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560649 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560645 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560664 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560685 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560701 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560717 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560732 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560751 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560767 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560785 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560801 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560818 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560836 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 14 
08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560853 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560851 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560873 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560893 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560909 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560925 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560940 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560957 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560974 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.560990 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561008 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561025 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561041 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561061 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561077 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561093 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561110 4886 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561145 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561160 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561176 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561191 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561212 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561227 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561244 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561245 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561261 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561277 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561294 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561311 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561328 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561345 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561363 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561381 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561396 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561413 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561417 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). 
InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561428 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561512 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561540 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561639 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561718 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561794 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561829 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561830 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561887 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561915 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561941 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561964 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.561983 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562006 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562030 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562055 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562075 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562098 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562144 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562169 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562197 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562220 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562243 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562264 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562290 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562314 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562338 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562363 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562384 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: 
\"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562407 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562439 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562461 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562485 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562506 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562528 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562553 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562581 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562606 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562627 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562653 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 
08:29:17.562676 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562701 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562723 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562747 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562770 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562794 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562822 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562846 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562874 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562898 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562923 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562948 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562975 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562999 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563023 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563061 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563086 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563109 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563239 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563268 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563292 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563316 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563344 4886 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563374 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563391 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563409 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563429 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563447 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") 
pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563464 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563487 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563503 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563520 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563537 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 14 08:29:17 crc 
kubenswrapper[4886]: I0314 08:29:17.563556 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563573 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563588 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563606 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563623 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563671 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563689 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563706 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563723 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563739 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563756 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 
08:29:17.563777 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563820 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563845 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563863 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563886 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563906 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563926 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563943 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563961 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563985 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564007 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564028 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564050 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564080 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564099 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564245 4886 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564258 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564268 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564279 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564289 4886 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564299 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564308 4886 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564318 4886 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564328 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564337 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564348 4886 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564357 4886 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564367 4886 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564376 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564387 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564398 4886 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564409 4886 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564422 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564440 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564454 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564464 4886 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564474 4886 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564484 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564494 4886 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564503 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564512 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564522 4886 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564532 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: 
\"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564543 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564552 4886 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564561 4886 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564571 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564581 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564597 4886 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564608 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 14 
08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564618 4886 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564627 4886 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564636 4886 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564647 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564657 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564666 4886 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564675 4886 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564685 4886 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562029 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562037 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562891 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.562965 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563110 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563353 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.563485 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564077 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564353 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564370 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564274 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564713 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564694 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564720 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.571696 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: E0314 08:29:17.564741 4886 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564944 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.564977 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.565028 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.565160 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.565533 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.565650 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.565681 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.565829 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.565914 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.566153 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.566206 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.566413 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.566405 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.566581 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.566611 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.566713 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.566785 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.567484 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.567611 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.567775 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.567784 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.568208 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.568258 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.568484 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.568707 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.568703 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.568727 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.568790 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.569230 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.569382 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.569427 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.570171 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.570196 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.570448 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.570562 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.570845 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.571055 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.571100 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.571340 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.571492 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.571534 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.571785 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.571699 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.571987 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.570885 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: E0314 08:29:17.572458 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:29:18.072430963 +0000 UTC m=+93.320882610 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.572977 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.573367 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.573617 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.573828 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.573841 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.573954 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.574156 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.574470 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.574550 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.574637 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.574739 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.574789 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.574814 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.574983 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.575189 4886 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.575495 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.575854 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: E0314 08:29:17.576034 4886 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.576064 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: E0314 08:29:17.576159 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 08:29:18.076110925 +0000 UTC m=+93.324562562 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.576251 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.576522 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.576658 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.573757 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.573880 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 08:29:17 crc kubenswrapper[4886]: E0314 08:29:17.577253 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 08:29:18.077212545 +0000 UTC m=+93.325664222 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.577410 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.577841 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.578596 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.578618 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.578878 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.578991 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.579478 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.579800 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.580107 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.580357 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.580509 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.580897 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.582108 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.582417 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.582670 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.583683 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.583845 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: E0314 08:29:17.584104 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:29:17 crc kubenswrapper[4886]: E0314 08:29:17.584173 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:29:17 crc kubenswrapper[4886]: E0314 08:29:17.584204 4886 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:29:17 crc kubenswrapper[4886]: E0314 08:29:17.584347 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 08:29:18.084322701 +0000 UTC m=+93.332774378 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.584202 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.584572 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.584615 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.584635 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.584665 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.584690 4886 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:17Z","lastTransitionTime":"2026-03-14T08:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.585440 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.585708 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.586255 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.586448 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.586638 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.586741 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.587532 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.588083 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: 
"serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: E0314 08:29:17.588090 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:29:17 crc kubenswrapper[4886]: E0314 08:29:17.588186 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:29:17 crc kubenswrapper[4886]: E0314 08:29:17.588200 4886 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:29:17 crc kubenswrapper[4886]: E0314 08:29:17.588277 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 08:29:18.088264439 +0000 UTC m=+93.336716076 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.588426 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.588788 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.591984 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.592001 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.596209 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.596490 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.596605 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.596728 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.596916 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.596871 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.597235 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.597166 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.597650 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.597676 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.598286 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.598534 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.598604 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.598714 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.598833 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.599768 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.600186 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.600364 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.600301 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.600637 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.600673 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.600777 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.601193 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.601733 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.602500 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.602719 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.602826 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.602932 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.600727 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.603869 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.603877 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.603432 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.603476 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.603943 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.603085 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.604088 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.604227 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.605223 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.605267 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.605501 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.605636 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.605763 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.607397 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.619172 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.625551 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.627564 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.636847 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.641964 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.666078 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.666226 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.666238 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.666466 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.666472 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.666715 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.666792 4886 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.666868 4886 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.666941 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: 
I0314 08:29:17.667015 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.667091 4886 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.667201 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.667282 4886 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.667352 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.667419 4886 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.667493 4886 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.667567 4886 reconciler_common.go:293] "Volume detached for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.667690 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.667767 4886 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.667837 4886 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.667910 4886 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.667984 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.668174 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.668268 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.668342 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.668416 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.668490 4886 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.668566 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.668646 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.668715 4886 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.668787 4886 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") 
on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.668856 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.668930 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.669007 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.669082 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.669183 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.669261 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.669333 4886 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: 
I0314 08:29:17.669401 4886 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.669480 4886 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.669560 4886 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.669637 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.669707 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.669779 4886 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.669859 4886 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc 
kubenswrapper[4886]: I0314 08:29:17.669936 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.670009 4886 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.670087 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.670186 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.670259 4886 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.670333 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.670444 4886 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.670516 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: 
\"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.670589 4886 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.670661 4886 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.670737 4886 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.670809 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.670878 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.670950 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.671026 4886 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc 
kubenswrapper[4886]: I0314 08:29:17.671094 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.671206 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.671280 4886 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.671348 4886 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.671427 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.671496 4886 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.671562 4886 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.671681 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" 
(UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.671771 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.671852 4886 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.671923 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.671992 4886 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.672067 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.672162 4886 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.672251 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath 
\"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.672322 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.672396 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.672468 4886 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.672552 4886 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.672632 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.672703 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.672777 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 
08:29:17.672849 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.672918 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.672991 4886 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.673065 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.673158 4886 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.673231 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.673309 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.673387 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.673458 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.673526 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.673599 4886 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.673669 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.673735 4886 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.673811 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.673888 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.673958 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.674029 4886 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.674101 4886 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.674200 4886 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.674281 4886 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.674356 4886 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.674436 4886 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" 
DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.674510 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.674579 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.674652 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.674720 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.674795 4886 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.674865 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.674941 4886 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc 
kubenswrapper[4886]: I0314 08:29:17.675013 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.675088 4886 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.675180 4886 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.675267 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.675341 4886 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.675419 4886 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.675492 4886 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.675569 4886 reconciler_common.go:293] "Volume detached for 
volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.675671 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.675749 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.675824 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.675899 4886 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.675978 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.676048 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.676148 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.676234 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.676309 4886 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.676382 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.676466 4886 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.676535 4886 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.676608 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.676676 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" 
DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.676749 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.676821 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.676893 4886 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.676961 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.677028 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.677100 4886 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.677189 4886 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.677272 4886 
reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.677348 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.677429 4886 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.677505 4886 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.677573 4886 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.677647 4886 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.677725 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.677792 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.677865 4886 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.677937 4886 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.678012 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.678086 4886 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.678188 4886 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.678261 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.678328 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 
14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.687621 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.687683 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.687703 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.687729 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.687748 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:17Z","lastTransitionTime":"2026-03-14T08:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.715861 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.724471 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 14 08:29:17 crc kubenswrapper[4886]: E0314 08:29:17.733947 4886 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 14 08:29:17 crc kubenswrapper[4886]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 14 08:29:17 crc kubenswrapper[4886]: set -o allexport Mar 14 08:29:17 crc kubenswrapper[4886]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 14 08:29:17 crc kubenswrapper[4886]: source /etc/kubernetes/apiserver-url.env Mar 14 08:29:17 crc kubenswrapper[4886]: else Mar 14 08:29:17 crc kubenswrapper[4886]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 14 08:29:17 crc kubenswrapper[4886]: exit 1 Mar 14 08:29:17 crc kubenswrapper[4886]: fi Mar 14 08:29:17 crc kubenswrapper[4886]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 14 08:29:17 crc kubenswrapper[4886]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 14 08:29:17 crc kubenswrapper[4886]: > logger="UnhandledError" Mar 14 08:29:17 crc kubenswrapper[4886]: E0314 08:29:17.735184 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.739645 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 14 08:29:17 crc kubenswrapper[4886]: E0314 08:29:17.741588 4886 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 14 08:29:17 crc kubenswrapper[4886]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 14 08:29:17 crc kubenswrapper[4886]: if [[ -f "/env/_master" ]]; then Mar 14 08:29:17 crc kubenswrapper[4886]: set -o allexport Mar 14 08:29:17 crc kubenswrapper[4886]: source "/env/_master" Mar 14 08:29:17 crc kubenswrapper[4886]: set +o allexport Mar 14 08:29:17 crc kubenswrapper[4886]: fi Mar 14 08:29:17 crc kubenswrapper[4886]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 14 08:29:17 crc kubenswrapper[4886]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 14 08:29:17 crc kubenswrapper[4886]: ho_enable="--enable-hybrid-overlay" Mar 14 08:29:17 crc kubenswrapper[4886]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 14 08:29:17 crc kubenswrapper[4886]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 14 08:29:17 crc kubenswrapper[4886]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 14 08:29:17 crc kubenswrapper[4886]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 14 08:29:17 crc kubenswrapper[4886]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 14 08:29:17 crc kubenswrapper[4886]: --webhook-host=127.0.0.1 \ Mar 14 08:29:17 crc kubenswrapper[4886]: --webhook-port=9743 \ Mar 14 08:29:17 crc kubenswrapper[4886]: ${ho_enable} \ Mar 14 08:29:17 crc kubenswrapper[4886]: --enable-interconnect \ Mar 14 08:29:17 crc kubenswrapper[4886]: --disable-approver \ Mar 14 
08:29:17 crc kubenswrapper[4886]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 14 08:29:17 crc kubenswrapper[4886]: --wait-for-kubernetes-api=200s \ Mar 14 08:29:17 crc kubenswrapper[4886]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 14 08:29:17 crc kubenswrapper[4886]: --loglevel="${LOGLEVEL}" Mar 14 08:29:17 crc kubenswrapper[4886]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Std
in:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 14 08:29:17 crc kubenswrapper[4886]: > logger="UnhandledError" Mar 14 08:29:17 crc kubenswrapper[4886]: E0314 08:29:17.746477 4886 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 14 08:29:17 crc kubenswrapper[4886]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 14 08:29:17 crc kubenswrapper[4886]: if [[ -f "/env/_master" ]]; then Mar 14 08:29:17 crc kubenswrapper[4886]: set -o allexport Mar 14 08:29:17 crc kubenswrapper[4886]: source "/env/_master" Mar 14 08:29:17 crc kubenswrapper[4886]: set +o allexport Mar 14 08:29:17 crc kubenswrapper[4886]: fi Mar 14 08:29:17 crc kubenswrapper[4886]: Mar 14 08:29:17 crc kubenswrapper[4886]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 14 08:29:17 crc kubenswrapper[4886]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 14 08:29:17 crc kubenswrapper[4886]: --disable-webhook \ Mar 14 08:29:17 crc kubenswrapper[4886]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 14 08:29:17 crc kubenswrapper[4886]: --loglevel="${LOGLEVEL}" Mar 14 08:29:17 crc kubenswrapper[4886]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 14 08:29:17 crc kubenswrapper[4886]: > logger="UnhandledError" Mar 14 08:29:17 crc kubenswrapper[4886]: E0314 08:29:17.747642 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 14 08:29:17 crc kubenswrapper[4886]: W0314 08:29:17.754251 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-5a541b2005ff24d2e4362e1e5c5358bf41b550a42e125d3e854ff0585a712c62 WatchSource:0}: Error finding container 5a541b2005ff24d2e4362e1e5c5358bf41b550a42e125d3e854ff0585a712c62: Status 404 returned error can't find the container with id 5a541b2005ff24d2e4362e1e5c5358bf41b550a42e125d3e854ff0585a712c62 Mar 14 08:29:17 crc kubenswrapper[4886]: E0314 08:29:17.757392 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 14 08:29:17 crc kubenswrapper[4886]: E0314 08:29:17.758562 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.790293 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.790335 
4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.790347 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.790365 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.790379 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:17Z","lastTransitionTime":"2026-03-14T08:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.831980 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"5a541b2005ff24d2e4362e1e5c5358bf41b550a42e125d3e854ff0585a712c62"} Mar 14 08:29:17 crc kubenswrapper[4886]: E0314 08:29:17.835213 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 14 08:29:17 crc kubenswrapper[4886]: E0314 08:29:17.837571 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.838148 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0fc342ae33a7fe5df1c46a64658d8c6e663d2c092e4ce3853155696690b296b0"} Mar 14 08:29:17 crc kubenswrapper[4886]: E0314 08:29:17.839656 4886 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 14 08:29:17 crc kubenswrapper[4886]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 14 08:29:17 crc kubenswrapper[4886]: if [[ -f "/env/_master" ]]; then Mar 14 08:29:17 crc kubenswrapper[4886]: set -o allexport Mar 14 08:29:17 crc kubenswrapper[4886]: source "/env/_master" Mar 14 08:29:17 crc kubenswrapper[4886]: set +o allexport Mar 14 08:29:17 crc kubenswrapper[4886]: fi Mar 14 08:29:17 crc kubenswrapper[4886]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 14 08:29:17 crc kubenswrapper[4886]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 14 08:29:17 crc kubenswrapper[4886]: ho_enable="--enable-hybrid-overlay" Mar 14 08:29:17 crc kubenswrapper[4886]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 14 08:29:17 crc kubenswrapper[4886]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 14 08:29:17 crc kubenswrapper[4886]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 14 08:29:17 crc kubenswrapper[4886]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 14 08:29:17 crc kubenswrapper[4886]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 14 08:29:17 crc kubenswrapper[4886]: --webhook-host=127.0.0.1 \ Mar 14 08:29:17 crc kubenswrapper[4886]: --webhook-port=9743 \ Mar 14 08:29:17 crc kubenswrapper[4886]: ${ho_enable} \ Mar 14 08:29:17 crc kubenswrapper[4886]: --enable-interconnect \ Mar 14 08:29:17 crc kubenswrapper[4886]: --disable-approver \ Mar 14 08:29:17 crc kubenswrapper[4886]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 14 08:29:17 crc kubenswrapper[4886]: --wait-for-kubernetes-api=200s \ Mar 14 08:29:17 crc kubenswrapper[4886]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 14 08:29:17 crc kubenswrapper[4886]: --loglevel="${LOGLEVEL}" Mar 14 08:29:17 crc kubenswrapper[4886]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 14 08:29:17 crc kubenswrapper[4886]: > logger="UnhandledError" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.840746 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"415462104eec330c0cb131f3f316e6638ef4d6e82f7a8ae88929ea2f784435ab"} Mar 14 08:29:17 crc 
kubenswrapper[4886]: E0314 08:29:17.842316 4886 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 14 08:29:17 crc kubenswrapper[4886]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 14 08:29:17 crc kubenswrapper[4886]: set -o allexport Mar 14 08:29:17 crc kubenswrapper[4886]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 14 08:29:17 crc kubenswrapper[4886]: source /etc/kubernetes/apiserver-url.env Mar 14 08:29:17 crc kubenswrapper[4886]: else Mar 14 08:29:17 crc kubenswrapper[4886]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 14 08:29:17 crc kubenswrapper[4886]: exit 1 Mar 14 08:29:17 crc kubenswrapper[4886]: fi Mar 14 08:29:17 crc kubenswrapper[4886]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 14 08:29:17 crc kubenswrapper[4886]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163
a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 14 08:29:17 crc kubenswrapper[4886]: > logger="UnhandledError" Mar 14 08:29:17 crc kubenswrapper[4886]: E0314 08:29:17.842335 4886 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 14 08:29:17 crc kubenswrapper[4886]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 14 08:29:17 crc kubenswrapper[4886]: if [[ -f "/env/_master" ]]; then Mar 14 08:29:17 crc kubenswrapper[4886]: set -o allexport Mar 14 08:29:17 crc kubenswrapper[4886]: source "/env/_master" Mar 14 08:29:17 crc kubenswrapper[4886]: set +o allexport Mar 14 08:29:17 crc kubenswrapper[4886]: fi Mar 14 08:29:17 crc kubenswrapper[4886]: Mar 14 08:29:17 crc kubenswrapper[4886]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 14 
08:29:17 crc kubenswrapper[4886]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 14 08:29:17 crc kubenswrapper[4886]: --disable-webhook \ Mar 14 08:29:17 crc kubenswrapper[4886]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 14 08:29:17 crc kubenswrapper[4886]: --loglevel="${LOGLEVEL}" Mar 14 08:29:17 crc kubenswrapper[4886]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: 
services have not yet been read at least once, cannot construct envvars Mar 14 08:29:17 crc kubenswrapper[4886]: > logger="UnhandledError" Mar 14 08:29:17 crc kubenswrapper[4886]: E0314 08:29:17.843394 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 14 08:29:17 crc kubenswrapper[4886]: E0314 08:29:17.843427 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.848516 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.857238 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.867929 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.877188 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.891824 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.892827 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.892879 4886 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.892892 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.892909 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.892921 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:17Z","lastTransitionTime":"2026-03-14T08:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.902581 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.911649 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.922879 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.933981 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.945780 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.956493 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.969431 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.995625 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.995682 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.995702 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.995731 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:17 crc kubenswrapper[4886]: I0314 08:29:17.995752 4886 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:17Z","lastTransitionTime":"2026-03-14T08:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.080234 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.080343 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:18 crc kubenswrapper[4886]: E0314 08:29:18.080397 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:29:19.080369002 +0000 UTC m=+94.328820639 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.080446 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:18 crc kubenswrapper[4886]: E0314 08:29:18.080489 4886 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:29:18 crc kubenswrapper[4886]: E0314 08:29:18.080592 4886 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:29:18 crc kubenswrapper[4886]: E0314 08:29:18.080628 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 08:29:19.080605419 +0000 UTC m=+94.329057096 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:29:18 crc kubenswrapper[4886]: E0314 08:29:18.081182 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 08:29:19.081160264 +0000 UTC m=+94.329611941 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.098957 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.098997 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.099008 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.099026 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.099035 4886 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:18Z","lastTransitionTime":"2026-03-14T08:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.181606 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.181736 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:29:18 crc kubenswrapper[4886]: E0314 08:29:18.181866 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:29:18 crc kubenswrapper[4886]: E0314 08:29:18.181919 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:29:18 crc kubenswrapper[4886]: E0314 08:29:18.181935 4886 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:29:18 crc kubenswrapper[4886]: E0314 08:29:18.181888 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:29:18 crc kubenswrapper[4886]: E0314 08:29:18.182002 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 08:29:19.181978623 +0000 UTC m=+94.430430460 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:29:18 crc kubenswrapper[4886]: E0314 08:29:18.182005 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:29:18 crc kubenswrapper[4886]: E0314 08:29:18.182020 4886 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:29:18 crc kubenswrapper[4886]: E0314 08:29:18.182054 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 
nodeName:}" failed. No retries permitted until 2026-03-14 08:29:19.182043015 +0000 UTC m=+94.430494862 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.201146 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.201207 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.201223 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.201241 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.201521 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:18Z","lastTransitionTime":"2026-03-14T08:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.304902 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.304970 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.304988 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.305013 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.305034 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:18Z","lastTransitionTime":"2026-03-14T08:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.407858 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.407897 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.407921 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.407941 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.407950 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:18Z","lastTransitionTime":"2026-03-14T08:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.510615 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.510668 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.510679 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.510696 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.510706 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:18Z","lastTransitionTime":"2026-03-14T08:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.613690 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.613742 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.613754 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.613770 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.613783 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:18Z","lastTransitionTime":"2026-03-14T08:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.715983 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.716009 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.716016 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.716029 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.716037 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:18Z","lastTransitionTime":"2026-03-14T08:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.818790 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.818835 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.818846 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.818862 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.818876 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:18Z","lastTransitionTime":"2026-03-14T08:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.921518 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.921561 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.921572 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.921587 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:18 crc kubenswrapper[4886]: I0314 08:29:18.921599 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:18Z","lastTransitionTime":"2026-03-14T08:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.023836 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.023880 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.023894 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.023912 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.023924 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:19Z","lastTransitionTime":"2026-03-14T08:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.089394 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.089482 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.089507 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:19 crc kubenswrapper[4886]: E0314 08:29:19.089540 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:29:21.089511104 +0000 UTC m=+96.337962791 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:29:19 crc kubenswrapper[4886]: E0314 08:29:19.089570 4886 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:29:19 crc kubenswrapper[4886]: E0314 08:29:19.089619 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 08:29:21.089604607 +0000 UTC m=+96.338056234 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:29:19 crc kubenswrapper[4886]: E0314 08:29:19.089712 4886 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:29:19 crc kubenswrapper[4886]: E0314 08:29:19.089828 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-14 08:29:21.089800042 +0000 UTC m=+96.338251699 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.128287 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.128425 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.128455 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.128483 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.128503 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:19Z","lastTransitionTime":"2026-03-14T08:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.190388 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.190477 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:29:19 crc kubenswrapper[4886]: E0314 08:29:19.190701 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:29:19 crc kubenswrapper[4886]: E0314 08:29:19.190737 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:29:19 crc kubenswrapper[4886]: E0314 08:29:19.190729 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:29:19 crc kubenswrapper[4886]: E0314 08:29:19.190809 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:29:19 crc kubenswrapper[4886]: E0314 08:29:19.190760 4886 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl 
for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:29:19 crc kubenswrapper[4886]: E0314 08:29:19.190855 4886 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:29:19 crc kubenswrapper[4886]: E0314 08:29:19.190967 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 08:29:21.19093749 +0000 UTC m=+96.439389167 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:29:19 crc kubenswrapper[4886]: E0314 08:29:19.191000 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 08:29:21.190985841 +0000 UTC m=+96.439437508 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.231475 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.231560 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.231589 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.231622 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.231647 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:19Z","lastTransitionTime":"2026-03-14T08:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.335674 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.335771 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.335788 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.335817 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.335839 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:19Z","lastTransitionTime":"2026-03-14T08:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.420662 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.420695 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:29:19 crc kubenswrapper[4886]: E0314 08:29:19.420915 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.421099 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:19 crc kubenswrapper[4886]: E0314 08:29:19.421296 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:29:19 crc kubenswrapper[4886]: E0314 08:29:19.421472 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.425569 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.426617 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.428952 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.430202 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.432077 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.433345 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.434981 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.436223 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.437542 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.438617 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.438895 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.438958 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.438972 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.438992 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.439047 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:19Z","lastTransitionTime":"2026-03-14T08:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.439684 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.441274 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.443227 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.444766 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.445985 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.447879 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.449166 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.450681 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" 
path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.451776 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.453814 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.455562 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.456545 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.457403 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.459281 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.460583 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.463019 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" 
path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.465396 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.466378 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.467281 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.468093 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.468782 4886 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.468931 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.473706 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.474461 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.475095 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.477215 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.478571 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.479285 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.480598 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.481495 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.482531 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.483328 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.484590 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.485799 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.486430 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.487563 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.488390 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.489876 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.490533 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.491239 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.492293 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.493054 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.494464 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.495081 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.542264 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.542347 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.542373 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.542405 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.542429 4886 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:19Z","lastTransitionTime":"2026-03-14T08:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.619912 4886 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.645699 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.645785 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.645803 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.645824 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.645836 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:19Z","lastTransitionTime":"2026-03-14T08:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.749314 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.749350 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.749359 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.749373 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.749382 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:19Z","lastTransitionTime":"2026-03-14T08:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.851929 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.851976 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.851986 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.852006 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.852018 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:19Z","lastTransitionTime":"2026-03-14T08:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.954848 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.954921 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.954944 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.954975 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:19 crc kubenswrapper[4886]: I0314 08:29:19.954993 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:19Z","lastTransitionTime":"2026-03-14T08:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.057177 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.057225 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.057235 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.057251 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.057260 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:20Z","lastTransitionTime":"2026-03-14T08:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.159477 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.159559 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.159585 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.159616 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.159639 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:20Z","lastTransitionTime":"2026-03-14T08:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.262439 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.262525 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.262544 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.262570 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.262589 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:20Z","lastTransitionTime":"2026-03-14T08:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.366975 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.367056 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.367078 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.367105 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.367154 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:20Z","lastTransitionTime":"2026-03-14T08:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.470607 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.470677 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.470694 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.470722 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.470740 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:20Z","lastTransitionTime":"2026-03-14T08:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.573583 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.573665 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.573683 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.573709 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.573727 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:20Z","lastTransitionTime":"2026-03-14T08:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.677693 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.677771 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.677792 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.677822 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.677844 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:20Z","lastTransitionTime":"2026-03-14T08:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.780629 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.780684 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.780696 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.780720 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.780732 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:20Z","lastTransitionTime":"2026-03-14T08:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.883598 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.883658 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.883671 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.883691 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.883702 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:20Z","lastTransitionTime":"2026-03-14T08:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.985928 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.985982 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.985993 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.986011 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:20 crc kubenswrapper[4886]: I0314 08:29:20.986038 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:20Z","lastTransitionTime":"2026-03-14T08:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.088188 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.088228 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.088237 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.088251 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.088261 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:21Z","lastTransitionTime":"2026-03-14T08:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.109644 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:29:21 crc kubenswrapper[4886]: E0314 08:29:21.109754 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-14 08:29:25.109736123 +0000 UTC m=+100.358187760 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.109812 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.109835 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:21 crc kubenswrapper[4886]: E0314 08:29:21.109901 4886 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:29:21 crc kubenswrapper[4886]: E0314 08:29:21.109928 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-14 08:29:25.109922579 +0000 UTC m=+100.358374216 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:29:21 crc kubenswrapper[4886]: E0314 08:29:21.110019 4886 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:29:21 crc kubenswrapper[4886]: E0314 08:29:21.110081 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 08:29:25.110063312 +0000 UTC m=+100.358514969 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.190920 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.190997 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.191016 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.191040 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.191060 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:21Z","lastTransitionTime":"2026-03-14T08:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.210534 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.210636 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:29:21 crc kubenswrapper[4886]: E0314 08:29:21.210741 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:29:21 crc kubenswrapper[4886]: E0314 08:29:21.210795 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:29:21 crc kubenswrapper[4886]: E0314 08:29:21.210795 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:29:21 crc kubenswrapper[4886]: E0314 08:29:21.210817 4886 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 
08:29:21 crc kubenswrapper[4886]: E0314 08:29:21.210831 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:29:21 crc kubenswrapper[4886]: E0314 08:29:21.210851 4886 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:29:21 crc kubenswrapper[4886]: E0314 08:29:21.210898 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 08:29:25.210872581 +0000 UTC m=+100.459324298 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:29:21 crc kubenswrapper[4886]: E0314 08:29:21.210926 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 08:29:25.210915712 +0000 UTC m=+100.459367359 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.293816 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.293867 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.293908 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.293929 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.293944 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:21Z","lastTransitionTime":"2026-03-14T08:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.396112 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.396204 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.396222 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.396246 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.396263 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:21Z","lastTransitionTime":"2026-03-14T08:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.420347 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.420489 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.420721 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:29:21 crc kubenswrapper[4886]: E0314 08:29:21.420724 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:29:21 crc kubenswrapper[4886]: E0314 08:29:21.421024 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:29:21 crc kubenswrapper[4886]: E0314 08:29:21.421212 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.499512 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.499600 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.499625 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.499650 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.499668 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:21Z","lastTransitionTime":"2026-03-14T08:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.603263 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.603305 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.603316 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.603329 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.603338 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:21Z","lastTransitionTime":"2026-03-14T08:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.709354 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.709419 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.709438 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.709463 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.709490 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:21Z","lastTransitionTime":"2026-03-14T08:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.812556 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.812628 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.812653 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.812678 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.812698 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:21Z","lastTransitionTime":"2026-03-14T08:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.915287 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.915350 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.915368 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.915393 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:21 crc kubenswrapper[4886]: I0314 08:29:21.915413 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:21Z","lastTransitionTime":"2026-03-14T08:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.018208 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.018264 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.018281 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.018305 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.018322 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:22Z","lastTransitionTime":"2026-03-14T08:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.120963 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.121015 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.121027 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.121045 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.121058 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:22Z","lastTransitionTime":"2026-03-14T08:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.224468 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.224527 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.224545 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.224570 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.224593 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:22Z","lastTransitionTime":"2026-03-14T08:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.327557 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.327639 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.327664 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.327696 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.327717 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:22Z","lastTransitionTime":"2026-03-14T08:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.430879 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.430961 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.430973 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.431012 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.431033 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:22Z","lastTransitionTime":"2026-03-14T08:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.533233 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.533334 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.533353 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.533377 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.533396 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:22Z","lastTransitionTime":"2026-03-14T08:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.635308 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.635361 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.635373 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.635390 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.635401 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:22Z","lastTransitionTime":"2026-03-14T08:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.738358 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.738421 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.738440 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.738464 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.738481 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:22Z","lastTransitionTime":"2026-03-14T08:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.841956 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.842016 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.842033 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.842055 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.842073 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:22Z","lastTransitionTime":"2026-03-14T08:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.945596 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.945676 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.945700 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.945733 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:22 crc kubenswrapper[4886]: I0314 08:29:22.945759 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:22Z","lastTransitionTime":"2026-03-14T08:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.049546 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.049628 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.049652 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.049684 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.049709 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:23Z","lastTransitionTime":"2026-03-14T08:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.152497 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.152555 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.152572 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.152602 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.152620 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:23Z","lastTransitionTime":"2026-03-14T08:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.255621 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.255720 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.255740 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.255802 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.255820 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:23Z","lastTransitionTime":"2026-03-14T08:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.359302 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.359402 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.359422 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.359494 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.359513 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:23Z","lastTransitionTime":"2026-03-14T08:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.420443 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:29:23 crc kubenswrapper[4886]: E0314 08:29:23.420599 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.420739 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:23 crc kubenswrapper[4886]: E0314 08:29:23.421107 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.421225 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:29:23 crc kubenswrapper[4886]: E0314 08:29:23.421484 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.463021 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.463109 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.463167 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.463202 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.463219 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:23Z","lastTransitionTime":"2026-03-14T08:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.567218 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.567284 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.567301 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.567331 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.567353 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:23Z","lastTransitionTime":"2026-03-14T08:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.669586 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.669633 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.669645 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.669667 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.669682 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:23Z","lastTransitionTime":"2026-03-14T08:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.772146 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.772210 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.772228 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.772254 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.772273 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:23Z","lastTransitionTime":"2026-03-14T08:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.874750 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.874799 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.874814 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.874829 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.874845 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:23Z","lastTransitionTime":"2026-03-14T08:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.978276 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.978338 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.978356 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.978380 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:23 crc kubenswrapper[4886]: I0314 08:29:23.978398 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:23Z","lastTransitionTime":"2026-03-14T08:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.081340 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.081445 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.081472 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.081500 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.081519 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:24Z","lastTransitionTime":"2026-03-14T08:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.184076 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.184164 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.184183 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.184210 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.184228 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:24Z","lastTransitionTime":"2026-03-14T08:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.287558 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.287628 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.287652 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.287685 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.287708 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:24Z","lastTransitionTime":"2026-03-14T08:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.390839 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.390919 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.390945 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.390976 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.391001 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:24Z","lastTransitionTime":"2026-03-14T08:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.494537 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.494622 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.494644 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.494675 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.494701 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:24Z","lastTransitionTime":"2026-03-14T08:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.598266 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.598351 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.598372 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.598405 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.599008 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:24Z","lastTransitionTime":"2026-03-14T08:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.702320 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.702407 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.702427 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.702647 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.702675 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:24Z","lastTransitionTime":"2026-03-14T08:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.805367 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.805434 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.805447 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.805488 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.805501 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:24Z","lastTransitionTime":"2026-03-14T08:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.907212 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.907288 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.907297 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.907310 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:24 crc kubenswrapper[4886]: I0314 08:29:24.907318 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:24Z","lastTransitionTime":"2026-03-14T08:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.009412 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.009452 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.009462 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.009476 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.009484 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:25Z","lastTransitionTime":"2026-03-14T08:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.111821 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.111865 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.111877 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.111895 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.111909 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:25Z","lastTransitionTime":"2026-03-14T08:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.145769 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.145924 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:25 crc kubenswrapper[4886]: E0314 08:29:25.145966 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:29:33.145939904 +0000 UTC m=+108.394391551 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.146012 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:25 crc kubenswrapper[4886]: E0314 08:29:25.146144 4886 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:29:25 crc kubenswrapper[4886]: E0314 08:29:25.146113 4886 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:29:25 crc kubenswrapper[4886]: E0314 08:29:25.146190 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 08:29:33.146181071 +0000 UTC m=+108.394632718 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:29:25 crc kubenswrapper[4886]: E0314 08:29:25.146243 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 08:29:33.146211832 +0000 UTC m=+108.394663519 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.214977 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.215044 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.215068 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.215096 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.215147 4886 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:25Z","lastTransitionTime":"2026-03-14T08:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.247017 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.247110 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:29:25 crc kubenswrapper[4886]: E0314 08:29:25.247269 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:29:25 crc kubenswrapper[4886]: E0314 08:29:25.247310 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:29:25 crc kubenswrapper[4886]: E0314 08:29:25.247329 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:29:25 crc kubenswrapper[4886]: E0314 
08:29:25.247376 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:29:25 crc kubenswrapper[4886]: E0314 08:29:25.247333 4886 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:29:25 crc kubenswrapper[4886]: E0314 08:29:25.247396 4886 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:29:25 crc kubenswrapper[4886]: E0314 08:29:25.247481 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 08:29:33.247458262 +0000 UTC m=+108.495909929 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:29:25 crc kubenswrapper[4886]: E0314 08:29:25.247508 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 08:29:33.247495793 +0000 UTC m=+108.495947460 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.318270 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.318314 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.318322 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.318339 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.318349 4886 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:25Z","lastTransitionTime":"2026-03-14T08:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.368566 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-tzzd5"] Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.369034 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tzzd5" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.371078 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.371159 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.371826 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.394453 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.406388 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.416062 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.420480 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.420536 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.420490 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:29:25 crc kubenswrapper[4886]: E0314 08:29:25.420646 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:29:25 crc kubenswrapper[4886]: E0314 08:29:25.420752 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:29:25 crc kubenswrapper[4886]: E0314 08:29:25.420846 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.421395 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.421574 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.421691 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.421816 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.421906 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:25Z","lastTransitionTime":"2026-03-14T08:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.430618 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.443554 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.447971 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b55610ab-0fec-46ef-8233-8b0825013fb9-hosts-file\") pod \"node-resolver-tzzd5\" (UID: \"b55610ab-0fec-46ef-8233-8b0825013fb9\") " pod="openshift-dns/node-resolver-tzzd5" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.448026 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szw6s\" (UniqueName: \"kubernetes.io/projected/b55610ab-0fec-46ef-8233-8b0825013fb9-kube-api-access-szw6s\") pod \"node-resolver-tzzd5\" (UID: \"b55610ab-0fec-46ef-8233-8b0825013fb9\") " pod="openshift-dns/node-resolver-tzzd5" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.458100 
4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.468628 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.478381 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.493703 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.507555 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.517520 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.525690 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.525741 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.525755 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.525772 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.525791 4886 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:25Z","lastTransitionTime":"2026-03-14T08:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.529291 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.541052 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.549015 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b55610ab-0fec-46ef-8233-8b0825013fb9-hosts-file\") pod \"node-resolver-tzzd5\" (UID: \"b55610ab-0fec-46ef-8233-8b0825013fb9\") " pod="openshift-dns/node-resolver-tzzd5" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.549070 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szw6s\" (UniqueName: \"kubernetes.io/projected/b55610ab-0fec-46ef-8233-8b0825013fb9-kube-api-access-szw6s\") pod \"node-resolver-tzzd5\" (UID: \"b55610ab-0fec-46ef-8233-8b0825013fb9\") " pod="openshift-dns/node-resolver-tzzd5" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.549370 4886 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b55610ab-0fec-46ef-8233-8b0825013fb9-hosts-file\") pod \"node-resolver-tzzd5\" (UID: \"b55610ab-0fec-46ef-8233-8b0825013fb9\") " pod="openshift-dns/node-resolver-tzzd5" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.551775 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.566393 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szw6s\" (UniqueName: \"kubernetes.io/projected/b55610ab-0fec-46ef-8233-8b0825013fb9-kube-api-access-szw6s\") pod \"node-resolver-tzzd5\" (UID: \"b55610ab-0fec-46ef-8233-8b0825013fb9\") " pod="openshift-dns/node-resolver-tzzd5" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.628464 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.628531 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.628551 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.628577 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.628596 4886 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:25Z","lastTransitionTime":"2026-03-14T08:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.686276 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.686341 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.686359 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.686386 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.686405 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:25Z","lastTransitionTime":"2026-03-14T08:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.692481 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-tzzd5" Mar 14 08:29:25 crc kubenswrapper[4886]: E0314 08:29:25.698644 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.703032 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.703085 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.703104 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.703159 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.703180 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:25Z","lastTransitionTime":"2026-03-14T08:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:25 crc kubenswrapper[4886]: W0314 08:29:25.713321 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb55610ab_0fec_46ef_8233_8b0825013fb9.slice/crio-0b9233399878b741e88f336a59b38180d8920c22e7906dd961dbfbc7137ecf29 WatchSource:0}: Error finding container 0b9233399878b741e88f336a59b38180d8920c22e7906dd961dbfbc7137ecf29: Status 404 returned error can't find the container with id 0b9233399878b741e88f336a59b38180d8920c22e7906dd961dbfbc7137ecf29 Mar 14 08:29:25 crc kubenswrapper[4886]: E0314 08:29:25.722009 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.727285 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.727530 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.727662 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.727823 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.727969 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:25Z","lastTransitionTime":"2026-03-14T08:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.733436 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-ddctv"] Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.733908 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-5jrmb"] Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.734163 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.734428 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-dl247"] Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.734257 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.735646 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dl247" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.737969 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.738186 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.738377 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.738537 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.738668 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.738806 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.738946 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.739636 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.741454 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.741944 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.742311 4886 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.745580 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 14 08:29:25 crc kubenswrapper[4886]: E0314 08:29:25.746525 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb4
9c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\"
:[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d4
6c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\
\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-5
8cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.750495 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.750558 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.750577 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.750603 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.750632 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:25Z","lastTransitionTime":"2026-03-14T08:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.750823 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: E0314 08:29:25.763492 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.763790 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.767069 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.767172 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.767199 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 
08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.767229 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.767254 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:25Z","lastTransitionTime":"2026-03-14T08:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.774812 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: E0314 08:29:25.779007 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: E0314 08:29:25.779247 4886 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.781068 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.781385 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.781417 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.781451 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.781468 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:25Z","lastTransitionTime":"2026-03-14T08:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.783845 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.795000 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.808933 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.817887 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.825076 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.836939 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.844822 4886 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.851025 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.851371 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-system-cni-dir\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 
08:29:25.851403 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-host-run-multus-certs\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.851431 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b9532c0a-d4bd-4454-b521-bf157bf3707c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dl247\" (UID: \"b9532c0a-d4bd-4454-b521-bf157bf3707c\") " pod="openshift-multus/multus-additional-cni-plugins-dl247" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.851454 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/64517238-bfef-43e1-b543-1eea5b7f9c79-proxy-tls\") pod \"machine-config-daemon-ddctv\" (UID: \"64517238-bfef-43e1-b543-1eea5b7f9c79\") " pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.851477 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sntl\" (UniqueName: \"kubernetes.io/projected/b9532c0a-d4bd-4454-b521-bf157bf3707c-kube-api-access-4sntl\") pod \"multus-additional-cni-plugins-dl247\" (UID: \"b9532c0a-d4bd-4454-b521-bf157bf3707c\") " pod="openshift-multus/multus-additional-cni-plugins-dl247" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.851510 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7ed47238-6d20-4920-9162-695e6ddcb090-multus-daemon-config\") pod \"multus-5jrmb\" (UID: 
\"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.851531 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b9532c0a-d4bd-4454-b521-bf157bf3707c-cnibin\") pod \"multus-additional-cni-plugins-dl247\" (UID: \"b9532c0a-d4bd-4454-b521-bf157bf3707c\") " pod="openshift-multus/multus-additional-cni-plugins-dl247" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.851554 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-host-var-lib-cni-multus\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.851574 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-host-var-lib-kubelet\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.851597 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-os-release\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.851619 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-host-run-k8s-cni-cncf-io\") pod \"multus-5jrmb\" (UID: 
\"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.851643 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b9532c0a-d4bd-4454-b521-bf157bf3707c-cni-binary-copy\") pod \"multus-additional-cni-plugins-dl247\" (UID: \"b9532c0a-d4bd-4454-b521-bf157bf3707c\") " pod="openshift-multus/multus-additional-cni-plugins-dl247" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.851675 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b9532c0a-d4bd-4454-b521-bf157bf3707c-system-cni-dir\") pod \"multus-additional-cni-plugins-dl247\" (UID: \"b9532c0a-d4bd-4454-b521-bf157bf3707c\") " pod="openshift-multus/multus-additional-cni-plugins-dl247" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.851697 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b9532c0a-d4bd-4454-b521-bf157bf3707c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dl247\" (UID: \"b9532c0a-d4bd-4454-b521-bf157bf3707c\") " pod="openshift-multus/multus-additional-cni-plugins-dl247" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.852029 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-multus-socket-dir-parent\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.852098 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-etc-kubernetes\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.852143 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg8x7\" (UniqueName: \"kubernetes.io/projected/7ed47238-6d20-4920-9162-695e6ddcb090-kube-api-access-lg8x7\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.852167 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-multus-cni-dir\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.852195 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b9532c0a-d4bd-4454-b521-bf157bf3707c-os-release\") pod \"multus-additional-cni-plugins-dl247\" (UID: \"b9532c0a-d4bd-4454-b521-bf157bf3707c\") " pod="openshift-multus/multus-additional-cni-plugins-dl247" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.852221 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-multus-conf-dir\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.852257 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-host-run-netns\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.852278 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdn87\" (UniqueName: \"kubernetes.io/projected/64517238-bfef-43e1-b543-1eea5b7f9c79-kube-api-access-qdn87\") pod \"machine-config-daemon-ddctv\" (UID: \"64517238-bfef-43e1-b543-1eea5b7f9c79\") " pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.852302 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-cnibin\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.852323 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-host-var-lib-cni-bin\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.852344 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-hostroot\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.852363 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/64517238-bfef-43e1-b543-1eea5b7f9c79-rootfs\") pod \"machine-config-daemon-ddctv\" (UID: \"64517238-bfef-43e1-b543-1eea5b7f9c79\") " pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.852383 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/64517238-bfef-43e1-b543-1eea5b7f9c79-mcd-auth-proxy-config\") pod \"machine-config-daemon-ddctv\" (UID: \"64517238-bfef-43e1-b543-1eea5b7f9c79\") " pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.852405 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7ed47238-6d20-4920-9162-695e6ddcb090-cni-binary-copy\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.857435 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.862616 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tzzd5" event={"ID":"b55610ab-0fec-46ef-8233-8b0825013fb9","Type":"ContainerStarted","Data":"0b9233399878b741e88f336a59b38180d8920c22e7906dd961dbfbc7137ecf29"} Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.867505 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.875539 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.882647 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.884700 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.884737 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.884749 4886 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.884766 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.884777 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:25Z","lastTransitionTime":"2026-03-14T08:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.890377 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.901624 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.910276 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.953012 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdn87\" (UniqueName: \"kubernetes.io/projected/64517238-bfef-43e1-b543-1eea5b7f9c79-kube-api-access-qdn87\") pod \"machine-config-daemon-ddctv\" (UID: \"64517238-bfef-43e1-b543-1eea5b7f9c79\") " pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.953048 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-hostroot\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.953066 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/64517238-bfef-43e1-b543-1eea5b7f9c79-rootfs\") pod \"machine-config-daemon-ddctv\" (UID: \"64517238-bfef-43e1-b543-1eea5b7f9c79\") " pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.953084 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/64517238-bfef-43e1-b543-1eea5b7f9c79-mcd-auth-proxy-config\") pod \"machine-config-daemon-ddctv\" (UID: \"64517238-bfef-43e1-b543-1eea5b7f9c79\") " pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.953101 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-cnibin\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.953146 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-host-var-lib-cni-bin\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.953161 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7ed47238-6d20-4920-9162-695e6ddcb090-cni-binary-copy\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.953185 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-host-run-multus-certs\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.953199 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b9532c0a-d4bd-4454-b521-bf157bf3707c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dl247\" (UID: \"b9532c0a-d4bd-4454-b521-bf157bf3707c\") " pod="openshift-multus/multus-additional-cni-plugins-dl247" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.953213 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-system-cni-dir\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.953203 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-hostroot\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.953291 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-host-run-multus-certs\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.953812 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-host-var-lib-cni-bin\") pod \"multus-5jrmb\" (UID: 
\"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.953827 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b9532c0a-d4bd-4454-b521-bf157bf3707c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dl247\" (UID: \"b9532c0a-d4bd-4454-b521-bf157bf3707c\") " pod="openshift-multus/multus-additional-cni-plugins-dl247" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.953847 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/64517238-bfef-43e1-b543-1eea5b7f9c79-rootfs\") pod \"machine-config-daemon-ddctv\" (UID: \"64517238-bfef-43e1-b543-1eea5b7f9c79\") " pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.953817 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7ed47238-6d20-4920-9162-695e6ddcb090-cni-binary-copy\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.953227 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/64517238-bfef-43e1-b543-1eea5b7f9c79-proxy-tls\") pod \"machine-config-daemon-ddctv\" (UID: \"64517238-bfef-43e1-b543-1eea5b7f9c79\") " pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.953884 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-system-cni-dir\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 
08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.953931 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sntl\" (UniqueName: \"kubernetes.io/projected/b9532c0a-d4bd-4454-b521-bf157bf3707c-kube-api-access-4sntl\") pod \"multus-additional-cni-plugins-dl247\" (UID: \"b9532c0a-d4bd-4454-b521-bf157bf3707c\") " pod="openshift-multus/multus-additional-cni-plugins-dl247" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.953970 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7ed47238-6d20-4920-9162-695e6ddcb090-multus-daemon-config\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.953973 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-cnibin\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954001 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b9532c0a-d4bd-4454-b521-bf157bf3707c-cnibin\") pod \"multus-additional-cni-plugins-dl247\" (UID: \"b9532c0a-d4bd-4454-b521-bf157bf3707c\") " pod="openshift-multus/multus-additional-cni-plugins-dl247" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954054 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-host-var-lib-kubelet\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954093 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-host-var-lib-cni-multus\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954110 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-os-release\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954155 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-host-run-k8s-cni-cncf-io\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954174 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b9532c0a-d4bd-4454-b521-bf157bf3707c-cni-binary-copy\") pod \"multus-additional-cni-plugins-dl247\" (UID: \"b9532c0a-d4bd-4454-b521-bf157bf3707c\") " pod="openshift-multus/multus-additional-cni-plugins-dl247" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954194 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-os-release\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954210 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/b9532c0a-d4bd-4454-b521-bf157bf3707c-system-cni-dir\") pod \"multus-additional-cni-plugins-dl247\" (UID: \"b9532c0a-d4bd-4454-b521-bf157bf3707c\") " pod="openshift-multus/multus-additional-cni-plugins-dl247" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954230 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-host-var-lib-cni-multus\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954233 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-multus-socket-dir-parent\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954155 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-host-var-lib-kubelet\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954263 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-etc-kubernetes\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954113 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b9532c0a-d4bd-4454-b521-bf157bf3707c-cnibin\") pod 
\"multus-additional-cni-plugins-dl247\" (UID: \"b9532c0a-d4bd-4454-b521-bf157bf3707c\") " pod="openshift-multus/multus-additional-cni-plugins-dl247" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954280 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b9532c0a-d4bd-4454-b521-bf157bf3707c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dl247\" (UID: \"b9532c0a-d4bd-4454-b521-bf157bf3707c\") " pod="openshift-multus/multus-additional-cni-plugins-dl247" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954301 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg8x7\" (UniqueName: \"kubernetes.io/projected/7ed47238-6d20-4920-9162-695e6ddcb090-kube-api-access-lg8x7\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954315 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b9532c0a-d4bd-4454-b521-bf157bf3707c-os-release\") pod \"multus-additional-cni-plugins-dl247\" (UID: \"b9532c0a-d4bd-4454-b521-bf157bf3707c\") " pod="openshift-multus/multus-additional-cni-plugins-dl247" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954316 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-host-run-k8s-cni-cncf-io\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954331 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-multus-cni-dir\") pod \"multus-5jrmb\" 
(UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954346 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b9532c0a-d4bd-4454-b521-bf157bf3707c-system-cni-dir\") pod \"multus-additional-cni-plugins-dl247\" (UID: \"b9532c0a-d4bd-4454-b521-bf157bf3707c\") " pod="openshift-multus/multus-additional-cni-plugins-dl247" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954372 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-etc-kubernetes\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954370 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-host-run-netns\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954350 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-host-run-netns\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954407 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-multus-conf-dir\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954446 4886 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-multus-conf-dir\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954478 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b9532c0a-d4bd-4454-b521-bf157bf3707c-os-release\") pod \"multus-additional-cni-plugins-dl247\" (UID: \"b9532c0a-d4bd-4454-b521-bf157bf3707c\") " pod="openshift-multus/multus-additional-cni-plugins-dl247" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954552 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/64517238-bfef-43e1-b543-1eea5b7f9c79-mcd-auth-proxy-config\") pod \"machine-config-daemon-ddctv\" (UID: \"64517238-bfef-43e1-b543-1eea5b7f9c79\") " pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954580 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-multus-cni-dir\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954264 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7ed47238-6d20-4920-9162-695e6ddcb090-multus-socket-dir-parent\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954751 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" 
(UniqueName: \"kubernetes.io/configmap/7ed47238-6d20-4920-9162-695e6ddcb090-multus-daemon-config\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.954779 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b9532c0a-d4bd-4454-b521-bf157bf3707c-cni-binary-copy\") pod \"multus-additional-cni-plugins-dl247\" (UID: \"b9532c0a-d4bd-4454-b521-bf157bf3707c\") " pod="openshift-multus/multus-additional-cni-plugins-dl247" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.955029 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b9532c0a-d4bd-4454-b521-bf157bf3707c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dl247\" (UID: \"b9532c0a-d4bd-4454-b521-bf157bf3707c\") " pod="openshift-multus/multus-additional-cni-plugins-dl247" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.958231 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/64517238-bfef-43e1-b543-1eea5b7f9c79-proxy-tls\") pod \"machine-config-daemon-ddctv\" (UID: \"64517238-bfef-43e1-b543-1eea5b7f9c79\") " pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.968163 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdn87\" (UniqueName: \"kubernetes.io/projected/64517238-bfef-43e1-b543-1eea5b7f9c79-kube-api-access-qdn87\") pod \"machine-config-daemon-ddctv\" (UID: \"64517238-bfef-43e1-b543-1eea5b7f9c79\") " pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.968347 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg8x7\" (UniqueName: 
\"kubernetes.io/projected/7ed47238-6d20-4920-9162-695e6ddcb090-kube-api-access-lg8x7\") pod \"multus-5jrmb\" (UID: \"7ed47238-6d20-4920-9162-695e6ddcb090\") " pod="openshift-multus/multus-5jrmb" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.974161 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sntl\" (UniqueName: \"kubernetes.io/projected/b9532c0a-d4bd-4454-b521-bf157bf3707c-kube-api-access-4sntl\") pod \"multus-additional-cni-plugins-dl247\" (UID: \"b9532c0a-d4bd-4454-b521-bf157bf3707c\") " pod="openshift-multus/multus-additional-cni-plugins-dl247" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.987977 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.988001 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.988009 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.988024 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:25 crc kubenswrapper[4886]: I0314 08:29:25.988034 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:25Z","lastTransitionTime":"2026-03-14T08:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.065205 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 08:29:26 crc kubenswrapper[4886]: W0314 08:29:26.076628 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64517238_bfef_43e1_b543_1eea5b7f9c79.slice/crio-8f1b330ee9226f66773cff6996896477b9cd71e3c18ce02e8f8d067489dc08a7 WatchSource:0}: Error finding container 8f1b330ee9226f66773cff6996896477b9cd71e3c18ce02e8f8d067489dc08a7: Status 404 returned error can't find the container with id 8f1b330ee9226f66773cff6996896477b9cd71e3c18ce02e8f8d067489dc08a7 Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.079701 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5jrmb" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.088956 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dl247" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.090523 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.090608 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.090629 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.090705 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.090731 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:26Z","lastTransitionTime":"2026-03-14T08:29:26Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:26 crc kubenswrapper[4886]: W0314 08:29:26.097809 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ed47238_6d20_4920_9162_695e6ddcb090.slice/crio-becf3d5576b274b7215216e3328e64fe5ecad2bcb205b81beaa1229a5b6290a3 WatchSource:0}: Error finding container becf3d5576b274b7215216e3328e64fe5ecad2bcb205b81beaa1229a5b6290a3: Status 404 returned error can't find the container with id becf3d5576b274b7215216e3328e64fe5ecad2bcb205b81beaa1229a5b6290a3 Mar 14 08:29:26 crc kubenswrapper[4886]: W0314 08:29:26.106152 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9532c0a_d4bd_4454_b521_bf157bf3707c.slice/crio-b3e43d58e469e26e06e390e4c8232f2984bdd460a39bb1e36afae779988953c7 WatchSource:0}: Error finding container b3e43d58e469e26e06e390e4c8232f2984bdd460a39bb1e36afae779988953c7: Status 404 returned error can't find the container with id b3e43d58e469e26e06e390e4c8232f2984bdd460a39bb1e36afae779988953c7 Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.114519 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ms4h7"] Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.115323 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.117853 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.118006 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.118039 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.118091 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.118197 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.118475 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.118779 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.128391 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.139981 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.148009 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.155674 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-run-ovn\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.155698 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-log-socket\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.155714 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-env-overrides\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.155732 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-run-systemd\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.155747 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-systemd-units\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.155760 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-ovn-node-metrics-cert\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.155783 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-run-netns\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.155799 4886 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-var-lib-openvswitch\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.155815 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-kubelet\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.155828 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-etc-openvswitch\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.155842 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-cni-netd\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.155857 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcw6x\" (UniqueName: \"kubernetes.io/projected/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-kube-api-access-jcw6x\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc 
kubenswrapper[4886]: I0314 08:29:26.155879 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-node-log\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.155894 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-ovnkube-config\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.155908 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-slash\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.155922 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-run-openvswitch\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.155935 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-run-ovn-kubernetes\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 
14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.155965 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-ovnkube-script-lib\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.155980 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.155997 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-cni-bin\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.164109 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.174257 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.192702 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.192743 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.192752 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.192766 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.192775 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:26Z","lastTransitionTime":"2026-03-14T08:29:26Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.194501 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.209115 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.220140 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.228837 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.237151 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.253588 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257233 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-ovnkube-script-lib\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257276 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257299 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-cni-bin\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257316 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-run-ovn\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257334 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-log-socket\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257347 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-env-overrides\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257363 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-run-systemd\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257379 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-systemd-units\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257394 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-ovn-node-metrics-cert\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257413 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-var-lib-openvswitch\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257409 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-cni-bin\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257436 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-run-netns\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257471 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-run-netns\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257480 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-kubelet\") pod 
\"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257482 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257533 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-etc-openvswitch\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257508 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-etc-openvswitch\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257742 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-kubelet\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257758 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-cni-netd\") pod \"ovnkube-node-ms4h7\" (UID: 
\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257775 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-run-systemd\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257789 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcw6x\" (UniqueName: \"kubernetes.io/projected/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-kube-api-access-jcw6x\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257809 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-var-lib-openvswitch\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257828 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-node-log\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257843 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-cni-netd\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 
14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257848 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-ovnkube-config\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257854 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-log-socket\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257871 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-slash\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257880 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-node-log\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257892 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-run-openvswitch\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257877 4886 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-systemd-units\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257962 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-slash\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257933 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-run-ovn-kubernetes\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257993 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-run-openvswitch\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257910 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-run-ovn-kubernetes\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.257817 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-run-ovn\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.258297 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-ovnkube-script-lib\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.258299 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-env-overrides\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.258509 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-ovnkube-config\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.261821 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-ovn-node-metrics-cert\") pod \"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.275475 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcw6x\" (UniqueName: \"kubernetes.io/projected/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-kube-api-access-jcw6x\") pod 
\"ovnkube-node-ms4h7\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.294419 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.294450 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.294520 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.294536 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.294561 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:26Z","lastTransitionTime":"2026-03-14T08:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.397051 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.397111 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.397162 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.397190 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.397207 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:26Z","lastTransitionTime":"2026-03-14T08:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.436197 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.436571 4886 scope.go:117] "RemoveContainer" containerID="a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e" Mar 14 08:29:26 crc kubenswrapper[4886]: E0314 08:29:26.436822 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.437363 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 14 08:29:26 crc kubenswrapper[4886]: W0314 08:29:26.452931 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1a3ba0c_1e4b_4d8f_8bf4_e37003b0bfea.slice/crio-6d26fa2c2409306e065a2dd8ef5a81ad7e8edd5931c0e2c62ccc06cf5088b89a WatchSource:0}: Error finding container 6d26fa2c2409306e065a2dd8ef5a81ad7e8edd5931c0e2c62ccc06cf5088b89a: Status 404 returned error can't find the container with id 6d26fa2c2409306e065a2dd8ef5a81ad7e8edd5931c0e2c62ccc06cf5088b89a Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.499060 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.499089 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.499111 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 
08:29:26.499147 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.499161 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:26Z","lastTransitionTime":"2026-03-14T08:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.602106 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.602175 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.602188 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.602222 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.602233 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:26Z","lastTransitionTime":"2026-03-14T08:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.705106 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.705219 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.705249 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.705285 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.705314 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:26Z","lastTransitionTime":"2026-03-14T08:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.808447 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.808516 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.808532 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.808560 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.808577 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:26Z","lastTransitionTime":"2026-03-14T08:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.868566 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5jrmb" event={"ID":"7ed47238-6d20-4920-9162-695e6ddcb090","Type":"ContainerStarted","Data":"d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306"} Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.868687 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5jrmb" event={"ID":"7ed47238-6d20-4920-9162-695e6ddcb090","Type":"ContainerStarted","Data":"becf3d5576b274b7215216e3328e64fe5ecad2bcb205b81beaa1229a5b6290a3"} Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.869931 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tzzd5" event={"ID":"b55610ab-0fec-46ef-8233-8b0825013fb9","Type":"ContainerStarted","Data":"b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e"} Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.871891 4886 generic.go:334] "Generic (PLEG): container finished" podID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerID="2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac" exitCode=0 Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.871958 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" event={"ID":"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea","Type":"ContainerDied","Data":"2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac"} Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.871999 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" event={"ID":"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea","Type":"ContainerStarted","Data":"6d26fa2c2409306e065a2dd8ef5a81ad7e8edd5931c0e2c62ccc06cf5088b89a"} Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.874498 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerStarted","Data":"0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177"} Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.874557 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerStarted","Data":"701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a"} Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.874578 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerStarted","Data":"8f1b330ee9226f66773cff6996896477b9cd71e3c18ce02e8f8d067489dc08a7"} Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.877279 4886 generic.go:334] "Generic (PLEG): container finished" podID="b9532c0a-d4bd-4454-b521-bf157bf3707c" containerID="0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72" exitCode=0 Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.877363 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" event={"ID":"b9532c0a-d4bd-4454-b521-bf157bf3707c","Type":"ContainerDied","Data":"0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72"} Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.877403 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" event={"ID":"b9532c0a-d4bd-4454-b521-bf157bf3707c","Type":"ContainerStarted","Data":"b3e43d58e469e26e06e390e4c8232f2984bdd460a39bb1e36afae779988953c7"} Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.877930 4886 scope.go:117] "RemoveContainer" containerID="a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e" Mar 14 
08:29:26 crc kubenswrapper[4886]: E0314 08:29:26.878139 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.892649 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.909790 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.911482 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.911532 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.911549 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.911570 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.911586 4886 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:26Z","lastTransitionTime":"2026-03-14T08:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.925922 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.938759 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.950698 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.963133 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"
os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 
14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.979615 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:26 crc kubenswrapper[4886]: I0314 08:29:26.990017 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.000252 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.014424 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.014484 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.014504 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.014535 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.014556 4886 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:27Z","lastTransitionTime":"2026-03-14T08:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.016196 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.028949 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.054158 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.072583 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.086562 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.100169 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6b
f9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.112837 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.119376 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.119428 4886 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.119442 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.119462 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.119478 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:27Z","lastTransitionTime":"2026-03-14T08:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.124105 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.134474 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.144176 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.156401 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.171268 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.183705 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin 
routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"st
ate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-bin
ary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.192853 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.208257 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.221822 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.221865 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.221876 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.221899 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.221912 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:27Z","lastTransitionTime":"2026-03-14T08:29:27Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.324192 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.324498 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.324635 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.324726 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.324811 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:27Z","lastTransitionTime":"2026-03-14T08:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.420078 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:29:27 crc kubenswrapper[4886]: E0314 08:29:27.420572 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.420604 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.420428 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:29:27 crc kubenswrapper[4886]: E0314 08:29:27.420957 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:29:27 crc kubenswrapper[4886]: E0314 08:29:27.421177 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.427352 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.427528 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.427623 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.427749 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.427847 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:27Z","lastTransitionTime":"2026-03-14T08:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.531405 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.532113 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.532169 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.532186 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.532196 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:27Z","lastTransitionTime":"2026-03-14T08:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.635744 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.635796 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.635806 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.635823 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.635834 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:27Z","lastTransitionTime":"2026-03-14T08:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.738485 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.738538 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.738548 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.738565 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.738576 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:27Z","lastTransitionTime":"2026-03-14T08:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.841681 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.841729 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.841746 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.841768 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.841787 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:27Z","lastTransitionTime":"2026-03-14T08:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.884787 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" event={"ID":"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea","Type":"ContainerStarted","Data":"189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8"} Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.884839 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" event={"ID":"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea","Type":"ContainerStarted","Data":"879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42"} Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.884856 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" event={"ID":"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea","Type":"ContainerStarted","Data":"dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b"} Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.887994 4886 generic.go:334] "Generic (PLEG): container finished" podID="b9532c0a-d4bd-4454-b521-bf157bf3707c" containerID="3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202" exitCode=0 Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.888029 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" event={"ID":"b9532c0a-d4bd-4454-b521-bf157bf3707c","Type":"ContainerDied","Data":"3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202"} Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.900389 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 
08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.912019 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.921413 4886 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.929313 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.938189 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 
08:29:27.944213 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.944249 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.944259 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.944273 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.944283 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:27Z","lastTransitionTime":"2026-03-14T08:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.949220 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.964026 
4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.975998 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:27 crc kubenswrapper[4886]: I0314 08:29:27.984866 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.001838 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"n
ame\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 
08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.011476 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.026271 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.046520 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.046583 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.046604 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.046630 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.046649 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:28Z","lastTransitionTime":"2026-03-14T08:29:28Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.149315 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.149361 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.149373 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.149391 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.149403 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:28Z","lastTransitionTime":"2026-03-14T08:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.252346 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.252382 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.252392 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.252408 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.252419 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:28Z","lastTransitionTime":"2026-03-14T08:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.354709 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.354760 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.354771 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.354789 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.354800 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:28Z","lastTransitionTime":"2026-03-14T08:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.458481 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.458606 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.458629 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.458661 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.458687 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:28Z","lastTransitionTime":"2026-03-14T08:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.561812 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.561874 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.561892 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.561918 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.561936 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:28Z","lastTransitionTime":"2026-03-14T08:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.665236 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.665304 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.665322 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.665389 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.665409 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:28Z","lastTransitionTime":"2026-03-14T08:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.767519 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.767584 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.767603 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.767622 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.767636 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:28Z","lastTransitionTime":"2026-03-14T08:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.869822 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.869878 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.869887 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.869902 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.869911 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:28Z","lastTransitionTime":"2026-03-14T08:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.892509 4886 generic.go:334] "Generic (PLEG): container finished" podID="b9532c0a-d4bd-4454-b521-bf157bf3707c" containerID="f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d" exitCode=0 Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.892571 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" event={"ID":"b9532c0a-d4bd-4454-b521-bf157bf3707c","Type":"ContainerDied","Data":"f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d"} Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.896204 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" event={"ID":"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea","Type":"ContainerStarted","Data":"43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025"} Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.896235 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" event={"ID":"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea","Type":"ContainerStarted","Data":"9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873"} Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.896246 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" event={"ID":"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea","Type":"ContainerStarted","Data":"89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa"} Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.904860 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.920264 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.931667 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5
eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.945525 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.952761 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.961353 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 
08:29:28.971820 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.972386 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.972443 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.972461 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.972486 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.972502 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:28Z","lastTransitionTime":"2026-03-14T08:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:28 crc kubenswrapper[4886]: I0314 08:29:28.982550 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.005467 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fb
dbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.018995 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.061830 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.075307 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.075344 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.075356 4886 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.075375 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.075387 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:29Z","lastTransitionTime":"2026-03-14T08:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.078451 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.181901 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.182361 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 
08:29:29.182375 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.182392 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.182405 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:29Z","lastTransitionTime":"2026-03-14T08:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.284624 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.284671 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.284684 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.284702 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.284714 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:29Z","lastTransitionTime":"2026-03-14T08:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.387349 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.387391 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.387401 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.387414 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.387424 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:29Z","lastTransitionTime":"2026-03-14T08:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.420593 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.420669 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.420751 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:29 crc kubenswrapper[4886]: E0314 08:29:29.420876 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:29:29 crc kubenswrapper[4886]: E0314 08:29:29.421043 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:29:29 crc kubenswrapper[4886]: E0314 08:29:29.421115 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.489381 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.489426 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.489441 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.489464 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.489510 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:29Z","lastTransitionTime":"2026-03-14T08:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.591486 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.591539 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.591557 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.591582 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.591600 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:29Z","lastTransitionTime":"2026-03-14T08:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.694352 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.694413 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.694431 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.694455 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.694472 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:29Z","lastTransitionTime":"2026-03-14T08:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.796786 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.796858 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.796884 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.796916 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.796939 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:29Z","lastTransitionTime":"2026-03-14T08:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.899255 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.899308 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.899325 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.899348 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.899367 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:29Z","lastTransitionTime":"2026-03-14T08:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.903202 4886 generic.go:334] "Generic (PLEG): container finished" podID="b9532c0a-d4bd-4454-b521-bf157bf3707c" containerID="08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533" exitCode=0 Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.903301 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" event={"ID":"b9532c0a-d4bd-4454-b521-bf157bf3707c","Type":"ContainerDied","Data":"08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533"} Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.925476 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.940228 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.956156 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.974875 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:29 crc kubenswrapper[4886]: I0314 08:29:29.986918 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.002168 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.002227 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.002247 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.002270 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.002286 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:30Z","lastTransitionTime":"2026-03-14T08:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.018011 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.036871 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.055717 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 
08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.070895 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.080064 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.091827 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 
08:29:30.105364 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.105416 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.105432 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.105452 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.105467 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:30Z","lastTransitionTime":"2026-03-14T08:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.107387 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.208459 
4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.208537 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.208554 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.208576 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.208592 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:30Z","lastTransitionTime":"2026-03-14T08:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.310993 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.311038 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.311054 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.311075 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.311093 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:30Z","lastTransitionTime":"2026-03-14T08:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.413806 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.413844 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.413853 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.413865 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.413874 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:30Z","lastTransitionTime":"2026-03-14T08:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.515718 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.516039 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.516048 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.516062 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.516072 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:30Z","lastTransitionTime":"2026-03-14T08:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.617701 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.617725 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.617734 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.617747 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.617755 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:30Z","lastTransitionTime":"2026-03-14T08:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.719873 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.719910 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.719919 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.719934 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.719944 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:30Z","lastTransitionTime":"2026-03-14T08:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.823170 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.823226 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.823245 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.823269 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.823286 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:30Z","lastTransitionTime":"2026-03-14T08:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.908886 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1"} Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.913558 4886 generic.go:334] "Generic (PLEG): container finished" podID="b9532c0a-d4bd-4454-b521-bf157bf3707c" containerID="5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967" exitCode=0 Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.913667 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" event={"ID":"b9532c0a-d4bd-4454-b521-bf157bf3707c","Type":"ContainerDied","Data":"5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967"} Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.917251 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1"} Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.917302 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295"} Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.925100 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.925151 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:30 crc 
kubenswrapper[4886]: I0314 08:29:30.925161 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.925152 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" event={"ID":"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea","Type":"ContainerStarted","Data":"3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a"} Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.925174 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.925246 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:30Z","lastTransitionTime":"2026-03-14T08:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.940659 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.966529 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.980152 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5
eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:30 crc kubenswrapper[4886]: I0314 08:29:30.997615 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38
811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.014842 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.025524 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.027869 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.027917 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.027937 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.027962 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.027978 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:31Z","lastTransitionTime":"2026-03-14T08:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.039395 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.051894 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.072358 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.090953 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.110215 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.130659 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.130709 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.130723 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.130744 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.130760 4886 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:31Z","lastTransitionTime":"2026-03-14T08:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.132178 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.151962 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-c
onfig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.171689 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.190923 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.214779 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.229505 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.233347 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.233413 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.233432 4886 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.233455 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.233472 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:31Z","lastTransitionTime":"2026-03-14T08:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.251638 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.273405 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5
eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.293116 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.305433 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.315787 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.329088 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.336023 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.336054 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.336064 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.336081 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.336093 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:31Z","lastTransitionTime":"2026-03-14T08:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.350725 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:31Z 
is after 2025-08-24T17:21:41Z" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.419736 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.419778 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.419800 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:29:31 crc kubenswrapper[4886]: E0314 08:29:31.419949 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:29:31 crc kubenswrapper[4886]: E0314 08:29:31.420436 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:29:31 crc kubenswrapper[4886]: E0314 08:29:31.420604 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.438302 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.438352 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.438369 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.438390 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.438408 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:31Z","lastTransitionTime":"2026-03-14T08:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.540679 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.540742 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.540767 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.540792 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.540813 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:31Z","lastTransitionTime":"2026-03-14T08:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.644313 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.644349 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.644361 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.644376 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.644385 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:31Z","lastTransitionTime":"2026-03-14T08:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.747780 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.747827 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.747837 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.747853 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.747864 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:31Z","lastTransitionTime":"2026-03-14T08:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.850620 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.850670 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.850683 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.850702 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.850714 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:31Z","lastTransitionTime":"2026-03-14T08:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.931650 4886 generic.go:334] "Generic (PLEG): container finished" podID="b9532c0a-d4bd-4454-b521-bf157bf3707c" containerID="60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6" exitCode=0 Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.931695 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" event={"ID":"b9532c0a-d4bd-4454-b521-bf157bf3707c","Type":"ContainerDied","Data":"60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6"} Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.955002 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.955048 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.955062 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.955082 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.955095 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:31Z","lastTransitionTime":"2026-03-14T08:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.956671 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.975464 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.986348 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.989946 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-bzkzj"] Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.990305 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bzkzj" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.991903 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.992180 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.992405 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.994442 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 14 08:29:31 crc kubenswrapper[4886]: I0314 08:29:31.998993 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6b
f9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:31Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.015746 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5-serviceca\") pod \"node-ca-bzkzj\" (UID: \"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\") " pod="openshift-image-registry/node-ca-bzkzj" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.015876 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host\" (UniqueName: \"kubernetes.io/host-path/f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5-host\") pod \"node-ca-bzkzj\" (UID: \"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\") " pod="openshift-image-registry/node-ca-bzkzj" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.015959 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx6l8\" (UniqueName: \"kubernetes.io/projected/f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5-kube-api-access-cx6l8\") pod \"node-ca-bzkzj\" (UID: \"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\") " pod="openshift-image-registry/node-ca-bzkzj" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.021431 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93
a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\
\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.039380 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.055187 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e
947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.056811 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.056856 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.056868 4886 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.056883 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.056894 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:32Z","lastTransitionTime":"2026-03-14T08:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.069075 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:32Z is after 2025-08-24T17:21:41Z" 
Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.084487 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.096534 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.111225 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.116703 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5-host\") pod \"node-ca-bzkzj\" (UID: \"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\") " pod="openshift-image-registry/node-ca-bzkzj" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.116752 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx6l8\" (UniqueName: \"kubernetes.io/projected/f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5-kube-api-access-cx6l8\") pod \"node-ca-bzkzj\" (UID: \"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\") " pod="openshift-image-registry/node-ca-bzkzj" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.116794 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5-serviceca\") pod \"node-ca-bzkzj\" (UID: \"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\") " pod="openshift-image-registry/node-ca-bzkzj" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.116859 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5-host\") pod \"node-ca-bzkzj\" (UID: \"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\") " pod="openshift-image-registry/node-ca-bzkzj" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.117736 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5-serviceca\") pod \"node-ca-bzkzj\" (UID: \"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\") " pod="openshift-image-registry/node-ca-bzkzj" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.136014 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx6l8\" (UniqueName: \"kubernetes.io/projected/f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5-kube-api-access-cx6l8\") pod \"node-ca-bzkzj\" (UID: \"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\") " pod="openshift-image-registry/node-ca-bzkzj" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.138784 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.150142 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bzkzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cx6l8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bzkzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.162773 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.162818 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.162828 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 
08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.162846 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.162856 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:32Z","lastTransitionTime":"2026-03-14T08:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.167854 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 
08:29:32.180886 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.190664 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.202615 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.213593 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.230360 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.252738 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.265003 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.265045 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.265058 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.265074 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.265085 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:32Z","lastTransitionTime":"2026-03-14T08:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.269730 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.288041 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube
-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T08:29:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.305566 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/h
ost/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-14T08:29:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.306071 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bzkzj" Mar 14 08:29:32 crc kubenswrapper[4886]: W0314 08:29:32.319311 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf24061e7_5b3a_4dad_8ae1_c1b8d92e1ce5.slice/crio-0fc66ea9fa717df9255d796fa24cbe0cd1eebb23eb58d5ebca0140547ba4eeed WatchSource:0}: Error finding container 0fc66ea9fa717df9255d796fa24cbe0cd1eebb23eb58d5ebca0140547ba4eeed: Status 404 returned error can't find the container with id 0fc66ea9fa717df9255d796fa24cbe0cd1eebb23eb58d5ebca0140547ba4eeed Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.325771 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.334599 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.367065 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.367096 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.367108 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.367139 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.367151 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:32Z","lastTransitionTime":"2026-03-14T08:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.470848 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.470879 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.470888 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.470901 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.470910 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:32Z","lastTransitionTime":"2026-03-14T08:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.573377 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.573436 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.573457 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.573482 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.573499 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:32Z","lastTransitionTime":"2026-03-14T08:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.676246 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.676277 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.676285 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.676301 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.676309 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:32Z","lastTransitionTime":"2026-03-14T08:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.778207 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.778242 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.778253 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.778272 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.778282 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:32Z","lastTransitionTime":"2026-03-14T08:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.881596 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.881631 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.881647 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.881667 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.881684 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:32Z","lastTransitionTime":"2026-03-14T08:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.940891 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" event={"ID":"b9532c0a-d4bd-4454-b521-bf157bf3707c","Type":"ContainerStarted","Data":"5d5b299aa9cb642a5c353afe205bf525208ef031049a534fb168134b80e37dfb"} Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.943550 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bzkzj" event={"ID":"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5","Type":"ContainerStarted","Data":"317ae74e8ce3a76caaf0a657a02fcbfaab927eed54822f633b43421e5ee51d28"} Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.943579 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bzkzj" event={"ID":"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5","Type":"ContainerStarted","Data":"0fc66ea9fa717df9255d796fa24cbe0cd1eebb23eb58d5ebca0140547ba4eeed"} Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.953535 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" event={"ID":"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea","Type":"ContainerStarted","Data":"9818e670cf7ff9435c00d98d089d88f95cb8b0b354b2fcf66435af0ff2211dcc"} Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.955219 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.955373 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.955434 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.956855 4886 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.958329 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"e7df37466e5586325c561414b674337b6090e6eee2ac3d5f25ff706b81c76512"} Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.975784 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.985006 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.985061 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.985074 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.985094 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.985107 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:32Z","lastTransitionTime":"2026-03-14T08:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.985991 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.990473 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:32 crc kubenswrapper[4886]: I0314 08:29:32.990516 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:32Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.002741 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.014001 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.025298 4886 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.036269 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.046154 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.063011 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"c
ontainerID\\\":\\\"cri-o://7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.075460 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.086606 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.093617 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.093679 4886 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.093697 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.093715 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.093725 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:33Z","lastTransitionTime":"2026-03-14T08:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.109766 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d5b299aa9cb642a5c353afe205bf525208ef031049a534fb168134b80e37dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05
fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.123137 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bzkzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cx6l8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bzkzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.143346 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 
08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.159043 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.173095 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.181977 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.191378 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.198218 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.198250 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.198259 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.198275 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.198284 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:33Z","lastTransitionTime":"2026-03-14T08:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.206089 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:33Z 
is after 2025-08-24T17:21:41Z" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.218427 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.230109 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.231744 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:29:33 crc kubenswrapper[4886]: E0314 08:29:33.231991 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:29:49.231959408 +0000 UTC m=+124.480411075 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.232156 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.232205 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:33 crc kubenswrapper[4886]: E0314 08:29:33.232378 4886 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:29:33 crc kubenswrapper[4886]: E0314 08:29:33.232422 4886 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:29:33 crc kubenswrapper[4886]: E0314 08:29:33.232468 4886 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 08:29:49.232448052 +0000 UTC m=+124.480899689 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:29:33 crc kubenswrapper[4886]: E0314 08:29:33.232489 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 08:29:49.232480833 +0000 UTC m=+124.480932460 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.242396 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7df37466e5586325c561414b674337b6090e6eee2ac3d5f25ff706b81c76512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.264268 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d5b299aa9cb642a5c353afe205bf525208ef031049a534fb168134b80e37dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144
e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursi
veReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false
,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77
fd09d11768ff6234de2b0c621c5df6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.278965 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bzkzj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://317ae74e8ce3a76caaf0a657a02fcbfaab927eed54822f633b43421e5ee51d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cx6l8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bzkzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.291106 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.301589 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.301639 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.301652 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.301671 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.301684 4886 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:33Z","lastTransitionTime":"2026-03-14T08:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.317955 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9818e670cf7ff9435c00d98d089d88f95cb8b0b354b2fcf66435af0ff2211dcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:33Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.333455 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.333524 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:29:33 crc kubenswrapper[4886]: E0314 08:29:33.333718 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:29:33 crc kubenswrapper[4886]: E0314 08:29:33.333755 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:29:33 crc kubenswrapper[4886]: E0314 08:29:33.333773 4886 projected.go:194] Error preparing data for projected volume 
kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:29:33 crc kubenswrapper[4886]: E0314 08:29:33.333838 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 08:29:49.333816915 +0000 UTC m=+124.582268562 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:29:33 crc kubenswrapper[4886]: E0314 08:29:33.334173 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:29:33 crc kubenswrapper[4886]: E0314 08:29:33.334314 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:29:33 crc kubenswrapper[4886]: E0314 08:29:33.334420 4886 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:29:33 crc kubenswrapper[4886]: E0314 08:29:33.334587 4886 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 08:29:49.334566126 +0000 UTC m=+124.583017773 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.405196 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.405629 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.405827 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.405995 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.406173 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:33Z","lastTransitionTime":"2026-03-14T08:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.420368 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.420501 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:33 crc kubenswrapper[4886]: E0314 08:29:33.420621 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.420855 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:29:33 crc kubenswrapper[4886]: E0314 08:29:33.420986 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:29:33 crc kubenswrapper[4886]: E0314 08:29:33.421493 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.509885 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.509997 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.510019 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.510048 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.510070 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:33Z","lastTransitionTime":"2026-03-14T08:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.613663 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.613725 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.613745 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.613775 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.613798 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:33Z","lastTransitionTime":"2026-03-14T08:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.716165 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.716241 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.716262 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.716294 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.716314 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:33Z","lastTransitionTime":"2026-03-14T08:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.818719 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.819010 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.819093 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.819279 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.819332 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:33Z","lastTransitionTime":"2026-03-14T08:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.922206 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.922292 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.922312 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.922345 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:33 crc kubenswrapper[4886]: I0314 08:29:33.922366 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:33Z","lastTransitionTime":"2026-03-14T08:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.024681 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.024748 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.024766 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.024792 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.024818 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:34Z","lastTransitionTime":"2026-03-14T08:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.128063 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.128161 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.128186 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.128213 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.128233 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:34Z","lastTransitionTime":"2026-03-14T08:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.231270 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.231341 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.231359 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.231384 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.231401 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:34Z","lastTransitionTime":"2026-03-14T08:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.334459 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.334550 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.334575 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.334609 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.334636 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:34Z","lastTransitionTime":"2026-03-14T08:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.437910 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.437979 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.438004 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.438038 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.438065 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:34Z","lastTransitionTime":"2026-03-14T08:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.540725 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.540786 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.540805 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.540834 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.540856 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:34Z","lastTransitionTime":"2026-03-14T08:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.645575 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.645639 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.645663 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.645692 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.645712 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:34Z","lastTransitionTime":"2026-03-14T08:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.749037 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.749105 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.749171 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.749207 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.749231 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:34Z","lastTransitionTime":"2026-03-14T08:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.852222 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.852280 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.852293 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.852318 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.852338 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:34Z","lastTransitionTime":"2026-03-14T08:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.954832 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.954897 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.954914 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.954947 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:34 crc kubenswrapper[4886]: I0314 08:29:34.954965 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:34Z","lastTransitionTime":"2026-03-14T08:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.057343 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.057433 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.057455 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.057487 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.057507 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:35Z","lastTransitionTime":"2026-03-14T08:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.160219 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.160275 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.160287 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.160309 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.160324 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:35Z","lastTransitionTime":"2026-03-14T08:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.262488 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.262535 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.262544 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.262558 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.262570 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:35Z","lastTransitionTime":"2026-03-14T08:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.365636 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.365691 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.365710 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.365733 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.365751 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:35Z","lastTransitionTime":"2026-03-14T08:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.420640 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.420695 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.420783 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:35 crc kubenswrapper[4886]: E0314 08:29:35.420851 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:29:35 crc kubenswrapper[4886]: E0314 08:29:35.421025 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:29:35 crc kubenswrapper[4886]: E0314 08:29:35.421109 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.438914 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.464225 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9818e670cf7ff9435c00d98d089d88f95cb8b0b354b2fcf66435af0ff2211dcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.467940 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.468048 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.468079 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.468116 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.468182 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:35Z","lastTransitionTime":"2026-03-14T08:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.486621 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.516890 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.539895 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.554317 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.571858 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.571908 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.571921 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.571942 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.571959 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:35Z","lastTransitionTime":"2026-03-14T08:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.573205 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.590532 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.605449 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.622294 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.639308 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7df37466e5586325c561414b674337b6090e6eee2ac3d5f25ff706b81c76512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.664728 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d5b299aa9cb642a5c353afe205bf525208ef031049a534fb168134b80e37dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.675210 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bzkzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://317ae74e8ce3a76caaf0a657a02fcbfaab927eed54822f633b43421e5ee51d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cx6l8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bzkzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.675698 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.675721 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.675732 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.675747 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.675758 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:35Z","lastTransitionTime":"2026-03-14T08:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.779139 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.779220 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.779240 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.779270 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.779291 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:35Z","lastTransitionTime":"2026-03-14T08:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.883083 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.883630 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.883826 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.884038 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.884261 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:35Z","lastTransitionTime":"2026-03-14T08:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.972600 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ms4h7_f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea/ovnkube-controller/0.log"
Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.977940 4886 generic.go:334] "Generic (PLEG): container finished" podID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerID="9818e670cf7ff9435c00d98d089d88f95cb8b0b354b2fcf66435af0ff2211dcc" exitCode=1
Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.978021 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" event={"ID":"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea","Type":"ContainerDied","Data":"9818e670cf7ff9435c00d98d089d88f95cb8b0b354b2fcf66435af0ff2211dcc"}
Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.979413 4886 scope.go:117] "RemoveContainer" containerID="9818e670cf7ff9435c00d98d089d88f95cb8b0b354b2fcf66435af0ff2211dcc"
Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.997176 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.997226 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.997239 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.997257 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 14 08:29:35 crc kubenswrapper[4886]: I0314 08:29:35.997268 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:35Z","lastTransitionTime":"2026-03-14T08:29:35Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.007386 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.017195 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.017295 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.017323 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.017360 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.017384 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:36Z","lastTransitionTime":"2026-03-14T08:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.029460 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:36 crc kubenswrapper[4886]: E0314 08:29:36.032571 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.037813 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.037855 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.037872 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.037890 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.037901 4886 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:36Z","lastTransitionTime":"2026-03-14T08:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.042257 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-14T08:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:36 crc kubenswrapper[4886]: E0314 08:29:36.053338 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.057825 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.057872 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.057889 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.057914 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.057930 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:36Z","lastTransitionTime":"2026-03-14T08:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.060455 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:36Z 
is after 2025-08-24T17:21:41Z" Mar 14 08:29:36 crc kubenswrapper[4886]: E0314 08:29:36.071856 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.076410 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.078827 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.078877 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:36 crc 
kubenswrapper[4886]: I0314 08:29:36.078930 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.078951 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.079349 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:36Z","lastTransitionTime":"2026-03-14T08:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:36 crc kubenswrapper[4886]: E0314 08:29:36.094178 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.097318 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.099040 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.099178 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.099201 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:36 crc 
kubenswrapper[4886]: I0314 08:29:36.099228 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.099245 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:36Z","lastTransitionTime":"2026-03-14T08:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:36 crc kubenswrapper[4886]: E0314 08:29:36.113181 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"message\\\":\\\"kubelet has 
sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff54
7002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cd
fc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"nam
es\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:36 crc kubenswrapper[4886]: E0314 08:29:36.113451 4886 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.115503 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.115571 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.115588 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.115611 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.115630 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:36Z","lastTransitionTime":"2026-03-14T08:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.118495 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7df37466e5586325c561414b674337b6090e6eee2ac3d5f25ff706b81c76512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.138767 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d5b299aa9cb642a5c353afe205bf525208ef031049a534fb168134b80e37dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1
afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e931647
77860db52e42d092e3f0546967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-14T08:29:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.152384 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bzkzj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://317ae74e8ce3a76caaf0a657a02fcbfaab927eed54822f633b43421e5ee51d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cx6l8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bzkzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.174259 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.193146 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9818e670cf7ff9435c00d98d089d88f95cb8b0b354b2fcf66435af0ff2211dcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9818e670cf7ff9435c00d98d089d88f95cb8b0b354b2fcf66435af0ff2211dcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:29:35Z\\\",\\\"message\\\":\\\"ector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0314 08:29:35.465025 6704 handler.go:190] Sending *v1.Namespace 
event handler 1 for removal\\\\nI0314 08:29:35.465041 6704 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0314 08:29:35.465135 6704 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0314 08:29:35.465140 6704 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0314 08:29:35.465193 6704 factory.go:656] Stopping watch factory\\\\nI0314 08:29:35.465211 6704 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0314 08:29:35.465222 6704 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0314 08:29:35.465231 6704 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0314 08:29:35.465481 6704 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0314 08:29:35.465620 6704 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:35.465685 6704 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc
86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.215317 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5
eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.218730 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.218784 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.218797 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.218854 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.218875 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:36Z","lastTransitionTime":"2026-03-14T08:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.237061 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\
" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:36Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.322477 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.322534 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.322549 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.322573 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.322591 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:36Z","lastTransitionTime":"2026-03-14T08:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.425429 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.425620 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.425651 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.425676 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.425695 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:36Z","lastTransitionTime":"2026-03-14T08:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.528510 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.528557 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.528573 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.528593 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.528705 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:36Z","lastTransitionTime":"2026-03-14T08:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.633838 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.634324 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.634350 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.634379 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.634399 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:36Z","lastTransitionTime":"2026-03-14T08:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.737452 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.737499 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.737510 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.737532 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.737548 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:36Z","lastTransitionTime":"2026-03-14T08:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.840772 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.840830 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.840843 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.840865 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.840881 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:36Z","lastTransitionTime":"2026-03-14T08:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.944509 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.944571 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.944584 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.944603 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.944616 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:36Z","lastTransitionTime":"2026-03-14T08:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.991725 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ms4h7_f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea/ovnkube-controller/0.log" Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.995159 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" event={"ID":"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea","Type":"ContainerStarted","Data":"38dd161218d4037527292fb96b271441859c1846bfa11e10661e9bef768cdb82"} Mar 14 08:29:36 crc kubenswrapper[4886]: I0314 08:29:36.996083 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.011813 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.022973 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.040911 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.046947 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.047010 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.047024 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.047043 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.047055 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:37Z","lastTransitionTime":"2026-03-14T08:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.055905 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:37Z 
is after 2025-08-24T17:21:41Z" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.070810 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7df37466e5586325c561414b674337b6090e6eee2ac3d5f25ff706b81c76512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.088298 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d5b299aa9cb642a5c353afe205bf525208ef031049a534fb168134b80e37dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-c
ni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/c
ni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.101850 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bzkzj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://317ae74e8ce3a76caaf0a657a02fcbfaab927eed54822f633b43421e5ee51d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cx6l8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bzkzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.118426 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14
T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.134204 4886 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.148824 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.149086 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.149234 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.149393 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.149511 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:37Z","lastTransitionTime":"2026-03-14T08:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.152426 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38dd161218d4037527292fb96b271441859c1846bfa11e10661e9bef768cdb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9818e670cf7ff9435c00d98d089d88f95cb8b0b354b2fcf66435af0ff2211dcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:29:35Z\\\",\\\"message\\\":\\\"ector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0314 08:29:35.465025 6704 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0314 08:29:35.465041 6704 handler.go:190] Sending *v1.Namespace event handler 5 for 
removal\\\\nI0314 08:29:35.465135 6704 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0314 08:29:35.465140 6704 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0314 08:29:35.465193 6704 factory.go:656] Stopping watch factory\\\\nI0314 08:29:35.465211 6704 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0314 08:29:35.465222 6704 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0314 08:29:35.465231 6704 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0314 08:29:35.465481 6704 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0314 08:29:35.465620 6704 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:35.465685 6704 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.167031 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.181224 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.200383 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.253302 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.253378 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.253397 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.253427 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.253451 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:37Z","lastTransitionTime":"2026-03-14T08:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.356910 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.356967 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.356984 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.357007 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.357028 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:37Z","lastTransitionTime":"2026-03-14T08:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.420822 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.420829 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.421006 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:29:37 crc kubenswrapper[4886]: E0314 08:29:37.421026 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:29:37 crc kubenswrapper[4886]: E0314 08:29:37.421339 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:29:37 crc kubenswrapper[4886]: E0314 08:29:37.421239 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.460875 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.460925 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.460941 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.460971 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.460990 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:37Z","lastTransitionTime":"2026-03-14T08:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.563732 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.563797 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.563816 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.563842 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.563860 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:37Z","lastTransitionTime":"2026-03-14T08:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.666675 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.666747 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.666773 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.666806 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.666830 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:37Z","lastTransitionTime":"2026-03-14T08:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.770187 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.770268 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.770288 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.770319 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.770341 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:37Z","lastTransitionTime":"2026-03-14T08:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.872973 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.873051 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.873072 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.873102 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.873163 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:37Z","lastTransitionTime":"2026-03-14T08:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.976006 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.976072 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.976089 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.976113 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:37 crc kubenswrapper[4886]: I0314 08:29:37.976159 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:37Z","lastTransitionTime":"2026-03-14T08:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.003432 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ms4h7_f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea/ovnkube-controller/1.log" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.004810 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ms4h7_f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea/ovnkube-controller/0.log" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.010047 4886 generic.go:334] "Generic (PLEG): container finished" podID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerID="38dd161218d4037527292fb96b271441859c1846bfa11e10661e9bef768cdb82" exitCode=1 Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.010163 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" event={"ID":"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea","Type":"ContainerDied","Data":"38dd161218d4037527292fb96b271441859c1846bfa11e10661e9bef768cdb82"} Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.010244 4886 scope.go:117] "RemoveContainer" containerID="9818e670cf7ff9435c00d98d089d88f95cb8b0b354b2fcf66435af0ff2211dcc" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.011305 4886 scope.go:117] "RemoveContainer" containerID="38dd161218d4037527292fb96b271441859c1846bfa11e10661e9bef768cdb82" Mar 14 08:29:38 crc kubenswrapper[4886]: E0314 08:29:38.011626 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-ms4h7_openshift-ovn-kubernetes(f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.040346 4886 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"ima
geID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1
b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.067423 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.078909 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.079189 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.079311 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.079404 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.079493 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:38Z","lastTransitionTime":"2026-03-14T08:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.090278 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.093450 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q"] Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.094044 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.098533 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.100198 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.112574 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6b
f9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.134408 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.162964 4886 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.182818 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.182883 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.182903 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.182932 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.182951 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:38Z","lastTransitionTime":"2026-03-14T08:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.190485 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d5b299aa9cb642a5c353afe205bf525208ef031049a534fb168134b80e37dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.193956 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/30290a10-cfb2-4981-b885-384f20bea696-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ftn4q\" (UID: \"30290a10-cfb2-4981-b885-384f20bea696\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.194196 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/30290a10-cfb2-4981-b885-384f20bea696-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ftn4q\" (UID: \"30290a10-cfb2-4981-b885-384f20bea696\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.194328 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/30290a10-cfb2-4981-b885-384f20bea696-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ftn4q\" (UID: \"30290a10-cfb2-4981-b885-384f20bea696\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" Mar 14 08:29:38 crc 
kubenswrapper[4886]: I0314 08:29:38.194532 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8np2t\" (UniqueName: \"kubernetes.io/projected/30290a10-cfb2-4981-b885-384f20bea696-kube-api-access-8np2t\") pod \"ovnkube-control-plane-749d76644c-ftn4q\" (UID: \"30290a10-cfb2-4981-b885-384f20bea696\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.208866 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bzkzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://317ae74e8ce3a76caaf0a657a02fcbfaab927eed54822f633b43421e5ee51d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cx6l8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bzkzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.229050 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.248722 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.267218 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7df37466e5586325c561414b674337b6090e6eee2ac3d5f25ff706b81c76512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.286159 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.286415 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.286498 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.286588 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.286672 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:38Z","lastTransitionTime":"2026-03-14T08:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.290660 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.295792 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/30290a10-cfb2-4981-b885-384f20bea696-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ftn4q\" (UID: \"30290a10-cfb2-4981-b885-384f20bea696\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.295878 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/30290a10-cfb2-4981-b885-384f20bea696-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ftn4q\" (UID: \"30290a10-cfb2-4981-b885-384f20bea696\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 
08:29:38.295913 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/30290a10-cfb2-4981-b885-384f20bea696-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ftn4q\" (UID: \"30290a10-cfb2-4981-b885-384f20bea696\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.295943 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8np2t\" (UniqueName: \"kubernetes.io/projected/30290a10-cfb2-4981-b885-384f20bea696-kube-api-access-8np2t\") pod \"ovnkube-control-plane-749d76644c-ftn4q\" (UID: \"30290a10-cfb2-4981-b885-384f20bea696\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.296620 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/30290a10-cfb2-4981-b885-384f20bea696-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ftn4q\" (UID: \"30290a10-cfb2-4981-b885-384f20bea696\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.296981 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/30290a10-cfb2-4981-b885-384f20bea696-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ftn4q\" (UID: \"30290a10-cfb2-4981-b885-384f20bea696\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.304716 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/30290a10-cfb2-4981-b885-384f20bea696-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ftn4q\" (UID: 
\"30290a10-cfb2-4981-b885-384f20bea696\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.333733 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8np2t\" (UniqueName: \"kubernetes.io/projected/30290a10-cfb2-4981-b885-384f20bea696-kube-api-access-8np2t\") pod \"ovnkube-control-plane-749d76644c-ftn4q\" (UID: \"30290a10-cfb2-4981-b885-384f20bea696\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.334333 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38dd161218d4037527292fb96b271441859c1846bfa11e10661e9bef768cdb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9818e670cf7ff9435c00d98d089d88f95cb8b0b354b2fcf66435af0ff2211dcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:29:35Z\\\",\\\"message\\\":\\\"ector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0314 08:29:35.465025 6704 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0314 08:29:35.465041 6704 handler.go:190] Sending *v1.Namespace event handler 5 for 
removal\\\\nI0314 08:29:35.465135 6704 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0314 08:29:35.465140 6704 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0314 08:29:35.465193 6704 factory.go:656] Stopping watch factory\\\\nI0314 08:29:35.465211 6704 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0314 08:29:35.465222 6704 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0314 08:29:35.465231 6704 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0314 08:29:35.465481 6704 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0314 08:29:35.465620 6704 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:35.465685 6704 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38dd161218d4037527292fb96b271441859c1846bfa11e10661e9bef768cdb82\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:29:37Z\\\",\\\"message\\\":\\\"er/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.111574 6854 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.111872 6854 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.112362 6854 handler.go:190] 
Sending *v1.Namespace event handler 1 for removal\\\\nI0314 08:29:37.112438 6854 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0314 08:29:37.112497 6854 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0314 08:29:37.112531 6854 factory.go:656] Stopping watch factory\\\\nI0314 08:29:37.112575 6854 handler.go:208] Removed *v1.Node event handler 2\\\\nI0314 08:29:37.116090 6854 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0314 08:29:37.116111 6854 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0314 08:29:37.116186 6854 ovnkube.go:599] Stopped ovnkube\\\\nI0314 08:29:37.116227 6854 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0314 08:29:37.116315 6854 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\
\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.357305 4886 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.378742 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38dd161218d4037527292fb96b271441859c1846bfa11e10661e9bef768cdb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9818e670cf7ff9435c00d98d089d88f95cb8b0b354b2fcf66435af0ff2211dcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:29:35Z\\\",\\\"message\\\":\\\"ector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0314 08:29:35.465025 6704 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0314 08:29:35.465041 6704 handler.go:190] Sending *v1.Namespace event handler 5 for 
removal\\\\nI0314 08:29:35.465135 6704 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0314 08:29:35.465140 6704 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0314 08:29:35.465193 6704 factory.go:656] Stopping watch factory\\\\nI0314 08:29:35.465211 6704 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0314 08:29:35.465222 6704 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0314 08:29:35.465231 6704 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0314 08:29:35.465481 6704 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0314 08:29:35.465620 6704 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:35.465685 6704 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38dd161218d4037527292fb96b271441859c1846bfa11e10661e9bef768cdb82\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:29:37Z\\\",\\\"message\\\":\\\"er/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.111574 6854 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.111872 6854 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.112362 6854 handler.go:190] 
Sending *v1.Namespace event handler 1 for removal\\\\nI0314 08:29:37.112438 6854 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0314 08:29:37.112497 6854 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0314 08:29:37.112531 6854 factory.go:656] Stopping watch factory\\\\nI0314 08:29:37.112575 6854 handler.go:208] Removed *v1.Node event handler 2\\\\nI0314 08:29:37.116090 6854 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0314 08:29:37.116111 6854 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0314 08:29:37.116186 6854 ovnkube.go:599] Stopped ovnkube\\\\nI0314 08:29:37.116227 6854 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0314 08:29:37.116315 6854 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\
\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.389075 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 
08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.389328 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.389401 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.389500 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.389572 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:38Z","lastTransitionTime":"2026-03-14T08:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.391851 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5
eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.405373 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.409327 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.419157 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: W0314 08:29:38.420515 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30290a10_cfb2_4981_b885_384f20bea696.slice/crio-85a2af1fbdab791d58ffad392fbc21b07d24c8c1e9ae0aefa31e686f24c623de WatchSource:0}: Error finding container 85a2af1fbdab791d58ffad392fbc21b07d24c8c1e9ae0aefa31e686f24c623de: Status 404 returned error can't find the container with id 85a2af1fbdab791d58ffad392fbc21b07d24c8c1e9ae0aefa31e686f24c623de Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.434430 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.447477 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.458743 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.470687 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30290a10-cfb2-4981-b885-384f20bea696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ftn4q\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.482175 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.492393 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.492447 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.492463 4886 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.492488 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.492504 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:38Z","lastTransitionTime":"2026-03-14T08:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.498851 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.515504 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7df37466e5586325c561414b674337b6090e6eee2ac3d5f25ff706b81c76512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.536605 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d5b299aa9cb642a5c353afe205bf525208ef031049a534fb168134b80e37dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.549515 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bzkzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://317ae74e8ce3a76caaf0a657a02fcbfaab927eed54822f633b43421e5ee51d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cx6l8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bzkzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.595009 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.595068 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.595083 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.595109 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.595149 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:38Z","lastTransitionTime":"2026-03-14T08:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.697746 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.697795 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.697810 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.697829 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.697845 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:38Z","lastTransitionTime":"2026-03-14T08:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.800392 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.800424 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.800433 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.800448 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.800458 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:38Z","lastTransitionTime":"2026-03-14T08:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.868677 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-hq6j4"] Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.869839 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:29:38 crc kubenswrapper[4886]: E0314 08:29:38.870111 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.887158 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.902386 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.902436 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.902452 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.902475 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.902489 4886 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:38Z","lastTransitionTime":"2026-03-14T08:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.919232 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38dd161218d4037527292fb96b271441859c1846bfa11e10661e9bef768cdb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9818e670cf7ff9435c00d98d089d88f95cb8b0b354b2fcf66435af0ff2211dcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:29:35Z\\\",\\\"message\\\":\\\"ector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0314 08:29:35.465025 6704 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0314 08:29:35.465041 6704 handler.go:190] Sending *v1.Namespace event handler 5 for 
removal\\\\nI0314 08:29:35.465135 6704 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0314 08:29:35.465140 6704 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0314 08:29:35.465193 6704 factory.go:656] Stopping watch factory\\\\nI0314 08:29:35.465211 6704 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0314 08:29:35.465222 6704 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0314 08:29:35.465231 6704 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0314 08:29:35.465481 6704 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0314 08:29:35.465620 6704 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:35.465685 6704 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38dd161218d4037527292fb96b271441859c1846bfa11e10661e9bef768cdb82\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:29:37Z\\\",\\\"message\\\":\\\"er/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.111574 6854 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.111872 6854 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.112362 6854 handler.go:190] 
Sending *v1.Namespace event handler 1 for removal\\\\nI0314 08:29:37.112438 6854 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0314 08:29:37.112497 6854 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0314 08:29:37.112531 6854 factory.go:656] Stopping watch factory\\\\nI0314 08:29:37.112575 6854 handler.go:208] Removed *v1.Node event handler 2\\\\nI0314 08:29:37.116090 6854 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0314 08:29:37.116111 6854 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0314 08:29:37.116186 6854 ovnkube.go:599] Stopped ovnkube\\\\nI0314 08:29:37.116227 6854 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0314 08:29:37.116315 6854 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\
\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.943086 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"
/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-o
perator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.958182 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.969103 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.981420 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:38 crc kubenswrapper[4886]: I0314 08:29:38.993648 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:38Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.002440 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2r5j\" (UniqueName: \"kubernetes.io/projected/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-kube-api-access-t2r5j\") pod \"network-metrics-daemon-hq6j4\" (UID: \"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\") " pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.002645 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-metrics-certs\") pod \"network-metrics-daemon-hq6j4\" (UID: \"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\") " pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.004367 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.004460 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.004539 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.004601 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.004665 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:39Z","lastTransitionTime":"2026-03-14T08:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.010305 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\
\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.015196 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ms4h7_f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea/ovnkube-controller/1.log" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.021636 4886 scope.go:117] "RemoveContainer" containerID="38dd161218d4037527292fb96b271441859c1846bfa11e10661e9bef768cdb82" Mar 14 08:29:39 crc kubenswrapper[4886]: E0314 08:29:39.021863 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-ms4h7_openshift-ovn-kubernetes(f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.024548 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" event={"ID":"30290a10-cfb2-4981-b885-384f20bea696","Type":"ContainerStarted","Data":"83839fa017d4176a126c208931a2b7f1aec20655ec9e3b3bd8ec0f86681d60a1"} Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.024597 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" event={"ID":"30290a10-cfb2-4981-b885-384f20bea696","Type":"ContainerStarted","Data":"2da7c75a54a62332bd7164f6ea728d36082c6621ae6fb25ef15118c54169d865"} Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.024613 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" 
event={"ID":"30290a10-cfb2-4981-b885-384f20bea696","Type":"ContainerStarted","Data":"85a2af1fbdab791d58ffad392fbc21b07d24c8c1e9ae0aefa31e686f24c623de"} Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.028428 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq6j4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq6j4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:39 crc 
kubenswrapper[4886]: I0314 08:29:39.043073 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.056261 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.070345 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7df37466e5586325c561414b674337b6090e6eee2ac3d5f25ff706b81c76512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:29:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.087957 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d5b299aa9cb642a5c353afe205bf525208ef031049a534fb168134b80e37dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.101245 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bzkzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://317ae74e8ce3a76caaf0a657a02fcbfaab927eed54822f633b43421e5ee51d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cx6l8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bzkzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.103601 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2r5j\" (UniqueName: \"kubernetes.io/projected/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-kube-api-access-t2r5j\") pod \"network-metrics-daemon-hq6j4\" (UID: \"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\") " pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.104482 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-metrics-certs\") pod 
\"network-metrics-daemon-hq6j4\" (UID: \"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\") " pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:29:39 crc kubenswrapper[4886]: E0314 08:29:39.104539 4886 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:29:39 crc kubenswrapper[4886]: E0314 08:29:39.104597 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-metrics-certs podName:842ea68a-b5ee-4b60-8e98-26e2ff72ae3b nodeName:}" failed. No retries permitted until 2026-03-14 08:29:39.604576513 +0000 UTC m=+114.853028140 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-metrics-certs") pod "network-metrics-daemon-hq6j4" (UID: "842ea68a-b5ee-4b60-8e98-26e2ff72ae3b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.107356 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.107412 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.107431 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.107460 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.107481 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:39Z","lastTransitionTime":"2026-03-14T08:29:39Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.117634 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30290a10-cfb2-4981-b885-384f20bea696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ftn4q\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.126524 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2r5j\" (UniqueName: \"kubernetes.io/projected/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-kube-api-access-t2r5j\") pod \"network-metrics-daemon-hq6j4\" (UID: \"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\") " pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.133474 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.154096 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38dd161218d4037527292fb96b271441859c1846bfa11e10661e9bef768cdb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38dd161218d4037527292fb96b271441859c1846bfa11e10661e9bef768cdb82\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:29:37Z\\\",\\\"message\\\":\\\"er/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.111574 6854 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.111872 6854 reflector.go:311] Stopping 
reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.112362 6854 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0314 08:29:37.112438 6854 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0314 08:29:37.112497 6854 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0314 08:29:37.112531 6854 factory.go:656] Stopping watch factory\\\\nI0314 08:29:37.112575 6854 handler.go:208] Removed *v1.Node event handler 2\\\\nI0314 08:29:37.116090 6854 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0314 08:29:37.116111 6854 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0314 08:29:37.116186 6854 ovnkube.go:599] Stopped ovnkube\\\\nI0314 08:29:37.116227 6854 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0314 08:29:37.116315 6854 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ms4h7_openshift-ovn-kubernetes(f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c15
9a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.171405 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5
eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.188577 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.204082 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.210665 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.210708 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.210723 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.211033 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.211077 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:39Z","lastTransitionTime":"2026-03-14T08:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.216496 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.229560 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6b
f9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.243298 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.258815 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq6j4" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq6j4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:39 crc 
kubenswrapper[4886]: I0314 08:29:39.274222 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.288257 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.299865 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7df37466e5586325c561414b674337b6090e6eee2ac3d5f25ff706b81c76512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:29:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.314006 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.314079 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.314099 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.314152 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.314172 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:39Z","lastTransitionTime":"2026-03-14T08:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.316594 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d5b299aa9cb642a5c353afe205bf525208ef031049a534fb168134b80e37dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.333106 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bzkzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://317ae74e8ce3a76caaf0a657a02fcbfaab927eed54822f633b43421e5ee51d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cx6l8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bzkzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.351729 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30290a10-cfb2-4981-b885-384f20bea696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2da7c75a54a62332bd7164f6ea728d36082c6621ae6fb25ef15118c54169d865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83839fa017d4176a126c208931a2b7f1aec20
655ec9e3b3bd8ec0f86681d60a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ftn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:39Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.416864 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.416910 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.416921 4886 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.416940 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.416951 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:39Z","lastTransitionTime":"2026-03-14T08:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.420442 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.420520 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.420442 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:29:39 crc kubenswrapper[4886]: E0314 08:29:39.420585 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:29:39 crc kubenswrapper[4886]: E0314 08:29:39.420890 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:29:39 crc kubenswrapper[4886]: E0314 08:29:39.421108 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.421153 4886 scope.go:117] "RemoveContainer" containerID="a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.519863 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.519899 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.519908 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.519922 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.519932 
4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:39Z","lastTransitionTime":"2026-03-14T08:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.607792 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-metrics-certs\") pod \"network-metrics-daemon-hq6j4\" (UID: \"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\") " pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:29:39 crc kubenswrapper[4886]: E0314 08:29:39.607955 4886 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:29:39 crc kubenswrapper[4886]: E0314 08:29:39.608058 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-metrics-certs podName:842ea68a-b5ee-4b60-8e98-26e2ff72ae3b nodeName:}" failed. No retries permitted until 2026-03-14 08:29:40.608037628 +0000 UTC m=+115.856489275 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-metrics-certs") pod "network-metrics-daemon-hq6j4" (UID: "842ea68a-b5ee-4b60-8e98-26e2ff72ae3b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.622872 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.622919 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.622929 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.622945 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.622957 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:39Z","lastTransitionTime":"2026-03-14T08:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.726178 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.726222 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.726242 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.726280 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.726297 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:39Z","lastTransitionTime":"2026-03-14T08:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.829723 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.829761 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.829770 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.829785 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.829795 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:39Z","lastTransitionTime":"2026-03-14T08:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.932908 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.933242 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.933366 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.933460 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:39 crc kubenswrapper[4886]: I0314 08:29:39.933544 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:39Z","lastTransitionTime":"2026-03-14T08:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.030472 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.032448 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bf7a1c85d73654d4f738bf3a153bb99dd30e6a3fc651f22246d22f9d67054cd3"} Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.032990 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.036076 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.036321 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.036456 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.036597 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.036721 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:40Z","lastTransitionTime":"2026-03-14T08:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.052779 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:40Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.067230 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:40Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.086248 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7df37466e5586325c561414b674337b6090e6eee2ac3d5f25ff706b81c76512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:29:40Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.112670 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d5b299aa9cb642a5c353afe205bf525208ef031049a534fb168134b80e37dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:40Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.130560 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bzkzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://317ae74e8ce3a76caaf0a657a02fcbfaab927eed54822f633b43421e5ee51d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cx6l8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bzkzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:40Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.139842 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.140073 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.140254 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.140421 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.140559 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:40Z","lastTransitionTime":"2026-03-14T08:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.149539 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30290a10-cfb2-4981-b885-384f20bea696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2da7c75a54a62332bd7164f6ea728d36082c6621ae6fb25ef15118c54169d865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83839fa017d4176a126c208931a2b7f1aec20655ec9e3b3bd8ec0f86681d60a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ftn4q\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:40Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.170773 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:40Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.199323 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38dd161218d4037527292fb96b271441859c1846bfa11e10661e9bef768cdb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38dd161218d4037527292fb96b271441859c1846bfa11e10661e9bef768cdb82\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:29:37Z\\\",\\\"message\\\":\\\"er/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.111574 6854 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.111872 6854 reflector.go:311] Stopping 
reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.112362 6854 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0314 08:29:37.112438 6854 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0314 08:29:37.112497 6854 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0314 08:29:37.112531 6854 factory.go:656] Stopping watch factory\\\\nI0314 08:29:37.112575 6854 handler.go:208] Removed *v1.Node event handler 2\\\\nI0314 08:29:37.116090 6854 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0314 08:29:37.116111 6854 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0314 08:29:37.116186 6854 ovnkube.go:599] Stopped ovnkube\\\\nI0314 08:29:37.116227 6854 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0314 08:29:37.116315 6854 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ms4h7_openshift-ovn-kubernetes(f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c15
9a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:40Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.224576 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf7a1c85d73654d4f738bf3a153bb99dd30e6a3fc651f22246d22f9d67054cd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:40Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.243688 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.244273 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.244286 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.244302 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.244336 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:40Z","lastTransitionTime":"2026-03-14T08:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.245821 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:40Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.265178 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:40Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.282243 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:40Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.297601 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:40Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.313923 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:40Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.328205 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq6j4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq6j4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:40Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:40 crc 
kubenswrapper[4886]: I0314 08:29:40.346598 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.346642 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.346651 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.346666 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.346676 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:40Z","lastTransitionTime":"2026-03-14T08:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.420232 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:29:40 crc kubenswrapper[4886]: E0314 08:29:40.420378 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.449422 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.449465 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.449481 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.449504 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.449522 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:40Z","lastTransitionTime":"2026-03-14T08:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.552228 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.552275 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.552287 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.552310 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.552322 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:40Z","lastTransitionTime":"2026-03-14T08:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.619066 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-metrics-certs\") pod \"network-metrics-daemon-hq6j4\" (UID: \"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\") " pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:29:40 crc kubenswrapper[4886]: E0314 08:29:40.619302 4886 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:29:40 crc kubenswrapper[4886]: E0314 08:29:40.619404 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-metrics-certs podName:842ea68a-b5ee-4b60-8e98-26e2ff72ae3b nodeName:}" failed. No retries permitted until 2026-03-14 08:29:42.619378551 +0000 UTC m=+117.867830228 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-metrics-certs") pod "network-metrics-daemon-hq6j4" (UID: "842ea68a-b5ee-4b60-8e98-26e2ff72ae3b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.657088 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.657181 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.657199 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.657221 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.657238 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:40Z","lastTransitionTime":"2026-03-14T08:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.760301 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.760345 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.760355 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.760370 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.760379 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:40Z","lastTransitionTime":"2026-03-14T08:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.863261 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.863343 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.863361 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.863398 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.863430 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:40Z","lastTransitionTime":"2026-03-14T08:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.966988 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.967050 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.967063 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.967090 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:40 crc kubenswrapper[4886]: I0314 08:29:40.967109 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:40Z","lastTransitionTime":"2026-03-14T08:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.070020 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.070082 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.070100 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.070151 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.070169 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:41Z","lastTransitionTime":"2026-03-14T08:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.173044 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.173165 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.173187 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.173218 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.173241 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:41Z","lastTransitionTime":"2026-03-14T08:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.276105 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.276238 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.276263 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.276291 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.276309 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:41Z","lastTransitionTime":"2026-03-14T08:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.379856 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.379939 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.379958 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.379981 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.379999 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:41Z","lastTransitionTime":"2026-03-14T08:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.420216 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.420314 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:41 crc kubenswrapper[4886]: E0314 08:29:41.420454 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.420532 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:29:41 crc kubenswrapper[4886]: E0314 08:29:41.420676 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:29:41 crc kubenswrapper[4886]: E0314 08:29:41.420763 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.482621 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.482670 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.482686 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.482708 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.482724 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:41Z","lastTransitionTime":"2026-03-14T08:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.591561 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.591643 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.591667 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.591696 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.591718 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:41Z","lastTransitionTime":"2026-03-14T08:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.693425 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.693468 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.693479 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.693493 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.693506 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:41Z","lastTransitionTime":"2026-03-14T08:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.796327 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.796400 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.796417 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.796444 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.796461 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:41Z","lastTransitionTime":"2026-03-14T08:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.899486 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.899548 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.899564 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.899588 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:41 crc kubenswrapper[4886]: I0314 08:29:41.899606 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:41Z","lastTransitionTime":"2026-03-14T08:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.001491 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.001530 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.001539 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.001553 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.001563 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:42Z","lastTransitionTime":"2026-03-14T08:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.103592 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.103630 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.103639 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.103653 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.103663 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:42Z","lastTransitionTime":"2026-03-14T08:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.206187 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.206242 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.206254 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.206272 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.206285 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:42Z","lastTransitionTime":"2026-03-14T08:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.308627 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.308685 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.308696 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.308715 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.308727 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:42Z","lastTransitionTime":"2026-03-14T08:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.411692 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.411735 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.411746 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.411758 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.411767 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:42Z","lastTransitionTime":"2026-03-14T08:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.420070 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:29:42 crc kubenswrapper[4886]: E0314 08:29:42.420223 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.513984 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.514017 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.514026 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.514042 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.514053 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:42Z","lastTransitionTime":"2026-03-14T08:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.616156 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.616215 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.616231 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.616246 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.616256 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:42Z","lastTransitionTime":"2026-03-14T08:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.641500 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-metrics-certs\") pod \"network-metrics-daemon-hq6j4\" (UID: \"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\") " pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:29:42 crc kubenswrapper[4886]: E0314 08:29:42.641594 4886 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:29:42 crc kubenswrapper[4886]: E0314 08:29:42.641645 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-metrics-certs podName:842ea68a-b5ee-4b60-8e98-26e2ff72ae3b nodeName:}" failed. No retries permitted until 2026-03-14 08:29:46.641631606 +0000 UTC m=+121.890083243 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-metrics-certs") pod "network-metrics-daemon-hq6j4" (UID: "842ea68a-b5ee-4b60-8e98-26e2ff72ae3b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.718729 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.718852 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.718863 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.718879 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.718888 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:42Z","lastTransitionTime":"2026-03-14T08:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.821234 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.821271 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.821281 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.821296 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.821305 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:42Z","lastTransitionTime":"2026-03-14T08:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.923736 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.923771 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.923780 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.923793 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:42 crc kubenswrapper[4886]: I0314 08:29:42.923802 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:42Z","lastTransitionTime":"2026-03-14T08:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.026262 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.026312 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.026324 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.026343 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.026360 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:43Z","lastTransitionTime":"2026-03-14T08:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.128328 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.128388 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.128405 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.128429 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.128447 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:43Z","lastTransitionTime":"2026-03-14T08:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.232542 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.232596 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.232619 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.232647 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.232730 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:43Z","lastTransitionTime":"2026-03-14T08:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.335095 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.335178 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.335196 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.335222 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.335239 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:43Z","lastTransitionTime":"2026-03-14T08:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.420610 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:29:43 crc kubenswrapper[4886]: E0314 08:29:43.421043 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.420764 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:29:43 crc kubenswrapper[4886]: E0314 08:29:43.421424 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.420625 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:43 crc kubenswrapper[4886]: E0314 08:29:43.421540 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.437712 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.437757 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.437769 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.437788 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.437801 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:43Z","lastTransitionTime":"2026-03-14T08:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.540248 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.541270 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.541308 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.541332 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.541349 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:43Z","lastTransitionTime":"2026-03-14T08:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.643400 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.643445 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.643473 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.643489 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.643498 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:43Z","lastTransitionTime":"2026-03-14T08:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.746054 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.746108 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.746145 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.746163 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.746179 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:43Z","lastTransitionTime":"2026-03-14T08:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.848894 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.848949 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.848962 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.848981 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.848994 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:43Z","lastTransitionTime":"2026-03-14T08:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.952074 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.952161 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.952171 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.952185 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:43 crc kubenswrapper[4886]: I0314 08:29:43.952194 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:43Z","lastTransitionTime":"2026-03-14T08:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.054478 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.054786 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.054906 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.055035 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.055197 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:44Z","lastTransitionTime":"2026-03-14T08:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.157155 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.157203 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.157213 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.157229 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.157240 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:44Z","lastTransitionTime":"2026-03-14T08:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.260460 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.260499 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.260508 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.260520 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.260529 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:44Z","lastTransitionTime":"2026-03-14T08:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.362963 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.363052 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.363060 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.363074 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.363082 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:44Z","lastTransitionTime":"2026-03-14T08:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.420332 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:29:44 crc kubenswrapper[4886]: E0314 08:29:44.420876 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.426940 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.464559 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.464587 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.464596 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.464624 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.464634 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:44Z","lastTransitionTime":"2026-03-14T08:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.566970 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.566999 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.567021 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.567036 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.567044 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:44Z","lastTransitionTime":"2026-03-14T08:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.668900 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.668947 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.668960 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.668975 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.668985 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:44Z","lastTransitionTime":"2026-03-14T08:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.770730 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.770763 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.770774 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.770828 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.770839 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:44Z","lastTransitionTime":"2026-03-14T08:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.873083 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.873138 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.873153 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.873167 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.873178 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:44Z","lastTransitionTime":"2026-03-14T08:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.975170 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.975204 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.975214 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.975228 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:44 crc kubenswrapper[4886]: I0314 08:29:44.975240 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:44Z","lastTransitionTime":"2026-03-14T08:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.078701 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.078755 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.078875 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.078897 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.078908 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:45Z","lastTransitionTime":"2026-03-14T08:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.182025 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.182100 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.182636 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.182720 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.183021 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:45Z","lastTransitionTime":"2026-03-14T08:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.286015 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.286096 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.286150 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.286186 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.286211 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:45Z","lastTransitionTime":"2026-03-14T08:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:45 crc kubenswrapper[4886]: E0314 08:29:45.386455 4886 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.420524 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.420536 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.420563 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:45 crc kubenswrapper[4886]: E0314 08:29:45.420743 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:29:45 crc kubenswrapper[4886]: E0314 08:29:45.420991 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:29:45 crc kubenswrapper[4886]: E0314 08:29:45.421236 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.440216 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.465970 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf7a1c85d73654d4f738bf3a153bb99dd30e6a3fc651f22246d22f9d67054cd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.488778 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9273211c-e768-4ec1-b7be-640ef1c15309\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3343e31045d55d6c0d95699ceab0e0f8b7ece28ad7d1939eff43bef31470393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243
b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e89881366a76a4f6a38e496a1a72befde33c6038d07e39140440872840fcf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c665461cd0fe36545b5dd6fe8c5f7200e45459d7d7b4928bf9f56121934340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:45 crc kubenswrapper[4886]: E0314 08:29:45.494110 4886 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.506307 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.523146 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.538415 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.559633 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.575486 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq6j4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq6j4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:45 crc 
kubenswrapper[4886]: I0314 08:29:45.607179 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7df37466e5586325c561414b674337b6090e6eee2ac3d5f25ff706b81c76512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.628727 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d5b299aa9cb642a5c353afe205bf525208ef031049a534fb168134b80e37dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypo
int\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\"
:\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.643183 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bzkzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://317ae74e8ce3a76caaf0a657a02fcbfaab927
eed54822f633b43421e5ee51d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cx6l8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bzkzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.656745 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30290a10-cfb2-4981-b885-384f20bea696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2da7c75a54a62332bd7164f6ea728d36082c6621ae6fb25ef15118c54169d865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83839fa017d4176a126c208931a2b7f1aec20
655ec9e3b3bd8ec0f86681d60a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ftn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.670608 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.685549 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.709973 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38dd161218d4037527292fb96b271441859c1846bfa11e10661e9bef768cdb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38dd161218d4037527292fb96b271441859c1846bfa11e10661e9bef768cdb82\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:29:37Z\\\",\\\"message\\\":\\\"er/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.111574 6854 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.111872 6854 reflector.go:311] Stopping 
reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.112362 6854 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0314 08:29:37.112438 6854 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0314 08:29:37.112497 6854 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0314 08:29:37.112531 6854 factory.go:656] Stopping watch factory\\\\nI0314 08:29:37.112575 6854 handler.go:208] Removed *v1.Node event handler 2\\\\nI0314 08:29:37.116090 6854 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0314 08:29:37.116111 6854 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0314 08:29:37.116186 6854 ovnkube.go:599] Stopped ovnkube\\\\nI0314 08:29:37.116227 6854 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0314 08:29:37.116315 6854 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ms4h7_openshift-ovn-kubernetes(f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c15
9a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:45 crc kubenswrapper[4886]: I0314 08:29:45.723320 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:46 crc kubenswrapper[4886]: I0314 08:29:46.246577 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:46 crc kubenswrapper[4886]: I0314 08:29:46.246651 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:46 crc kubenswrapper[4886]: I0314 08:29:46.246671 4886 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:46 crc kubenswrapper[4886]: I0314 08:29:46.246697 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:46 crc kubenswrapper[4886]: I0314 08:29:46.246715 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:46Z","lastTransitionTime":"2026-03-14T08:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:46 crc kubenswrapper[4886]: E0314 08:29:46.267327 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:46Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:46 crc kubenswrapper[4886]: I0314 08:29:46.271546 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:46 crc kubenswrapper[4886]: I0314 08:29:46.271605 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:46 crc kubenswrapper[4886]: I0314 08:29:46.271630 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:46 crc kubenswrapper[4886]: I0314 08:29:46.271657 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:46 crc kubenswrapper[4886]: I0314 08:29:46.271675 4886 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:46Z","lastTransitionTime":"2026-03-14T08:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:29:46 crc kubenswrapper[4886]: E0314 08:29:46.289589 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:46Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:46 crc kubenswrapper[4886]: I0314 08:29:46.294381 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:46 crc kubenswrapper[4886]: I0314 08:29:46.294471 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:46 crc kubenswrapper[4886]: I0314 08:29:46.294494 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:46 crc kubenswrapper[4886]: I0314 08:29:46.294553 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:46 crc kubenswrapper[4886]: I0314 08:29:46.294588 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:46Z","lastTransitionTime":"2026-03-14T08:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:46 crc kubenswrapper[4886]: E0314 08:29:46.313838 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:46Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:46 crc kubenswrapper[4886]: I0314 08:29:46.319184 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:46 crc kubenswrapper[4886]: I0314 08:29:46.319463 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:46 crc kubenswrapper[4886]: I0314 08:29:46.319675 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:46 crc kubenswrapper[4886]: I0314 08:29:46.319893 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:46 crc kubenswrapper[4886]: I0314 08:29:46.320210 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:46Z","lastTransitionTime":"2026-03-14T08:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:46 crc kubenswrapper[4886]: E0314 08:29:46.341442 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:46Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:46 crc kubenswrapper[4886]: I0314 08:29:46.346880 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:46 crc kubenswrapper[4886]: I0314 08:29:46.346947 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:46 crc kubenswrapper[4886]: I0314 08:29:46.346957 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:46 crc kubenswrapper[4886]: I0314 08:29:46.346973 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:46 crc kubenswrapper[4886]: I0314 08:29:46.346988 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:46Z","lastTransitionTime":"2026-03-14T08:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:46 crc kubenswrapper[4886]: E0314 08:29:46.365020 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:46Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:46 crc kubenswrapper[4886]: E0314 08:29:46.365936 4886 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 08:29:46 crc kubenswrapper[4886]: I0314 08:29:46.420333 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:29:46 crc kubenswrapper[4886]: E0314 08:29:46.420462 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:29:46 crc kubenswrapper[4886]: I0314 08:29:46.685574 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-metrics-certs\") pod \"network-metrics-daemon-hq6j4\" (UID: \"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\") " pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:29:46 crc kubenswrapper[4886]: E0314 08:29:46.685838 4886 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:29:46 crc kubenswrapper[4886]: E0314 08:29:46.685974 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-metrics-certs podName:842ea68a-b5ee-4b60-8e98-26e2ff72ae3b nodeName:}" failed. No retries permitted until 2026-03-14 08:29:54.685957561 +0000 UTC m=+129.934409198 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-metrics-certs") pod "network-metrics-daemon-hq6j4" (UID: "842ea68a-b5ee-4b60-8e98-26e2ff72ae3b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:29:47 crc kubenswrapper[4886]: I0314 08:29:47.420482 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:47 crc kubenswrapper[4886]: E0314 08:29:47.420599 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:29:47 crc kubenswrapper[4886]: I0314 08:29:47.420491 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:29:47 crc kubenswrapper[4886]: E0314 08:29:47.420795 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:29:47 crc kubenswrapper[4886]: I0314 08:29:47.420810 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:29:47 crc kubenswrapper[4886]: E0314 08:29:47.420951 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:29:48 crc kubenswrapper[4886]: I0314 08:29:48.420581 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:29:48 crc kubenswrapper[4886]: E0314 08:29:48.421921 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:29:49 crc kubenswrapper[4886]: I0314 08:29:49.315193 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:29:49 crc kubenswrapper[4886]: E0314 08:29:49.315495 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:30:21.315442762 +0000 UTC m=+156.563894459 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:29:49 crc kubenswrapper[4886]: I0314 08:29:49.315628 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:49 crc kubenswrapper[4886]: I0314 08:29:49.315724 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:49 crc kubenswrapper[4886]: E0314 08:29:49.315904 4886 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:29:49 crc kubenswrapper[4886]: E0314 08:29:49.315904 4886 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:29:49 crc kubenswrapper[4886]: E0314 08:29:49.316014 4886 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 08:30:21.315992737 +0000 UTC m=+156.564444404 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:29:49 crc kubenswrapper[4886]: E0314 08:29:49.316049 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 08:30:21.316033478 +0000 UTC m=+156.564485155 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:29:49 crc kubenswrapper[4886]: I0314 08:29:49.417611 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:29:49 crc kubenswrapper[4886]: E0314 08:29:49.417970 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Mar 14 08:29:49 crc kubenswrapper[4886]: E0314 08:29:49.418024 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:29:49 crc kubenswrapper[4886]: E0314 08:29:49.418048 4886 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:29:49 crc kubenswrapper[4886]: I0314 08:29:49.418116 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:29:49 crc kubenswrapper[4886]: E0314 08:29:49.418181 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-14 08:30:21.418150363 +0000 UTC m=+156.666602040 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:29:49 crc kubenswrapper[4886]: E0314 08:29:49.418263 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:29:49 crc kubenswrapper[4886]: E0314 08:29:49.418289 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:29:49 crc kubenswrapper[4886]: E0314 08:29:49.418302 4886 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:29:49 crc kubenswrapper[4886]: E0314 08:29:49.418355 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 08:30:21.418340848 +0000 UTC m=+156.666792495 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:29:49 crc kubenswrapper[4886]: I0314 08:29:49.423377 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:29:49 crc kubenswrapper[4886]: I0314 08:29:49.423445 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:29:49 crc kubenswrapper[4886]: I0314 08:29:49.423377 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:49 crc kubenswrapper[4886]: E0314 08:29:49.424455 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:29:49 crc kubenswrapper[4886]: E0314 08:29:49.425077 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:29:49 crc kubenswrapper[4886]: E0314 08:29:49.425212 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:29:50 crc kubenswrapper[4886]: I0314 08:29:50.420413 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:29:50 crc kubenswrapper[4886]: E0314 08:29:50.420639 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:29:50 crc kubenswrapper[4886]: E0314 08:29:50.496422 4886 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 08:29:51 crc kubenswrapper[4886]: I0314 08:29:51.419865 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:29:51 crc kubenswrapper[4886]: I0314 08:29:51.419983 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:51 crc kubenswrapper[4886]: I0314 08:29:51.419865 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:29:51 crc kubenswrapper[4886]: E0314 08:29:51.420102 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:29:51 crc kubenswrapper[4886]: E0314 08:29:51.420282 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:29:51 crc kubenswrapper[4886]: E0314 08:29:51.420463 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:29:51 crc kubenswrapper[4886]: I0314 08:29:51.451180 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:29:51 crc kubenswrapper[4886]: I0314 08:29:51.474100 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:51 crc 
kubenswrapper[4886]: I0314 08:29:51.515946 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:51 crc 
kubenswrapper[4886]: I0314 08:29:51.550423 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq6j4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq6j4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:51 crc 
kubenswrapper[4886]: I0314 08:29:51.571361 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:51 crc kubenswrapper[4886]: I0314 08:29:51.587179 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:51 crc kubenswrapper[4886]: I0314 08:29:51.604182 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bzkzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://317ae74e8ce3a76caaf0a657a02fcbfaab927eed54822f633b43421e5ee51d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cx6l8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bzkzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:51 crc kubenswrapper[4886]: I0314 08:29:51.619401 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30290a10-cfb2-4981-b885-384f20bea696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2da7c75a54a62332bd7164f6ea728d36082c6621ae6fb25ef15118c54169d865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83839fa017d4176a126c208931a2b7f1aec20
655ec9e3b3bd8ec0f86681d60a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ftn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:51 crc kubenswrapper[4886]: I0314 08:29:51.633758 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:51 crc kubenswrapper[4886]: I0314 08:29:51.647005 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:51 crc kubenswrapper[4886]: I0314 08:29:51.659628 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7df37466e5586325c561414b674337b6090e6eee2ac3d5f25ff706b81c76512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:29:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:51 crc kubenswrapper[4886]: I0314 08:29:51.677930 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d5b299aa9cb642a5c353afe205bf525208ef031049a534fb168134b80e37dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:51 crc kubenswrapper[4886]: I0314 08:29:51.694618 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:51 crc kubenswrapper[4886]: I0314 08:29:51.716110 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38dd161218d4037527292fb96b271441859c1846bfa11e10661e9bef768cdb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38dd161218d4037527292fb96b271441859c1846bfa11e10661e9bef768cdb82\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:29:37Z\\\",\\\"message\\\":\\\"er/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.111574 6854 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.111872 6854 reflector.go:311] Stopping 
reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.112362 6854 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0314 08:29:37.112438 6854 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0314 08:29:37.112497 6854 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0314 08:29:37.112531 6854 factory.go:656] Stopping watch factory\\\\nI0314 08:29:37.112575 6854 handler.go:208] Removed *v1.Node event handler 2\\\\nI0314 08:29:37.116090 6854 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0314 08:29:37.116111 6854 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0314 08:29:37.116186 6854 ovnkube.go:599] Stopped ovnkube\\\\nI0314 08:29:37.116227 6854 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0314 08:29:37.116315 6854 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ms4h7_openshift-ovn-kubernetes(f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c15
9a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:51 crc kubenswrapper[4886]: I0314 08:29:51.731750 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf7a1c85d73654d4f738bf3a153bb99dd30e6a3fc651f22246d22f9d67054cd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d
1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:51 crc kubenswrapper[4886]: I0314 08:29:51.743050 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9273211c-e768-4ec1-b7be-640ef1c15309\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3343e31045d55d6c0d95699ceab0e0f8b7ece28ad7d1939eff43bef31470393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e89881366a76a4f6a38e496a1a72befde33c6038d07e39140440872840fcf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c665461cd0fe36545b5dd6fe8c5f7200e45459d7d7b4928bf9f56121934340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:51 crc kubenswrapper[4886]: I0314 08:29:51.796963 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:51Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:52 crc kubenswrapper[4886]: I0314 08:29:52.419720 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:29:52 crc kubenswrapper[4886]: E0314 08:29:52.420372 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:29:53 crc kubenswrapper[4886]: I0314 08:29:53.420541 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:29:53 crc kubenswrapper[4886]: I0314 08:29:53.420761 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:53 crc kubenswrapper[4886]: E0314 08:29:53.420836 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:29:53 crc kubenswrapper[4886]: I0314 08:29:53.420859 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:29:53 crc kubenswrapper[4886]: E0314 08:29:53.421088 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:29:53 crc kubenswrapper[4886]: E0314 08:29:53.421741 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:29:53 crc kubenswrapper[4886]: I0314 08:29:53.422162 4886 scope.go:117] "RemoveContainer" containerID="38dd161218d4037527292fb96b271441859c1846bfa11e10661e9bef768cdb82" Mar 14 08:29:54 crc kubenswrapper[4886]: I0314 08:29:54.084577 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ms4h7_f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea/ovnkube-controller/1.log" Mar 14 08:29:54 crc kubenswrapper[4886]: I0314 08:29:54.087493 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" event={"ID":"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea","Type":"ContainerStarted","Data":"d7e0ac395cca983946ba815573aa5ca76113acfde66743a9af6bb220a95bd799"} Mar 14 08:29:54 crc kubenswrapper[4886]: I0314 08:29:54.087962 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:29:54 crc 
kubenswrapper[4886]: I0314 08:29:54.113402 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:54Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:54 crc kubenswrapper[4886]: I0314 08:29:54.141257 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e0ac395cca983946ba815573aa5ca76113acfde66743a9af6bb220a95bd799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38dd161218d4037527292fb96b271441859c1846bfa11e10661e9bef768cdb82\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:29:37Z\\\",\\\"message\\\":\\\"er/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.111574 6854 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.111872 6854 reflector.go:311] Stopping 
reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.112362 6854 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0314 08:29:37.112438 6854 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0314 08:29:37.112497 6854 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0314 08:29:37.112531 6854 factory.go:656] Stopping watch factory\\\\nI0314 08:29:37.112575 6854 handler.go:208] Removed *v1.Node event handler 2\\\\nI0314 08:29:37.116090 6854 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0314 08:29:37.116111 6854 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0314 08:29:37.116186 6854 ovnkube.go:599] Stopped ovnkube\\\\nI0314 08:29:37.116227 6854 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0314 08:29:37.116315 6854 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:54Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:54 crc kubenswrapper[4886]: I0314 08:29:54.157100 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf7a1c85d73654d4f738bf3a153bb99dd30e6a3fc651f22246d22f9d67054cd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d
1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:54Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:54 crc kubenswrapper[4886]: I0314 08:29:54.170285 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9273211c-e768-4ec1-b7be-640ef1c15309\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3343e31045d55d6c0d95699ceab0e0f8b7ece28ad7d1939eff43bef31470393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e89881366a76a4f6a38e496a1a72befde33c6038d07e39140440872840fcf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c665461cd0fe36545b5dd6fe8c5f7200e45459d7d7b4928bf9f56121934340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:54Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:54 crc kubenswrapper[4886]: I0314 08:29:54.183015 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:54Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:54 crc kubenswrapper[4886]: I0314 08:29:54.195880 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-14T08:29:54Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:54 crc kubenswrapper[4886]: I0314 08:29:54.208200 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq6j4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq6j4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:54Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:54 crc 
kubenswrapper[4886]: I0314 08:29:54.230293 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:54Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:54 crc kubenswrapper[4886]: I0314 08:29:54.244734 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:54Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:54 crc kubenswrapper[4886]: I0314 08:29:54.259945 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:54Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:54 crc kubenswrapper[4886]: I0314 08:29:54.273393 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30290a10-cfb2-4981-b885-384f20bea696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2da7c75a54a62332bd7164f6ea728d36082c6621ae6fb25ef15118c54169d865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83839fa017d4176a126c208931a2b7f1aec20655ec9e3b3bd8ec0f86681d60a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ftn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:54Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:54 crc kubenswrapper[4886]: I0314 08:29:54.287653 4886 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:54Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:54 crc kubenswrapper[4886]: I0314 08:29:54.300546 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:54Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:54 crc kubenswrapper[4886]: I0314 08:29:54.312344 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7df37466e5586325c561414b674337b6090e6eee2ac3d5f25ff706b81c76512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:29:54Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:54 crc kubenswrapper[4886]: I0314 08:29:54.329868 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d5b299aa9cb642a5c353afe205bf525208ef031049a534fb168134b80e37dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:54Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:54 crc kubenswrapper[4886]: I0314 08:29:54.342455 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bzkzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://317ae74e8ce3a76caaf0a657a02fcbfaab927eed54822f633b43421e5ee51d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cx6l8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bzkzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:54Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:54 crc kubenswrapper[4886]: I0314 08:29:54.419661 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:29:54 crc kubenswrapper[4886]: E0314 08:29:54.419785 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:29:54 crc kubenswrapper[4886]: I0314 08:29:54.787608 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-metrics-certs\") pod \"network-metrics-daemon-hq6j4\" (UID: \"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\") " pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:29:54 crc kubenswrapper[4886]: E0314 08:29:54.787888 4886 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:29:54 crc kubenswrapper[4886]: E0314 08:29:54.787996 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-metrics-certs podName:842ea68a-b5ee-4b60-8e98-26e2ff72ae3b nodeName:}" failed. No retries permitted until 2026-03-14 08:30:10.787965488 +0000 UTC m=+146.036417165 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-metrics-certs") pod "network-metrics-daemon-hq6j4" (UID: "842ea68a-b5ee-4b60-8e98-26e2ff72ae3b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.094513 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ms4h7_f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea/ovnkube-controller/2.log" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.095804 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ms4h7_f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea/ovnkube-controller/1.log" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.100440 4886 generic.go:334] "Generic (PLEG): container finished" podID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerID="d7e0ac395cca983946ba815573aa5ca76113acfde66743a9af6bb220a95bd799" exitCode=1 Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.100562 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" event={"ID":"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea","Type":"ContainerDied","Data":"d7e0ac395cca983946ba815573aa5ca76113acfde66743a9af6bb220a95bd799"} Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.100672 4886 scope.go:117] "RemoveContainer" containerID="38dd161218d4037527292fb96b271441859c1846bfa11e10661e9bef768cdb82" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.101869 4886 scope.go:117] "RemoveContainer" containerID="d7e0ac395cca983946ba815573aa5ca76113acfde66743a9af6bb220a95bd799" Mar 14 08:29:55 crc kubenswrapper[4886]: E0314 08:29:55.102311 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ms4h7_openshift-ovn-kubernetes(f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.132474 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf7a1c85d73654d4f738bf3a153bb99dd30e6a3fc651f22246d22f9d67054cd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee
1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 
08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89
c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.155472 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9273211c-e768-4ec1-b7be-640ef1c15309\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3343e31045d55d6c0d95699ceab0e0f8b7ece28ad7d1939eff43bef31470393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e89881366a76a4f6a38e496a1a72befde33c6038d07e39140440872840fcf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c665461cd0fe36545b5dd6fe8c5f7200e45459d7d7b4928bf9f56121934340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.178412 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.196680 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.216705 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.234481 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq6j4" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq6j4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc 
kubenswrapper[4886]: I0314 08:29:55.252474 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.265887 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.281788 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bzkzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://317ae74e8ce3a76caaf0a657a02fcbfaab927eed54822f633b43421e5ee51d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cx6l8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bzkzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.298666 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30290a10-cfb2-4981-b885-384f20bea696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2da7c75a54a62332bd7164f6ea728d36082c6621ae6fb25ef15118c54169d865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83839fa017d4176a126c208931a2b7f1aec20
655ec9e3b3bd8ec0f86681d60a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ftn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.315503 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.338556 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.355361 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7df37466e5586325c561414b674337b6090e6eee2ac3d5f25ff706b81c76512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.378872 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d5b299aa9cb642a5c353afe205bf525208ef031049a534fb168134b80e37dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.396059 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.420605 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.420700 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.420756 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:55 crc kubenswrapper[4886]: E0314 08:29:55.421010 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:29:55 crc kubenswrapper[4886]: E0314 08:29:55.421150 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:29:55 crc kubenswrapper[4886]: E0314 08:29:55.421256 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.424371 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e0ac395cca983946ba815573aa5ca76113acfde66743a9af6bb220a95bd799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38dd161218d4037527292fb96b271441859c1846bfa11e10661e9bef768cdb82\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:29:37Z\\\",\\\"message\\\":\\\"er/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.111574 6854 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.111872 6854 reflector.go:311] Stopping 
reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.112362 6854 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0314 08:29:37.112438 6854 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0314 08:29:37.112497 6854 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0314 08:29:37.112531 6854 factory.go:656] Stopping watch factory\\\\nI0314 08:29:37.112575 6854 handler.go:208] Removed *v1.Node event handler 2\\\\nI0314 08:29:37.116090 6854 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0314 08:29:37.116111 6854 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0314 08:29:37.116186 6854 ovnkube.go:599] Stopped ovnkube\\\\nI0314 08:29:37.116227 6854 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0314 08:29:37.116315 6854 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e0ac395cca983946ba815573aa5ca76113acfde66743a9af6bb220a95bd799\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:29:54Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0314 08:29:54.436785 7131 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0314 08:29:54.436825 7131 
address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0314 08:29:54.436857 7131 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0314 08:29:54.436921 7131 factory.go:1336] Added *v1.Node event handler 7\\\\nI0314 08:29:54.436965 7131 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0314 08:29:54.437347 7131 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0314 08:29:54.437442 7131 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0314 08:29:54.437496 7131 ovnkube.go:599] Stopped ovnkube\\\\nI0314 08:29:54.437529 7131 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0314 08:29:54.437616 7131 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\
\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.445432 4886 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.462406 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.479197 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc kubenswrapper[4886]: E0314 08:29:55.497020 4886 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.499807 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.518064 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq6j4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq6j4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc 
kubenswrapper[4886]: I0314 08:29:55.541408 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.560063 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.578905 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7df37466e5586325c561414b674337b6090e6eee2ac3d5f25ff706b81c76512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.602244 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d5b299aa9cb642a5c353afe205bf525208ef031049a534fb168134b80e37dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.617573 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bzkzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://317ae74e8ce3a76caaf0a657a02fcbfaab927eed54822f633b43421e5ee51d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cx6l8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bzkzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.634794 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30290a10-cfb2-4981-b885-384f20bea696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2da7c75a54a62332bd7164f6ea728d36082c6621ae6fb25ef15118c54169d865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83839fa017d4176a126c208931a2b7f1aec20
655ec9e3b3bd8ec0f86681d60a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ftn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.649944 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.682573 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e0ac395cca983946ba815573aa5ca76113acfde66743a9af6bb220a95bd799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38dd161218d4037527292fb96b271441859c1846bfa11e10661e9bef768cdb82\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:29:37Z\\\",\\\"message\\\":\\\"er/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.111574 6854 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.111872 6854 reflector.go:311] Stopping 
reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0314 08:29:37.112362 6854 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0314 08:29:37.112438 6854 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0314 08:29:37.112497 6854 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0314 08:29:37.112531 6854 factory.go:656] Stopping watch factory\\\\nI0314 08:29:37.112575 6854 handler.go:208] Removed *v1.Node event handler 2\\\\nI0314 08:29:37.116090 6854 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0314 08:29:37.116111 6854 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0314 08:29:37.116186 6854 ovnkube.go:599] Stopped ovnkube\\\\nI0314 08:29:37.116227 6854 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0314 08:29:37.116315 6854 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e0ac395cca983946ba815573aa5ca76113acfde66743a9af6bb220a95bd799\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:29:54Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0314 08:29:54.436785 7131 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0314 08:29:54.436825 7131 
address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0314 08:29:54.436857 7131 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0314 08:29:54.436921 7131 factory.go:1336] Added *v1.Node event handler 7\\\\nI0314 08:29:54.436965 7131 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0314 08:29:54.437347 7131 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0314 08:29:54.437442 7131 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0314 08:29:54.437496 7131 ovnkube.go:599] Stopped ovnkube\\\\nI0314 08:29:54.437529 7131 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0314 08:29:54.437616 7131 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\
\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.706481 4886 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf7a1c85d73654d4f738bf3a153bb99dd30e6a3fc651f22246d22f9d67054cd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec
7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fal
se,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.720042 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9273211c-e768-4ec1-b7be-640ef1c15309\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3343e31045d55d6c0d95699ceab0e0f8b7ece28ad7d1939eff43bef31470393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e89881366a76a4f6a38e496a1a72befde33c6038d07e39140440872840fcf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c665461cd0fe36545b5dd6fe8c5f7200e45459d7d7b4928bf9f56121934340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:55 crc kubenswrapper[4886]: I0314 08:29:55.739015 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:55Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.106654 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ms4h7_f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea/ovnkube-controller/2.log" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.109614 4886 scope.go:117] "RemoveContainer" containerID="d7e0ac395cca983946ba815573aa5ca76113acfde66743a9af6bb220a95bd799" Mar 14 08:29:56 crc kubenswrapper[4886]: E0314 08:29:56.109744 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-ms4h7_openshift-ovn-kubernetes(f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.125856 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:56Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.139107 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq6j4" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq6j4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:56Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:56 crc 
kubenswrapper[4886]: I0314 08:29:56.158434 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:56Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.169914 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:56Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.185428 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:56Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.201189 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30290a10-cfb2-4981-b885-384f20bea696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2da7c75a54a62332bd7164f6ea728d36082c6621ae6fb25ef15118c54169d865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83839fa017d4176a126c208931a2b7f1aec20655ec9e3b3bd8ec0f86681d60a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ftn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:56Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.216627 4886 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:56Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.229328 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:56Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.241426 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7df37466e5586325c561414b674337b6090e6eee2ac3d5f25ff706b81c76512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:29:56Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.255009 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d5b299aa9cb642a5c353afe205bf525208ef031049a534fb168134b80e37dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:56Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.264689 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bzkzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://317ae74e8ce3a76caaf0a657a02fcbfaab927eed54822f633b43421e5ee51d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cx6l8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bzkzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:56Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.276788 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:56Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.297894 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e0ac395cca983946ba815573aa5ca76113acfde66743a9af6bb220a95bd799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e0ac395cca983946ba815573aa5ca76113acfde66743a9af6bb220a95bd799\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:29:54Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0314 
08:29:54.436785 7131 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0314 08:29:54.436825 7131 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0314 08:29:54.436857 7131 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0314 08:29:54.436921 7131 factory.go:1336] Added *v1.Node event handler 7\\\\nI0314 08:29:54.436965 7131 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0314 08:29:54.437347 7131 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0314 08:29:54.437442 7131 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0314 08:29:54.437496 7131 ovnkube.go:599] Stopped ovnkube\\\\nI0314 08:29:54.437529 7131 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0314 08:29:54.437616 7131 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ms4h7_openshift-ovn-kubernetes(f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c15
9a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:56Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.314737 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf7a1c85d73654d4f738bf3a153bb99dd30e6a3fc651f22246d22f9d67054cd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d
1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:56Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.333431 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9273211c-e768-4ec1-b7be-640ef1c15309\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3343e31045d55d6c0d95699ceab0e0f8b7ece28ad7d1939eff43bef31470393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e89881366a76a4f6a38e496a1a72befde33c6038d07e39140440872840fcf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c665461cd0fe36545b5dd6fe8c5f7200e45459d7d7b4928bf9f56121934340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:56Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.352714 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:56Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.420341 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:29:56 crc kubenswrapper[4886]: E0314 08:29:56.420540 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.602507 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.603031 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.603165 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.603242 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.603333 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:56Z","lastTransitionTime":"2026-03-14T08:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:56 crc kubenswrapper[4886]: E0314 08:29:56.623392 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:56Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.628074 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.628184 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.628211 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.628243 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.628266 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:56Z","lastTransitionTime":"2026-03-14T08:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:56 crc kubenswrapper[4886]: E0314 08:29:56.650336 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:56Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.655642 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.655714 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.655731 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.655763 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.655783 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:56Z","lastTransitionTime":"2026-03-14T08:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:56 crc kubenswrapper[4886]: E0314 08:29:56.671749 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:56Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.675776 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.675809 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.675819 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.675835 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.675846 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:56Z","lastTransitionTime":"2026-03-14T08:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:56 crc kubenswrapper[4886]: E0314 08:29:56.687046 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:56Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.691808 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.691843 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.691852 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.691871 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:29:56 crc kubenswrapper[4886]: I0314 08:29:56.691883 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:29:56Z","lastTransitionTime":"2026-03-14T08:29:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:29:56 crc kubenswrapper[4886]: E0314 08:29:56.708197 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:29:56Z is after 2025-08-24T17:21:41Z" Mar 14 08:29:56 crc kubenswrapper[4886]: E0314 08:29:56.708308 4886 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 08:29:57 crc kubenswrapper[4886]: I0314 08:29:57.419996 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:57 crc kubenswrapper[4886]: I0314 08:29:57.420094 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:29:57 crc kubenswrapper[4886]: E0314 08:29:57.420269 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:29:57 crc kubenswrapper[4886]: E0314 08:29:57.420570 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:29:57 crc kubenswrapper[4886]: I0314 08:29:57.420294 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:29:57 crc kubenswrapper[4886]: E0314 08:29:57.421287 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:29:58 crc kubenswrapper[4886]: I0314 08:29:58.419996 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:29:58 crc kubenswrapper[4886]: E0314 08:29:58.420213 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:29:59 crc kubenswrapper[4886]: I0314 08:29:59.420001 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:29:59 crc kubenswrapper[4886]: I0314 08:29:59.419992 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:29:59 crc kubenswrapper[4886]: E0314 08:29:59.420274 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:29:59 crc kubenswrapper[4886]: E0314 08:29:59.420492 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:29:59 crc kubenswrapper[4886]: I0314 08:29:59.420820 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:29:59 crc kubenswrapper[4886]: E0314 08:29:59.420981 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:00 crc kubenswrapper[4886]: I0314 08:30:00.419910 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:00 crc kubenswrapper[4886]: E0314 08:30:00.420483 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:00 crc kubenswrapper[4886]: I0314 08:30:00.435549 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 14 08:30:00 crc kubenswrapper[4886]: E0314 08:30:00.498740 4886 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 08:30:01 crc kubenswrapper[4886]: I0314 08:30:01.420814 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:01 crc kubenswrapper[4886]: I0314 08:30:01.420906 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:01 crc kubenswrapper[4886]: I0314 08:30:01.420825 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:01 crc kubenswrapper[4886]: E0314 08:30:01.421087 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:01 crc kubenswrapper[4886]: E0314 08:30:01.421365 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:01 crc kubenswrapper[4886]: E0314 08:30:01.421591 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:02 crc kubenswrapper[4886]: I0314 08:30:02.420763 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:02 crc kubenswrapper[4886]: E0314 08:30:02.421037 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:03 crc kubenswrapper[4886]: I0314 08:30:03.419796 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:03 crc kubenswrapper[4886]: E0314 08:30:03.419951 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:03 crc kubenswrapper[4886]: I0314 08:30:03.419799 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:03 crc kubenswrapper[4886]: I0314 08:30:03.420005 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:03 crc kubenswrapper[4886]: E0314 08:30:03.420278 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:03 crc kubenswrapper[4886]: E0314 08:30:03.420309 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:04 crc kubenswrapper[4886]: I0314 08:30:04.420394 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:04 crc kubenswrapper[4886]: E0314 08:30:04.420591 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:05 crc kubenswrapper[4886]: I0314 08:30:05.420061 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:05 crc kubenswrapper[4886]: E0314 08:30:05.420291 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:05 crc kubenswrapper[4886]: I0314 08:30:05.420464 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:05 crc kubenswrapper[4886]: I0314 08:30:05.420520 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:05 crc kubenswrapper[4886]: E0314 08:30:05.420656 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:05 crc kubenswrapper[4886]: E0314 08:30:05.420780 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:05 crc kubenswrapper[4886]: I0314 08:30:05.447157 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf7a1c85d73654d4f738bf3a153bb99dd30e6a3fc651f22246d22f9d67054cd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/opensh
ift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:05Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:05 crc kubenswrapper[4886]: I0314 08:30:05.466203 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9273211c-e768-4ec1-b7be-640ef1c15309\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3343e31045d55d6c0d95699ceab0e0f8b7ece28ad7d1939eff43bef31470393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e89881366a76a4f6a38e496a1a72befde33c6038d07e39140440872840fcf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c665461cd0fe36545b5dd6fe8c5f7200e45459d7d7b4928bf9f56121934340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:05Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:05 crc kubenswrapper[4886]: I0314 08:30:05.492218 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:05Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:05 crc kubenswrapper[4886]: E0314 08:30:05.505174 4886 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 08:30:05 crc kubenswrapper[4886]: I0314 08:30:05.518850 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:05Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:05 crc kubenswrapper[4886]: I0314 08:30:05.542267 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq6j4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq6j4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:05Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:05 crc 
kubenswrapper[4886]: I0314 08:30:05.557589 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a534d3e7-7d84-4bd6-90b5-58ae29c666cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef593fe562e593507911df078ec0ac9712485cacadd01fb9025daf79980bccf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c0cc373bc6498d0116c78eb0a9a7a08a6eca9c7d74f84e596595b9e40e1cde9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0314 08:27:47.400770 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0314 08:27:47.402713 1 observer_polling.go:159] Starting file observer\\\\nI0314 08:27:47.428430 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0314 08:27:47.434769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0314 08:28:13.630244 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0314 08:28:13.630321 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43cd795006e2735c82e25d7f0b8ccfae0749e5d80cc96068969329b677904578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f455dac3583e73c1949aacca37b8249ba31fb4c339814ee3f47df6d60118c749\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a02f405b2541571b752e1dbe686e3aa7ff79d3e36211cc9a9876a92813737f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:05Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:05 crc kubenswrapper[4886]: I0314 08:30:05.571777 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:05Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:05 crc kubenswrapper[4886]: I0314 08:30:05.583021 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:05Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:05 crc kubenswrapper[4886]: I0314 08:30:05.593739 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:05Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:05 crc kubenswrapper[4886]: I0314 08:30:05.606555 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30290a10-cfb2-4981-b885-384f20bea696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2da7c75a54a62332bd7164f6ea728d36082c6621ae6fb25ef15118c54169d865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83839fa017d4176a126c208931a2b7f1aec20655ec9e3b3bd8ec0f86681d60a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ftn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:05Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:05 crc kubenswrapper[4886]: I0314 08:30:05.618450 4886 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:05Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:05 crc kubenswrapper[4886]: I0314 08:30:05.633590 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:05Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:05 crc kubenswrapper[4886]: I0314 08:30:05.645278 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7df37466e5586325c561414b674337b6090e6eee2ac3d5f25ff706b81c76512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:30:05Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:05 crc kubenswrapper[4886]: I0314 08:30:05.662516 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d5b299aa9cb642a5c353afe205bf525208ef031049a534fb168134b80e37dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:05Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:05 crc kubenswrapper[4886]: I0314 08:30:05.672502 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bzkzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://317ae74e8ce3a76caaf0a657a02fcbfaab927eed54822f633b43421e5ee51d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cx6l8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bzkzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:05Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:05 crc kubenswrapper[4886]: I0314 08:30:05.683576 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:05Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:05 crc kubenswrapper[4886]: I0314 08:30:05.699897 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e0ac395cca983946ba815573aa5ca76113acfde66743a9af6bb220a95bd799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e0ac395cca983946ba815573aa5ca76113acfde66743a9af6bb220a95bd799\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:29:54Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0314 
08:29:54.436785 7131 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0314 08:29:54.436825 7131 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0314 08:29:54.436857 7131 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0314 08:29:54.436921 7131 factory.go:1336] Added *v1.Node event handler 7\\\\nI0314 08:29:54.436965 7131 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0314 08:29:54.437347 7131 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0314 08:29:54.437442 7131 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0314 08:29:54.437496 7131 ovnkube.go:599] Stopped ovnkube\\\\nI0314 08:29:54.437529 7131 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0314 08:29:54.437616 7131 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ms4h7_openshift-ovn-kubernetes(f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c15
9a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:05Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:06 crc kubenswrapper[4886]: I0314 08:30:06.419914 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:06 crc kubenswrapper[4886]: E0314 08:30:06.420435 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:06 crc kubenswrapper[4886]: I0314 08:30:06.741375 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:30:06 crc kubenswrapper[4886]: I0314 08:30:06.741876 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:30:06 crc kubenswrapper[4886]: I0314 08:30:06.741900 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:30:06 crc kubenswrapper[4886]: I0314 08:30:06.741930 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:30:06 crc kubenswrapper[4886]: I0314 08:30:06.741953 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:30:06Z","lastTransitionTime":"2026-03-14T08:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:30:06 crc kubenswrapper[4886]: E0314 08:30:06.762774 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:06Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:06 crc kubenswrapper[4886]: I0314 08:30:06.769831 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:30:06 crc kubenswrapper[4886]: I0314 08:30:06.769883 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:30:06 crc kubenswrapper[4886]: I0314 08:30:06.769900 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:30:06 crc kubenswrapper[4886]: I0314 08:30:06.769923 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:30:06 crc kubenswrapper[4886]: I0314 08:30:06.769941 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:30:06Z","lastTransitionTime":"2026-03-14T08:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:30:06 crc kubenswrapper[4886]: E0314 08:30:06.792442 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:06Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:06 crc kubenswrapper[4886]: I0314 08:30:06.799269 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:30:06 crc kubenswrapper[4886]: I0314 08:30:06.799331 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:30:06 crc kubenswrapper[4886]: I0314 08:30:06.799351 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:30:06 crc kubenswrapper[4886]: I0314 08:30:06.799375 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:30:06 crc kubenswrapper[4886]: I0314 08:30:06.799392 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:30:06Z","lastTransitionTime":"2026-03-14T08:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:30:06 crc kubenswrapper[4886]: E0314 08:30:06.822925 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:06Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:06 crc kubenswrapper[4886]: I0314 08:30:06.829971 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:30:06 crc kubenswrapper[4886]: I0314 08:30:06.830168 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:30:06 crc kubenswrapper[4886]: I0314 08:30:06.830202 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:30:06 crc kubenswrapper[4886]: I0314 08:30:06.830272 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:30:06 crc kubenswrapper[4886]: I0314 08:30:06.830303 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:30:06Z","lastTransitionTime":"2026-03-14T08:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:30:06 crc kubenswrapper[4886]: E0314 08:30:06.852841 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:06Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:06 crc kubenswrapper[4886]: I0314 08:30:06.857665 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:30:06 crc kubenswrapper[4886]: I0314 08:30:06.857747 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:30:06 crc kubenswrapper[4886]: I0314 08:30:06.857770 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:30:06 crc kubenswrapper[4886]: I0314 08:30:06.857795 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:30:06 crc kubenswrapper[4886]: I0314 08:30:06.857813 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:30:06Z","lastTransitionTime":"2026-03-14T08:30:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:30:06 crc kubenswrapper[4886]: E0314 08:30:06.877248 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:06Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:06 crc kubenswrapper[4886]: E0314 08:30:06.877466 4886 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 08:30:07 crc kubenswrapper[4886]: I0314 08:30:07.420348 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:07 crc kubenswrapper[4886]: I0314 08:30:07.420470 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:07 crc kubenswrapper[4886]: E0314 08:30:07.420538 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:07 crc kubenswrapper[4886]: I0314 08:30:07.420641 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:07 crc kubenswrapper[4886]: E0314 08:30:07.420667 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:07 crc kubenswrapper[4886]: E0314 08:30:07.420832 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:08 crc kubenswrapper[4886]: I0314 08:30:08.420157 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:08 crc kubenswrapper[4886]: E0314 08:30:08.420730 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:09 crc kubenswrapper[4886]: I0314 08:30:09.420171 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:09 crc kubenswrapper[4886]: I0314 08:30:09.420229 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:09 crc kubenswrapper[4886]: E0314 08:30:09.420309 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:09 crc kubenswrapper[4886]: E0314 08:30:09.420384 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:09 crc kubenswrapper[4886]: I0314 08:30:09.420193 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:09 crc kubenswrapper[4886]: E0314 08:30:09.420535 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:09 crc kubenswrapper[4886]: I0314 08:30:09.421183 4886 scope.go:117] "RemoveContainer" containerID="d7e0ac395cca983946ba815573aa5ca76113acfde66743a9af6bb220a95bd799" Mar 14 08:30:09 crc kubenswrapper[4886]: E0314 08:30:09.421675 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-ms4h7_openshift-ovn-kubernetes(f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" Mar 14 08:30:10 crc kubenswrapper[4886]: I0314 08:30:10.420859 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:10 crc kubenswrapper[4886]: E0314 08:30:10.421172 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:10 crc kubenswrapper[4886]: E0314 08:30:10.507728 4886 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 14 08:30:10 crc kubenswrapper[4886]: I0314 08:30:10.886195 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-metrics-certs\") pod \"network-metrics-daemon-hq6j4\" (UID: \"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\") " pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:10 crc kubenswrapper[4886]: E0314 08:30:10.886420 4886 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:30:10 crc kubenswrapper[4886]: E0314 08:30:10.886605 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-metrics-certs podName:842ea68a-b5ee-4b60-8e98-26e2ff72ae3b nodeName:}" failed. No retries permitted until 2026-03-14 08:30:42.88656781 +0000 UTC m=+178.135019517 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-metrics-certs") pod "network-metrics-daemon-hq6j4" (UID: "842ea68a-b5ee-4b60-8e98-26e2ff72ae3b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:30:11 crc kubenswrapper[4886]: I0314 08:30:11.420952 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:11 crc kubenswrapper[4886]: I0314 08:30:11.421020 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:11 crc kubenswrapper[4886]: E0314 08:30:11.421205 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:11 crc kubenswrapper[4886]: E0314 08:30:11.421499 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:11 crc kubenswrapper[4886]: I0314 08:30:11.422316 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:11 crc kubenswrapper[4886]: E0314 08:30:11.422641 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:12 crc kubenswrapper[4886]: I0314 08:30:12.420084 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:12 crc kubenswrapper[4886]: E0314 08:30:12.420361 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:13 crc kubenswrapper[4886]: I0314 08:30:13.186028 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5jrmb_7ed47238-6d20-4920-9162-695e6ddcb090/kube-multus/0.log" Mar 14 08:30:13 crc kubenswrapper[4886]: I0314 08:30:13.186096 4886 generic.go:334] "Generic (PLEG): container finished" podID="7ed47238-6d20-4920-9162-695e6ddcb090" containerID="d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306" exitCode=1 Mar 14 08:30:13 crc kubenswrapper[4886]: I0314 08:30:13.186172 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5jrmb" event={"ID":"7ed47238-6d20-4920-9162-695e6ddcb090","Type":"ContainerDied","Data":"d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306"} Mar 14 08:30:13 crc kubenswrapper[4886]: I0314 08:30:13.186858 4886 scope.go:117] "RemoveContainer" containerID="d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306" Mar 14 08:30:13 crc kubenswrapper[4886]: I0314 08:30:13.210731 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:13Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:13 crc kubenswrapper[4886]: I0314 08:30:13.227536 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:13Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:13 crc kubenswrapper[4886]: I0314 08:30:13.245351 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:13Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:13 crc kubenswrapper[4886]: I0314 08:30:13.267114 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:30:12Z\\\",\\\"message\\\":\\\"2026-03-14T08:29:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_85f780a2-fec1-41a7-ab40-9322f99ab1f6\\\\n2026-03-14T08:29:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_85f780a2-fec1-41a7-ab40-9322f99ab1f6 to /host/opt/cni/bin/\\\\n2026-03-14T08:29:27Z [verbose] multus-daemon started\\\\n2026-03-14T08:29:27Z [verbose] Readiness Indicator file check\\\\n2026-03-14T08:30:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:13Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:13 crc kubenswrapper[4886]: I0314 08:30:13.287361 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq6j4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq6j4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:13Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:13 crc 
kubenswrapper[4886]: I0314 08:30:13.309240 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a534d3e7-7d84-4bd6-90b5-58ae29c666cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef593fe562e593507911df078ec0ac9712485cacadd01fb9025daf79980bccf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c0cc373bc6498d0116c78eb0a9a7a08a6eca9c7d74f84e596595b9e40e1cde9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0314 08:27:47.400770 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0314 08:27:47.402713 1 observer_polling.go:159] Starting file observer\\\\nI0314 08:27:47.428430 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0314 08:27:47.434769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0314 08:28:13.630244 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0314 08:28:13.630321 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43cd795006e2735c82e25d7f0b8ccfae0749e5d80cc96068969329b677904578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f455dac3583e73c1949aacca37b8249ba31fb4c339814ee3f47df6d60118c749\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a02f405b2541571b752e1dbe686e3aa7ff79d3e36211cc9a9876a92813737f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:13Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:13 crc kubenswrapper[4886]: I0314 08:30:13.329954 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7df37466e5586325c561414b674337b6090e6eee2ac3d5f25ff706b81c76512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:30:13Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:13 crc kubenswrapper[4886]: I0314 08:30:13.358281 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d5b299aa9cb642a5c353afe205bf525208ef031049a534fb168134b80e37dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:13Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:13 crc kubenswrapper[4886]: I0314 08:30:13.374816 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bzkzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://317ae74e8ce3a76caaf0a657a02fcbfaab927eed54822f633b43421e5ee51d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cx6l8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bzkzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:13Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:13 crc kubenswrapper[4886]: I0314 08:30:13.402736 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30290a10-cfb2-4981-b885-384f20bea696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2da7c75a54a62332bd7164f6ea728d36082c6621ae6fb25ef15118c54169d865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83839fa017d4176a126c208931a2b7f1aec20
655ec9e3b3bd8ec0f86681d60a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ftn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:13Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:13 crc kubenswrapper[4886]: I0314 08:30:13.420618 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:13 crc kubenswrapper[4886]: I0314 08:30:13.420638 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:13 crc kubenswrapper[4886]: I0314 08:30:13.420908 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:13 crc kubenswrapper[4886]: E0314 08:30:13.420796 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:13 crc kubenswrapper[4886]: E0314 08:30:13.421037 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:13 crc kubenswrapper[4886]: E0314 08:30:13.421102 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:13 crc kubenswrapper[4886]: I0314 08:30:13.424706 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":
\\\"cri-o://7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:13Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:13 crc kubenswrapper[4886]: I0314 08:30:13.444868 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:13Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:13 crc kubenswrapper[4886]: I0314 08:30:13.473093 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e0ac395cca983946ba815573aa5ca76113acfde66743a9af6bb220a95bd799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e0ac395cca983946ba815573aa5ca76113acfde66743a9af6bb220a95bd799\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:29:54Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert 
Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0314 08:29:54.436785 7131 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0314 08:29:54.436825 7131 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0314 08:29:54.436857 7131 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0314 08:29:54.436921 7131 factory.go:1336] Added *v1.Node event handler 7\\\\nI0314 08:29:54.436965 7131 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0314 08:29:54.437347 7131 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0314 08:29:54.437442 7131 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0314 08:29:54.437496 7131 ovnkube.go:599] Stopped ovnkube\\\\nI0314 08:29:54.437529 7131 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0314 08:29:54.437616 7131 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ms4h7_openshift-ovn-kubernetes(f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c15
9a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:13Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:13 crc kubenswrapper[4886]: I0314 08:30:13.493852 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:13Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:13 crc kubenswrapper[4886]: I0314 08:30:13.542872 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:13Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:13 crc kubenswrapper[4886]: I0314 08:30:13.564761 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf7a1c85d73654d4f738bf3a153bb99dd30e6a3fc651f22246d22f9d67054cd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-c
rc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:13Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:13 crc kubenswrapper[4886]: I0314 08:30:13.581274 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9273211c-e768-4ec1-b7be-640ef1c15309\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3343e31045d55d6c0d95699ceab0e0f8b7ece28ad7d1939eff43bef31470393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e89881366a76a4f6a38e496a1a72befde33c6038d07e39140440872840fcf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c665461cd0fe36545b5dd6fe8c5f7200e45459d7d7b4928bf9f56121934340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:13Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:14 crc kubenswrapper[4886]: I0314 08:30:14.196311 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5jrmb_7ed47238-6d20-4920-9162-695e6ddcb090/kube-multus/0.log" Mar 14 08:30:14 crc kubenswrapper[4886]: I0314 08:30:14.197371 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5jrmb" event={"ID":"7ed47238-6d20-4920-9162-695e6ddcb090","Type":"ContainerStarted","Data":"ec2e46fcb866a7dbf349bb2a83ad1fb6b6e85059612be44d8db4b0134c6f0143"} Mar 14 08:30:14 crc kubenswrapper[4886]: I0314 08:30:14.213486 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:14Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:14 crc kubenswrapper[4886]: I0314 08:30:14.246100 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e0ac395cca983946ba815573aa5ca76113acfde66743a9af6bb220a95bd799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e0ac395cca983946ba815573aa5ca76113acfde66743a9af6bb220a95bd799\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:29:54Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0314 
08:29:54.436785 7131 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0314 08:29:54.436825 7131 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0314 08:29:54.436857 7131 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0314 08:29:54.436921 7131 factory.go:1336] Added *v1.Node event handler 7\\\\nI0314 08:29:54.436965 7131 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0314 08:29:54.437347 7131 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0314 08:29:54.437442 7131 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0314 08:29:54.437496 7131 ovnkube.go:599] Stopped ovnkube\\\\nI0314 08:29:54.437529 7131 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0314 08:29:54.437616 7131 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ms4h7_openshift-ovn-kubernetes(f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c15
9a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:14Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:14 crc kubenswrapper[4886]: I0314 08:30:14.263333 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9273211c-e768-4ec1-b7be-640ef1c15309\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3343e31045d55d6c0d95699ceab0e0f8b7ece28ad7d1939eff43bef31470393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e89881366a76a4f6a38e496a1a72befde33c6038d07e39140440872840fcf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c665461cd0fe36545b5dd6fe8c5f7200e45459d7d7b4928bf9f56121934340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:14Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:14 crc kubenswrapper[4886]: I0314 08:30:14.277439 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:14Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:14 crc kubenswrapper[4886]: I0314 08:30:14.297692 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf7a1c85d73654d4f738bf3a153bb99dd30e6a3fc651f22246d22f9d67054cd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-c
rc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:14Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:14 crc kubenswrapper[4886]: I0314 08:30:14.317435 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a534d3e7-7d84-4bd6-90b5-58ae29c666cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef593fe562e593507911df078ec0ac9712485cacadd01fb9025daf79980bccf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c0cc373bc6498d0116c78eb0a9a7a08a6eca9c7d74f84e596595b9e40e1cde9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0314 08:27:47.400770 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0314 08:27:47.402713 1 observer_polling.go:159] Starting file observer\\\\nI0314 08:27:47.428430 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0314 08:27:47.434769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0314 08:28:13.630244 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0314 08:28:13.630321 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43cd795006e2735c82e25d7f0b8ccfae0749e5d80cc96068969329b677904578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f455dac3583e73c1949aacca37b8249ba31fb4c339814ee3f47df6d60118c749\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a02f405b2541571b752e1dbe686e3aa7ff79d3e36211cc9a9876a92813737f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:14Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:14 crc kubenswrapper[4886]: I0314 08:30:14.336008 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:14Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:14 crc kubenswrapper[4886]: I0314 08:30:14.348912 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:14Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:14 crc kubenswrapper[4886]: I0314 08:30:14.362888 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:14Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:14 crc kubenswrapper[4886]: I0314 08:30:14.375976 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec2e46fcb866a7dbf349bb2a83ad1fb6b6e85059612be44d8db4b0134c6f0143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:30:12Z\\\",\\\"message\\\":\\\"2026-03-14T08:29:27+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_85f780a2-fec1-41a7-ab40-9322f99ab1f6\\\\n2026-03-14T08:29:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_85f780a2-fec1-41a7-ab40-9322f99ab1f6 to /host/opt/cni/bin/\\\\n2026-03-14T08:29:27Z [verbose] multus-daemon started\\\\n2026-03-14T08:29:27Z [verbose] Readiness Indicator file check\\\\n2026-03-14T08:30:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:30:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:14Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:14 crc kubenswrapper[4886]: I0314 08:30:14.391422 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq6j4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq6j4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:14Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:14 crc 
kubenswrapper[4886]: I0314 08:30:14.409872 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:14Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:14 crc kubenswrapper[4886]: I0314 08:30:14.419714 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:14 crc kubenswrapper[4886]: E0314 08:30:14.419921 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:14 crc kubenswrapper[4886]: I0314 08:30:14.427883 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7df37466e5586325c561414b674337b6090e6eee2ac3d5f25ff706b81c76512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:14Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:14 crc kubenswrapper[4886]: I0314 08:30:14.449451 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d5b299aa9cb642a5c353afe205bf525208ef031049a534fb168134b80e37dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085
a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\
\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"image\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\
\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:14Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:14 crc kubenswrapper[4886]: I0314 08:30:14.464763 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bzkzj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://317ae74e8ce3a76caaf0a657a02fcbfaab927eed54822f633b43421e5ee51d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cx6l8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bzkzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:14Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:14 crc kubenswrapper[4886]: I0314 08:30:14.476376 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30290a10-cfb2-4981-b885-384f20bea696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2da7c75a54a62332bd7164f6ea728d36082c6621ae6fb25ef15118c54169d865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83839fa017d4176a126c208931a2b7f1aec20655ec9e3b3bd8ec0f86681d60a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ftn4q\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:14Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:14 crc kubenswrapper[4886]: I0314 08:30:14.497434 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:14Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:15 crc kubenswrapper[4886]: I0314 08:30:15.420828 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:15 crc kubenswrapper[4886]: I0314 08:30:15.420899 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:15 crc kubenswrapper[4886]: E0314 08:30:15.421007 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:15 crc kubenswrapper[4886]: I0314 08:30:15.420848 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:15 crc kubenswrapper[4886]: E0314 08:30:15.421159 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:15 crc kubenswrapper[4886]: E0314 08:30:15.421273 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:15 crc kubenswrapper[4886]: I0314 08:30:15.439885 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a534d3e7-7d84-4bd6-90b5-58ae29c666cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef593fe562e593507911df078ec0ac9712485cacadd01fb9025daf79980bccf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c0cc373bc6498d0116c78eb0a9a7a08a6eca9c7d74f84e596595b9e40e1cde9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 
10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0314 08:27:47.400770 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0314 08:27:47.402713 1 observer_polling.go:159] Starting file observer\\\\nI0314 08:27:47.428430 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0314 08:27:47.434769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0314 08:28:13.630244 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0314 08:28:13.630321 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43cd795006e2735c82e25d7f0b8ccfae0749e5d80cc96068969329b677904578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f455dac3583e73c1949aacca37b8249ba31fb4c339814ee3f47df6d60118c749\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a02f405b2541571b752e1dbe686e3aa7ff79d3e36211cc9a9876a92813737f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:15Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:15 crc kubenswrapper[4886]: I0314 08:30:15.459444 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:15Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:15 crc kubenswrapper[4886]: I0314 08:30:15.475082 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:15Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:15 crc kubenswrapper[4886]: I0314 08:30:15.490155 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:15Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:15 crc kubenswrapper[4886]: E0314 08:30:15.508996 4886 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 08:30:15 crc kubenswrapper[4886]: I0314 08:30:15.514767 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec2e46fcb866a7dbf349bb2a83ad1fb6b6e85059612be44d8db4b0134c6f0143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"last
State\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:30:12Z\\\",\\\"message\\\":\\\"2026-03-14T08:29:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_85f780a2-fec1-41a7-ab40-9322f99ab1f6\\\\n2026-03-14T08:29:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_85f780a2-fec1-41a7-ab40-9322f99ab1f6 to /host/opt/cni/bin/\\\\n2026-03-14T08:29:27Z [verbose] multus-daemon started\\\\n2026-03-14T08:29:27Z [verbose] Readiness Indicator file check\\\\n2026-03-14T08:30:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:30:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\"
:\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:15Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:15 crc kubenswrapper[4886]: I0314 08:30:15.535943 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq6j4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq6j4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:15Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:15 crc 
kubenswrapper[4886]: I0314 08:30:15.552703 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:15Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:15 crc kubenswrapper[4886]: I0314 08:30:15.570842 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:15Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:15 crc kubenswrapper[4886]: I0314 08:30:15.584758 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7df37466e5586325c561414b674337b6090e6eee2ac3d5f25ff706b81c76512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:30:15Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:15 crc kubenswrapper[4886]: I0314 08:30:15.601818 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d5b299aa9cb642a5c353afe205bf525208ef031049a534fb168134b80e37dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:15Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:15 crc kubenswrapper[4886]: I0314 08:30:15.617061 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bzkzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://317ae74e8ce3a76caaf0a657a02fcbfaab927eed54822f633b43421e5ee51d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cx6l8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bzkzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:15Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:15 crc kubenswrapper[4886]: I0314 08:30:15.631626 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30290a10-cfb2-4981-b885-384f20bea696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2da7c75a54a62332bd7164f6ea728d36082c6621ae6fb25ef15118c54169d865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83839fa017d4176a126c208931a2b7f1aec20
655ec9e3b3bd8ec0f86681d60a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ftn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:15Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:15 crc kubenswrapper[4886]: I0314 08:30:15.648750 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:15Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:15 crc kubenswrapper[4886]: I0314 08:30:15.681586 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7e0ac395cca983946ba815573aa5ca76113acfde66743a9af6bb220a95bd799\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e0ac395cca983946ba815573aa5ca76113acfde66743a9af6bb220a95bd799\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:29:54Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0314 
08:29:54.436785 7131 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0314 08:29:54.436825 7131 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0314 08:29:54.436857 7131 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0314 08:29:54.436921 7131 factory.go:1336] Added *v1.Node event handler 7\\\\nI0314 08:29:54.436965 7131 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0314 08:29:54.437347 7131 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0314 08:29:54.437442 7131 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0314 08:29:54.437496 7131 ovnkube.go:599] Stopped ovnkube\\\\nI0314 08:29:54.437529 7131 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0314 08:29:54.437616 7131 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ms4h7_openshift-ovn-kubernetes(f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c15
9a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:15Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:15 crc kubenswrapper[4886]: I0314 08:30:15.704806 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf7a1c85d73654d4f738bf3a153bb99dd30e6a3fc651f22246d22f9d67054cd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d
1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:15Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:15 crc kubenswrapper[4886]: I0314 08:30:15.720087 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9273211c-e768-4ec1-b7be-640ef1c15309\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3343e31045d55d6c0d95699ceab0e0f8b7ece28ad7d1939eff43bef31470393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e89881366a76a4f6a38e496a1a72befde33c6038d07e39140440872840fcf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c665461cd0fe36545b5dd6fe8c5f7200e45459d7d7b4928bf9f56121934340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:15Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:15 crc kubenswrapper[4886]: I0314 08:30:15.742805 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:15Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:16 crc kubenswrapper[4886]: I0314 08:30:16.420960 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:16 crc kubenswrapper[4886]: E0314 08:30:16.421947 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:17 crc kubenswrapper[4886]: I0314 08:30:17.122348 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:30:17 crc kubenswrapper[4886]: I0314 08:30:17.122392 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:30:17 crc kubenswrapper[4886]: I0314 08:30:17.122403 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:30:17 crc kubenswrapper[4886]: I0314 08:30:17.122419 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:30:17 crc kubenswrapper[4886]: I0314 08:30:17.122429 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:30:17Z","lastTransitionTime":"2026-03-14T08:30:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:30:17 crc kubenswrapper[4886]: E0314 08:30:17.142204 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:17Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:17 crc kubenswrapper[4886]: I0314 08:30:17.146898 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:30:17 crc kubenswrapper[4886]: I0314 08:30:17.146954 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:30:17 crc kubenswrapper[4886]: I0314 08:30:17.146972 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:30:17 crc kubenswrapper[4886]: I0314 08:30:17.146998 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:30:17 crc kubenswrapper[4886]: I0314 08:30:17.147016 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:30:17Z","lastTransitionTime":"2026-03-14T08:30:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:30:17 crc kubenswrapper[4886]: E0314 08:30:17.167596 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:17Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:17 crc kubenswrapper[4886]: I0314 08:30:17.172708 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:30:17 crc kubenswrapper[4886]: I0314 08:30:17.172766 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:30:17 crc kubenswrapper[4886]: I0314 08:30:17.172785 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:30:17 crc kubenswrapper[4886]: I0314 08:30:17.172808 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:30:17 crc kubenswrapper[4886]: I0314 08:30:17.172824 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:30:17Z","lastTransitionTime":"2026-03-14T08:30:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:30:17 crc kubenswrapper[4886]: E0314 08:30:17.192853 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:17Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:17 crc kubenswrapper[4886]: I0314 08:30:17.197722 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:30:17 crc kubenswrapper[4886]: I0314 08:30:17.197941 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:30:17 crc kubenswrapper[4886]: I0314 08:30:17.198147 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:30:17 crc kubenswrapper[4886]: I0314 08:30:17.198357 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:30:17 crc kubenswrapper[4886]: I0314 08:30:17.198493 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:30:17Z","lastTransitionTime":"2026-03-14T08:30:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:30:17 crc kubenswrapper[4886]: E0314 08:30:17.219289 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:17Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:17 crc kubenswrapper[4886]: I0314 08:30:17.224738 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:30:17 crc kubenswrapper[4886]: I0314 08:30:17.225004 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:30:17 crc kubenswrapper[4886]: I0314 08:30:17.225179 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:30:17 crc kubenswrapper[4886]: I0314 08:30:17.225336 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:30:17 crc kubenswrapper[4886]: I0314 08:30:17.225473 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:30:17Z","lastTransitionTime":"2026-03-14T08:30:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:30:17 crc kubenswrapper[4886]: E0314 08:30:17.245734 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:17Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:17 crc kubenswrapper[4886]: E0314 08:30:17.246375 4886 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 08:30:17 crc kubenswrapper[4886]: I0314 08:30:17.419645 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:17 crc kubenswrapper[4886]: I0314 08:30:17.419775 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:17 crc kubenswrapper[4886]: E0314 08:30:17.419986 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:17 crc kubenswrapper[4886]: I0314 08:30:17.420221 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:17 crc kubenswrapper[4886]: E0314 08:30:17.420514 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:17 crc kubenswrapper[4886]: E0314 08:30:17.420667 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:18 crc kubenswrapper[4886]: I0314 08:30:18.420682 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:18 crc kubenswrapper[4886]: E0314 08:30:18.420864 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:19 crc kubenswrapper[4886]: I0314 08:30:19.463173 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:19 crc kubenswrapper[4886]: E0314 08:30:19.463374 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:19 crc kubenswrapper[4886]: I0314 08:30:19.464055 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:19 crc kubenswrapper[4886]: E0314 08:30:19.464275 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:19 crc kubenswrapper[4886]: I0314 08:30:19.464033 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:19 crc kubenswrapper[4886]: E0314 08:30:19.464468 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:20 crc kubenswrapper[4886]: I0314 08:30:20.420410 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:20 crc kubenswrapper[4886]: E0314 08:30:20.420680 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:20 crc kubenswrapper[4886]: E0314 08:30:20.510424 4886 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 08:30:21 crc kubenswrapper[4886]: I0314 08:30:21.381007 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:30:21 crc kubenswrapper[4886]: I0314 08:30:21.381225 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:21 crc kubenswrapper[4886]: E0314 08:30:21.381247 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-14 08:31:25.381218384 +0000 UTC m=+220.629670021 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:30:21 crc kubenswrapper[4886]: I0314 08:30:21.381296 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:21 crc kubenswrapper[4886]: E0314 08:30:21.381401 4886 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:30:21 crc kubenswrapper[4886]: E0314 08:30:21.381457 4886 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:30:21 crc kubenswrapper[4886]: E0314 08:30:21.381518 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 08:31:25.381487992 +0000 UTC m=+220.629939669 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 14 08:30:21 crc kubenswrapper[4886]: E0314 08:30:21.381590 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-14 08:31:25.381537034 +0000 UTC m=+220.629988711 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 14 08:30:21 crc kubenswrapper[4886]: I0314 08:30:21.420349 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:21 crc kubenswrapper[4886]: E0314 08:30:21.420523 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:21 crc kubenswrapper[4886]: I0314 08:30:21.420371 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:21 crc kubenswrapper[4886]: I0314 08:30:21.420351 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:21 crc kubenswrapper[4886]: E0314 08:30:21.420952 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:21 crc kubenswrapper[4886]: E0314 08:30:21.421065 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:21 crc kubenswrapper[4886]: I0314 08:30:21.437160 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 14 08:30:21 crc kubenswrapper[4886]: I0314 08:30:21.481820 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:21 crc kubenswrapper[4886]: I0314 08:30:21.481869 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:21 crc kubenswrapper[4886]: E0314 08:30:21.481975 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:30:21 crc kubenswrapper[4886]: E0314 08:30:21.481993 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:30:21 crc kubenswrapper[4886]: E0314 08:30:21.481992 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 14 08:30:21 crc kubenswrapper[4886]: E0314 08:30:21.482004 4886 projected.go:194] Error preparing data for projected volume 
kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:30:21 crc kubenswrapper[4886]: E0314 08:30:21.482012 4886 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 14 08:30:21 crc kubenswrapper[4886]: E0314 08:30:21.482023 4886 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:30:21 crc kubenswrapper[4886]: E0314 08:30:21.482058 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-14 08:31:25.482044199 +0000 UTC m=+220.730495836 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:30:21 crc kubenswrapper[4886]: E0314 08:30:21.482071 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-14 08:31:25.48206643 +0000 UTC m=+220.730518067 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 14 08:30:22 crc kubenswrapper[4886]: I0314 08:30:22.419690 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:22 crc kubenswrapper[4886]: E0314 08:30:22.419913 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:23 crc kubenswrapper[4886]: I0314 08:30:23.420800 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:23 crc kubenswrapper[4886]: I0314 08:30:23.420875 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:23 crc kubenswrapper[4886]: E0314 08:30:23.420971 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:23 crc kubenswrapper[4886]: E0314 08:30:23.421093 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:23 crc kubenswrapper[4886]: I0314 08:30:23.421354 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:23 crc kubenswrapper[4886]: E0314 08:30:23.421474 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:24 crc kubenswrapper[4886]: I0314 08:30:24.420004 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:24 crc kubenswrapper[4886]: E0314 08:30:24.420402 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:24 crc kubenswrapper[4886]: I0314 08:30:24.422984 4886 scope.go:117] "RemoveContainer" containerID="d7e0ac395cca983946ba815573aa5ca76113acfde66743a9af6bb220a95bd799" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.241507 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ms4h7_f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea/ovnkube-controller/2.log" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.244464 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" event={"ID":"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea","Type":"ContainerStarted","Data":"2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c"} Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.244996 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.258853 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7df37466e5586325c561414b674337b6090e6eee2ac3d5f25ff706b81c76512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.281407 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d5b299aa9cb642a5c353afe205bf525208ef031049a534fb168134b80e37dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.294813 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bzkzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://317ae74e8ce3a76caaf0a657a02fcbfaab927eed54822f633b43421e5ee51d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cx6l8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bzkzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.312019 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30290a10-cfb2-4981-b885-384f20bea696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2da7c75a54a62332bd7164f6ea728d36082c6621ae6fb25ef15118c54169d865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83839fa017d4176a126c208931a2b7f1aec20
655ec9e3b3bd8ec0f86681d60a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ftn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.328379 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.345542 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.366342 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e0ac395cca983946ba815573aa5ca76113acfde66743a9af6bb220a95bd799\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:29:54Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0314 
08:29:54.436785 7131 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0314 08:29:54.436825 7131 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0314 08:29:54.436857 7131 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0314 08:29:54.436921 7131 factory.go:1336] Added *v1.Node event handler 7\\\\nI0314 08:29:54.436965 7131 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0314 08:29:54.437347 7131 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0314 08:29:54.437442 7131 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0314 08:29:54.437496 7131 ovnkube.go:599] Stopped ovnkube\\\\nI0314 08:29:54.437529 7131 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0314 08:29:54.437616 7131 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.382977 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.403253 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.419686 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf7a1c85d73654d4f738bf3a153bb99dd30e6a3fc651f22246d22f9d67054cd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-c
rc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.420317 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.420340 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:25 crc kubenswrapper[4886]: E0314 08:30:25.420501 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:25 crc kubenswrapper[4886]: E0314 08:30:25.420662 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.420817 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:25 crc kubenswrapper[4886]: E0314 08:30:25.420919 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.435772 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9273211c-e768-4ec1-b7be-640ef1c15309\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3343e31045d55d6c0d95699ceab0e0f8b7ece28ad7d1939eff43bef31470393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\"
:\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e89881366a76a4f6a38e496a1a72befde33c6038d07e39140440872840fcf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c665461cd0fe36545b5dd6fe8c5f7200e45459d7d7b4928bf9f56121934340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2
597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.449627 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.459667 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.474565 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.487692 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec2e46fcb866a7dbf349bb2a83ad1fb6b6e85059612be44d8db4b0134c6f0143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:30:12Z\\\",\\\"message\\\":\\\"2026-03-14T08:29:27+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_85f780a2-fec1-41a7-ab40-9322f99ab1f6\\\\n2026-03-14T08:29:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_85f780a2-fec1-41a7-ab40-9322f99ab1f6 to /host/opt/cni/bin/\\\\n2026-03-14T08:29:27Z [verbose] multus-daemon started\\\\n2026-03-14T08:29:27Z [verbose] Readiness Indicator file check\\\\n2026-03-14T08:30:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:30:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.498837 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq6j4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq6j4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc 
kubenswrapper[4886]: E0314 08:30:25.511014 4886 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.513167 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6761517-2ad7-4b84-b403-e3ab09d308df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b07c0b768890c65b6c4e3f1d081c7c76930bc9be3aef51310096035232d1f01e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b87a94d5aca350491d8720123794c053c109e4966447f8a5836667a11c260fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b87a94d5aca350491d8720123794c053c109e4966447f8a5836667a11c260fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.524704 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a534d3e7-7d84-4bd6-90b5-58ae29c666cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef593fe562e593507911df078ec0ac9712485cacadd01fb9025daf79980bccf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c0cc373bc6498d0116c78eb0a9a7a08a6eca9c7d74f84e596595b9e40e1cde9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0314 08:27:47.400770 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0314 08:27:47.402713 1 observer_polling.go:159] Starting file observer\\\\nI0314 08:27:47.428430 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0314 08:27:47.434769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0314 08:28:13.630244 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0314 08:28:13.630321 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43cd795006e2735c82e25d7f0b8ccfae0749e5d80cc96068969329b677904578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f455dac3583e73c1949aacca37b8249ba31fb4c339814ee3f47df6d60118c749\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a02f405b2541571b752e1dbe686e3aa7ff79d3e36211cc9a9876a92813737f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.539948 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.561235 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e0ac395cca983946ba815573aa5ca76113acfde66743a9af6bb220a95bd799\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:29:54Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0314 
08:29:54.436785 7131 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0314 08:29:54.436825 7131 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0314 08:29:54.436857 7131 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0314 08:29:54.436921 7131 factory.go:1336] Added *v1.Node event handler 7\\\\nI0314 08:29:54.436965 7131 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0314 08:29:54.437347 7131 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0314 08:29:54.437442 7131 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0314 08:29:54.437496 7131 ovnkube.go:599] Stopped ovnkube\\\\nI0314 08:29:54.437529 7131 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0314 08:29:54.437616 7131 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.581400 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf7a1c85d73654d4f738bf3a153bb99dd30e6a3fc651f22246d22f9d67054cd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d
1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.596052 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9273211c-e768-4ec1-b7be-640ef1c15309\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3343e31045d55d6c0d95699ceab0e0f8b7ece28ad7d1939eff43bef31470393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e89881366a76a4f6a38e496a1a72befde33c6038d07e39140440872840fcf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c665461cd0fe36545b5dd6fe8c5f7200e45459d7d7b4928bf9f56121934340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.610203 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.619892 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6761517-2ad7-4b84-b403-e3ab09d308df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b07c0b768890c65b6c4e3f1d081c7c76930bc9be3aef51310096035232d1f01e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b87a94d5aca350491d8720123794c053c109e4966447f8a5836667a11c260fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b87a94d5aca350491d8720123794c053c109e4966447f8a5836667a11c260fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.632211 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a534d3e7-7d84-4bd6-90b5-58ae29c666cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef593fe562e593507911df078ec0ac9712485cacadd01fb9025daf79980bccf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c0cc373bc6498d0116c78eb0a9a7a08a6eca9c7d74f84e596595b9e40e1cde9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0314 08:27:47.400770 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0314 08:27:47.402713 1 observer_polling.go:159] Starting file observer\\\\nI0314 08:27:47.428430 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0314 08:27:47.434769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0314 08:28:13.630244 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0314 08:28:13.630321 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43cd795006e2735c82e25d7f0b8ccfae0749e5d80cc96068969329b677904578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f455dac3583e73c1949aacca37b8249ba31fb4c339814ee3f47df6d60118c749\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a02f405b2541571b752e1dbe686e3aa7ff79d3e36211cc9a9876a92813737f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.647483 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.659937 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.671323 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.685253 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec2e46fcb866a7dbf349bb2a83ad1fb6b6e85059612be44d8db4b0134c6f0143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:30:12Z\\\",\\\"message\\\":\\\"2026-03-14T08:29:27+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_85f780a2-fec1-41a7-ab40-9322f99ab1f6\\\\n2026-03-14T08:29:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_85f780a2-fec1-41a7-ab40-9322f99ab1f6 to /host/opt/cni/bin/\\\\n2026-03-14T08:29:27Z [verbose] multus-daemon started\\\\n2026-03-14T08:29:27Z [verbose] Readiness Indicator file check\\\\n2026-03-14T08:30:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:30:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.696755 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq6j4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq6j4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc 
kubenswrapper[4886]: I0314 08:30:25.708523 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.717939 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.727323 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7df37466e5586325c561414b674337b6090e6eee2ac3d5f25ff706b81c76512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.739048 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d5b299aa9cb642a5c353afe205bf525208ef031049a534fb168134b80e37dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.751319 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bzkzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://317ae74e8ce3a76caaf0a657a02fcbfaab927eed54822f633b43421e5ee51d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cx6l8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bzkzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:25 crc kubenswrapper[4886]: I0314 08:30:25.769933 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30290a10-cfb2-4981-b885-384f20bea696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2da7c75a54a62332bd7164f6ea728d36082c6621ae6fb25ef15118c54169d865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83839fa017d4176a126c208931a2b7f1aec20
655ec9e3b3bd8ec0f86681d60a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ftn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:25Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:26 crc kubenswrapper[4886]: I0314 08:30:26.252523 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ms4h7_f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea/ovnkube-controller/3.log" Mar 14 08:30:26 crc kubenswrapper[4886]: I0314 08:30:26.253559 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ms4h7_f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea/ovnkube-controller/2.log" Mar 14 08:30:26 crc kubenswrapper[4886]: I0314 08:30:26.258292 4886 generic.go:334] "Generic (PLEG): container finished" podID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerID="2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c" exitCode=1 Mar 14 08:30:26 crc kubenswrapper[4886]: I0314 08:30:26.258346 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" event={"ID":"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea","Type":"ContainerDied","Data":"2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c"} Mar 14 08:30:26 crc kubenswrapper[4886]: I0314 08:30:26.258405 4886 scope.go:117] "RemoveContainer" containerID="d7e0ac395cca983946ba815573aa5ca76113acfde66743a9af6bb220a95bd799" Mar 14 08:30:26 crc kubenswrapper[4886]: I0314 08:30:26.259606 4886 scope.go:117] "RemoveContainer" containerID="2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c" Mar 14 08:30:26 crc kubenswrapper[4886]: E0314 08:30:26.259992 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ms4h7_openshift-ovn-kubernetes(f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" Mar 14 08:30:26 crc kubenswrapper[4886]: I0314 08:30:26.283877 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf7a1c85d73654d4f738bf3a153bb99dd30e6a3fc651f22246d22f9d67054cd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d
1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:26 crc kubenswrapper[4886]: I0314 08:30:26.304246 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9273211c-e768-4ec1-b7be-640ef1c15309\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3343e31045d55d6c0d95699ceab0e0f8b7ece28ad7d1939eff43bef31470393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e89881366a76a4f6a38e496a1a72befde33c6038d07e39140440872840fcf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c665461cd0fe36545b5dd6fe8c5f7200e45459d7d7b4928bf9f56121934340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:26 crc kubenswrapper[4886]: I0314 08:30:26.333747 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:26 crc kubenswrapper[4886]: I0314 08:30:26.355293 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:26 crc kubenswrapper[4886]: I0314 08:30:26.371739 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec2e46fcb866a7dbf349bb2a83ad1fb6b6e85059612be44d8db4b0134c6f0143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:30:12Z\\\",\\\"message\\\":\\\"2026-03-14T08:29:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_85f780a2-fec1-41a7-ab40-9322f99ab1f6\\\\n2026-03-14T08:29:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_85f780a2-fec1-41a7-ab40-9322f99ab1f6 to /host/opt/cni/bin/\\\\n2026-03-14T08:29:27Z [verbose] multus-daemon started\\\\n2026-03-14T08:29:27Z [verbose] 
Readiness Indicator file check\\\\n2026-03-14T08:30:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:30:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:26 crc kubenswrapper[4886]: I0314 08:30:26.386217 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq6j4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq6j4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:26 crc kubenswrapper[4886]: I0314 08:30:26.401427 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6761517-2ad7-4b84-b403-e3ab09d308df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b07c0b768890c65b6c4e3f1d081c7c76930bc9be3aef51310096035232d1f01e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b87a94d5aca350491d8720123794c053c109e4966447f8a5836667a11c260fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b87a94d5aca350491d8720123794c053c109e4966447f8a5836667a11c260fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:26 crc kubenswrapper[4886]: I0314 08:30:26.415209 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a534d3e7-7d84-4bd6-90b5-58ae29c666cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef593fe562e593507911df078ec0ac9712485cacadd01fb9025daf79980bccf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c0cc373bc6498d0116c78eb0a9a7a08a6eca9c7d74f84e596595b9e40e1cde9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0314 08:27:47.400770 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0314 08:27:47.402713 1 observer_polling.go:159] Starting file observer\\\\nI0314 08:27:47.428430 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0314 08:27:47.434769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0314 08:28:13.630244 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0314 08:28:13.630321 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43cd795006e2735c82e25d7f0b8ccfae0749e5d80cc96068969329b677904578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f455dac3583e73c1949aacca37b8249ba31fb4c339814ee3f47df6d60118c749\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a02f405b2541571b752e1dbe686e3aa7ff79d3e36211cc9a9876a92813737f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:26 crc kubenswrapper[4886]: I0314 08:30:26.419894 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:26 crc kubenswrapper[4886]: E0314 08:30:26.420067 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:26 crc kubenswrapper[4886]: I0314 08:30:26.433023 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:26 crc kubenswrapper[4886]: I0314 08:30:26.451386 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:26 crc kubenswrapper[4886]: I0314 08:30:26.468066 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bzkzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://317ae74e8ce3a76caaf0a657a02fcbfaab927eed54822f633b43421e5ee51d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cx6l8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bzkzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:26 crc kubenswrapper[4886]: I0314 08:30:26.483502 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30290a10-cfb2-4981-b885-384f20bea696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2da7c75a54a62332bd7164f6ea728d36082c6621ae6fb25ef15118c54169d865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83839fa017d4176a126c208931a2b7f1aec20
655ec9e3b3bd8ec0f86681d60a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ftn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:26 crc kubenswrapper[4886]: I0314 08:30:26.501230 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:26 crc kubenswrapper[4886]: I0314 08:30:26.521033 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:26 crc kubenswrapper[4886]: I0314 08:30:26.533756 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7df37466e5586325c561414b674337b6090e6eee2ac3d5f25ff706b81c76512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:30:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:26 crc kubenswrapper[4886]: I0314 08:30:26.556068 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d5b299aa9cb642a5c353afe205bf525208ef031049a534fb168134b80e37dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:26 crc kubenswrapper[4886]: I0314 08:30:26.573197 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:26 crc kubenswrapper[4886]: I0314 08:30:26.590611 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e0ac395cca983946ba815573aa5ca76113acfde66743a9af6bb220a95bd799\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:29:54Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0314 
08:29:54.436785 7131 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0314 08:29:54.436825 7131 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0314 08:29:54.436857 7131 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0314 08:29:54.436921 7131 factory.go:1336] Added *v1.Node event handler 7\\\\nI0314 08:29:54.436965 7131 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0314 08:29:54.437347 7131 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0314 08:29:54.437442 7131 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0314 08:29:54.437496 7131 ovnkube.go:599] Stopped ovnkube\\\\nI0314 08:29:54.437529 7131 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0314 08:29:54.437616 7131 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:30:25Z\\\",\\\"message\\\":\\\"bzkzj in node crc\\\\nI0314 08:30:25.557681 7442 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 08:30:25.557712 7442 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" but failed to find it\\\\nI0314 08:30:25.557723 7442 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0314 08:30:25.557472 7442 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0314 08:30:25.557742 7442 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0314 08:30:25.557759 7442 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 08:30:25.557776 7442 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-dl247\\\\nF0314 08:30:25.557800 7442 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:26Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.263901 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ms4h7_f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea/ovnkube-controller/3.log" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.268333 4886 scope.go:117] "RemoveContainer" containerID="2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c" Mar 14 08:30:27 crc kubenswrapper[4886]: E0314 08:30:27.268676 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ms4h7_openshift-ovn-kubernetes(f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.292054 4886 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf7a1c85d73654d4f738bf3a153bb99dd30e6a3fc651f22246d22f9d67054cd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112
d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"
state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.310089 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9273211c-e768-4ec1-b7be-640ef1c15309\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3343e31045d55d6c0d95699ceab0e0f8b7ece28ad7d1939eff43bef31470393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e89881366a76a4f6a38e496a1a72befde33c6038d07e39140440872840fcf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c665461cd0fe36545b5dd6fe8c5f7200e45459d7d7b4928bf9f56121934340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.337745 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.356858 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6761517-2ad7-4b84-b403-e3ab09d308df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b07c0b768890c65b6c4e3f1d081c7c76930bc9be3aef51310096035232d1f01e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b87a94d5aca350491d8720123794c053c109e4966447f8a5836667a11c260fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b87a94d5aca350491d8720123794c053c109e4966447f8a5836667a11c260fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.364577 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.364639 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.364660 4886 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.364689 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.364711 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:30:27Z","lastTransitionTime":"2026-03-14T08:30:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.377971 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a534d3e7-7d84-4bd6-90b5-58ae29c666cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef593fe562e593507911df078ec0ac9712485cacadd01fb9025daf79980bccf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c0cc373bc6498d0116c78eb0a9a7a08a6eca9c7d74f84e596595b9e40e1cde9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0314 08:27:47.400770 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0314 08:27:47.402713 1 observer_polling.go:159] Starting file observer\\\\nI0314 08:27:47.428430 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0314 08:27:47.434769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0314 08:28:13.630244 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0314 08:28:13.630321 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43cd795006e2735c82e25d7f0b8ccfae0749e5d80cc96068969329b677904578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f455dac3583e73c1949aacca37b8249ba31fb4c339814ee3f47df6d60118c749\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a02f405b2541571b752e1dbe686e3aa7ff79d3e36211cc9a9876a92813737f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:27 crc kubenswrapper[4886]: E0314 08:30:27.385228 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.389942 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.389989 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.390009 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.390052 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.390074 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:30:27Z","lastTransitionTime":"2026-03-14T08:30:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.396755 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:27 crc kubenswrapper[4886]: E0314 08:30:27.409524 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.411967 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.415331 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.415392 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.415420 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.415461 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.415492 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:30:27Z","lastTransitionTime":"2026-03-14T08:30:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.420189 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.420258 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.420193 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:27 crc kubenswrapper[4886]: E0314 08:30:27.420373 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:27 crc kubenswrapper[4886]: E0314 08:30:27.420491 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:27 crc kubenswrapper[4886]: E0314 08:30:27.420725 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.431222 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mo
untPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:27 crc kubenswrapper[4886]: E0314 08:30:27.439666 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.444496 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.444558 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.444587 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.444689 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.444774 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:30:27Z","lastTransitionTime":"2026-03-14T08:30:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.450785 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec2e46fcb866a7dbf349bb2a83ad1fb6b6e85059612be44d8db4b0134c6f0143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:30:12Z\\\",\\\"message\\\":\\\"2026-03-14T08:29:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_85f780a2-fec1-41a7-ab40-9322f99ab1f6\\\\n2026-03-14T08:29:27+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_85f780a2-fec1-41a7-ab40-9322f99ab1f6 to /host/opt/cni/bin/\\\\n2026-03-14T08:29:27Z [verbose] multus-daemon started\\\\n2026-03-14T08:29:27Z [verbose] Readiness Indicator file check\\\\n2026-03-14T08:30:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:30:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:27 crc kubenswrapper[4886]: E0314 08:30:27.465746 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.466982 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq6j4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq6j4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:27 crc 
kubenswrapper[4886]: I0314 08:30:27.470493 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.470540 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.470559 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.470584 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.470599 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:30:27Z","lastTransitionTime":"2026-03-14T08:30:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:30:27 crc kubenswrapper[4886]: E0314 08:30:27.484881 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:27 crc kubenswrapper[4886]: E0314 08:30:27.485030 4886 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.486330 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.504038 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.521439 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7df37466e5586325c561414b674337b6090e6eee2ac3d5f25ff706b81c76512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:30:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.547774 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d5b299aa9cb642a5c353afe205bf525208ef031049a534fb168134b80e37dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.564831 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bzkzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://317ae74e8ce3a76caaf0a657a02fcbfaab927eed54822f633b43421e5ee51d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cx6l8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bzkzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.582989 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30290a10-cfb2-4981-b885-384f20bea696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2da7c75a54a62332bd7164f6ea728d36082c6621ae6fb25ef15118c54169d865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83839fa017d4176a126c208931a2b7f1aec20
655ec9e3b3bd8ec0f86681d60a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ftn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.602360 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:27 crc kubenswrapper[4886]: I0314 08:30:27.635866 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:30:25Z\\\",\\\"message\\\":\\\"bzkzj in node crc\\\\nI0314 08:30:25.557681 7442 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 08:30:25.557712 7442 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" but failed to find it\\\\nI0314 08:30:25.557723 
7442 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0314 08:30:25.557472 7442 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0314 08:30:25.557742 7442 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0314 08:30:25.557759 7442 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 08:30:25.557776 7442 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-dl247\\\\nF0314 08:30:25.557800 7442 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:30:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ms4h7_openshift-ovn-kubernetes(f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c15
9a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:27Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:28 crc kubenswrapper[4886]: I0314 08:30:28.420473 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:28 crc kubenswrapper[4886]: E0314 08:30:28.420760 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:29 crc kubenswrapper[4886]: I0314 08:30:29.420247 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:29 crc kubenswrapper[4886]: I0314 08:30:29.420394 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:29 crc kubenswrapper[4886]: E0314 08:30:29.420498 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:29 crc kubenswrapper[4886]: E0314 08:30:29.420667 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:29 crc kubenswrapper[4886]: I0314 08:30:29.420765 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:29 crc kubenswrapper[4886]: E0314 08:30:29.420991 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:30 crc kubenswrapper[4886]: I0314 08:30:30.420786 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:30 crc kubenswrapper[4886]: E0314 08:30:30.421167 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:30 crc kubenswrapper[4886]: E0314 08:30:30.512615 4886 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 08:30:31 crc kubenswrapper[4886]: I0314 08:30:31.420019 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:31 crc kubenswrapper[4886]: E0314 08:30:31.421091 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:31 crc kubenswrapper[4886]: I0314 08:30:31.420400 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:31 crc kubenswrapper[4886]: I0314 08:30:31.420145 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:31 crc kubenswrapper[4886]: E0314 08:30:31.422642 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:31 crc kubenswrapper[4886]: E0314 08:30:31.422717 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:32 crc kubenswrapper[4886]: I0314 08:30:32.420223 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:32 crc kubenswrapper[4886]: E0314 08:30:32.420503 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:33 crc kubenswrapper[4886]: I0314 08:30:33.420546 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:33 crc kubenswrapper[4886]: I0314 08:30:33.420604 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:33 crc kubenswrapper[4886]: I0314 08:30:33.420629 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:33 crc kubenswrapper[4886]: E0314 08:30:33.420789 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:33 crc kubenswrapper[4886]: E0314 08:30:33.420939 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:33 crc kubenswrapper[4886]: E0314 08:30:33.421086 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:34 crc kubenswrapper[4886]: I0314 08:30:34.419850 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:34 crc kubenswrapper[4886]: E0314 08:30:34.420151 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:35 crc kubenswrapper[4886]: I0314 08:30:35.420214 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:35 crc kubenswrapper[4886]: I0314 08:30:35.420109 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:35 crc kubenswrapper[4886]: E0314 08:30:35.420423 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:35 crc kubenswrapper[4886]: E0314 08:30:35.420867 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:35 crc kubenswrapper[4886]: I0314 08:30:35.420942 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:35 crc kubenswrapper[4886]: E0314 08:30:35.421044 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:35 crc kubenswrapper[4886]: I0314 08:30:35.440888 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf7a1c85d73654d4f738bf3a153bb99dd30e6a3fc651f22246d22f9d67054cd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/opensh
ift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\",\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:35 crc kubenswrapper[4886]: I0314 08:30:35.460597 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9273211c-e768-4ec1-b7be-640ef1c15309\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3343e31045d55d6c0d95699ceab0e0f8b7ece28ad7d1939eff43bef31470393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e89881366a76a4f6a38e496a1a72befde33c6038d07e39140440872840fcf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c665461cd0fe36545b5dd6fe8c5f7200e45459d7d7b4928bf9f56121934340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:35 crc kubenswrapper[4886]: I0314 08:30:35.483565 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:35 crc kubenswrapper[4886]: E0314 08:30:35.513999 4886 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 08:30:35 crc kubenswrapper[4886]: I0314 08:30:35.514475 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec2e46fcb866a7dbf349bb2a83ad1fb6b6e85059612be44d8db4b0134c6f0143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:30:12Z\\\",\\\"message\\\":\\\"2026-03-14T08:29:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_85f780a2-fec1-41a7-ab40-9322f99ab1f6\\\\n2026-03-14T08:29:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_85f780a2-fec1-41a7-ab40-9322f99ab1f6 to /host/opt/cni/bin/\\\\n2026-03-14T08:29:27Z [verbose] multus-daemon started\\\\n2026-03-14T08:29:27Z [verbose] Readiness Indicator file check\\\\n2026-03-14T08:30:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:30:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cn
i-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:35 crc kubenswrapper[4886]: I0314 08:30:35.534172 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq6j4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq6j4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:35 crc 
kubenswrapper[4886]: I0314 08:30:35.552079 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6761517-2ad7-4b84-b403-e3ab09d308df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b07c0b768890c65b6c4e3f1d081c7c76930bc9be3aef51310096035232d1f01e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://6b87a94d5aca350491d8720123794c053c109e4966447f8a5836667a11c260fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b87a94d5aca350491d8720123794c053c109e4966447f8a5836667a11c260fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:35 crc kubenswrapper[4886]: I0314 08:30:35.573037 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a534d3e7-7d84-4bd6-90b5-58ae29c666cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef593fe562e593507911df078ec0ac9712485cacadd01fb9025daf79980bccf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c0cc373bc6498d0116c78eb0a9a7a08a6eca9c7d74f84e596595b9e40e1cde9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0314 08:27:47.400770 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0314 08:27:47.402713 1 observer_polling.go:159] Starting file observer\\\\nI0314 08:27:47.428430 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0314 08:27:47.434769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0314 08:28:13.630244 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0314 08:28:13.630321 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43cd795006e2735c82e25d7f0b8ccfae0749e5d80cc96068969329b677904578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f455dac3583e73c1949aacca37b8249ba31fb4c339814ee3f47df6d60118c749\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a02f405b2541571b752e1dbe686e3aa7ff79d3e36211cc9a9876a92813737f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:35 crc kubenswrapper[4886]: I0314 08:30:35.592985 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:35 crc kubenswrapper[4886]: I0314 08:30:35.608426 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:35 crc kubenswrapper[4886]: I0314 08:30:35.624237 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:35 crc kubenswrapper[4886]: I0314 08:30:35.641951 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30290a10-cfb2-4981-b885-384f20bea696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2da7c75a54a62332bd7164f6ea728d36082c6621ae6fb25ef15118c54169d865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83839fa017d4176a126c208931a2b7f1aec20655ec9e3b3bd8ec0f86681d60a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ftn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:35 crc kubenswrapper[4886]: I0314 08:30:35.666430 4886 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:35 crc kubenswrapper[4886]: I0314 08:30:35.687026 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:35 crc kubenswrapper[4886]: I0314 08:30:35.710624 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7df37466e5586325c561414b674337b6090e6eee2ac3d5f25ff706b81c76512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:30:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:35 crc kubenswrapper[4886]: I0314 08:30:35.734259 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d5b299aa9cb642a5c353afe205bf525208ef031049a534fb168134b80e37dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:35 crc kubenswrapper[4886]: I0314 08:30:35.754509 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bzkzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://317ae74e8ce3a76caaf0a657a02fcbfaab927eed54822f633b43421e5ee51d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cx6l8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bzkzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:35 crc kubenswrapper[4886]: I0314 08:30:35.777662 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:35 crc kubenswrapper[4886]: I0314 08:30:35.804576 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:30:25Z\\\",\\\"message\\\":\\\"bzkzj in node crc\\\\nI0314 08:30:25.557681 7442 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 08:30:25.557712 7442 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" but failed to find it\\\\nI0314 08:30:25.557723 
7442 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0314 08:30:25.557472 7442 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0314 08:30:25.557742 7442 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0314 08:30:25.557759 7442 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 08:30:25.557776 7442 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-dl247\\\\nF0314 08:30:25.557800 7442 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:30:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ms4h7_openshift-ovn-kubernetes(f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c15
9a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:35Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:36 crc kubenswrapper[4886]: I0314 08:30:36.420237 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:36 crc kubenswrapper[4886]: E0314 08:30:36.421053 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:37 crc kubenswrapper[4886]: I0314 08:30:37.420015 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:37 crc kubenswrapper[4886]: I0314 08:30:37.420095 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:37 crc kubenswrapper[4886]: I0314 08:30:37.420095 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:37 crc kubenswrapper[4886]: E0314 08:30:37.420380 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:37 crc kubenswrapper[4886]: E0314 08:30:37.420550 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:37 crc kubenswrapper[4886]: E0314 08:30:37.420670 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:37 crc kubenswrapper[4886]: I0314 08:30:37.716650 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:30:37 crc kubenswrapper[4886]: I0314 08:30:37.716740 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:30:37 crc kubenswrapper[4886]: I0314 08:30:37.716764 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:30:37 crc kubenswrapper[4886]: I0314 08:30:37.716794 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:30:37 crc kubenswrapper[4886]: I0314 08:30:37.716813 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:30:37Z","lastTransitionTime":"2026-03-14T08:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:30:37 crc kubenswrapper[4886]: E0314 08:30:37.742310 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:37 crc kubenswrapper[4886]: I0314 08:30:37.750168 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:30:37 crc kubenswrapper[4886]: I0314 08:30:37.750223 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:30:37 crc kubenswrapper[4886]: I0314 08:30:37.750241 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:30:37 crc kubenswrapper[4886]: I0314 08:30:37.750269 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:30:37 crc kubenswrapper[4886]: I0314 08:30:37.750287 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:30:37Z","lastTransitionTime":"2026-03-14T08:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:30:37 crc kubenswrapper[4886]: E0314 08:30:37.802564 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:37 crc kubenswrapper[4886]: I0314 08:30:37.809157 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:30:37 crc kubenswrapper[4886]: I0314 08:30:37.809225 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:30:37 crc kubenswrapper[4886]: I0314 08:30:37.809239 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:30:37 crc kubenswrapper[4886]: I0314 08:30:37.809274 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:30:37 crc kubenswrapper[4886]: I0314 08:30:37.809298 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:30:37Z","lastTransitionTime":"2026-03-14T08:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:30:37 crc kubenswrapper[4886]: E0314 08:30:37.841628 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:37 crc kubenswrapper[4886]: I0314 08:30:37.847504 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:30:37 crc kubenswrapper[4886]: I0314 08:30:37.847550 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:30:37 crc kubenswrapper[4886]: I0314 08:30:37.847564 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:30:37 crc kubenswrapper[4886]: I0314 08:30:37.847594 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:30:37 crc kubenswrapper[4886]: I0314 08:30:37.847611 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:30:37Z","lastTransitionTime":"2026-03-14T08:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:30:37 crc kubenswrapper[4886]: E0314 08:30:37.864863 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:37Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:37 crc kubenswrapper[4886]: E0314 08:30:37.865040 4886 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 08:30:38 crc kubenswrapper[4886]: I0314 08:30:38.420848 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:38 crc kubenswrapper[4886]: E0314 08:30:38.421197 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:39 crc kubenswrapper[4886]: I0314 08:30:39.420395 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:39 crc kubenswrapper[4886]: I0314 08:30:39.420460 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:39 crc kubenswrapper[4886]: I0314 08:30:39.420435 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:39 crc kubenswrapper[4886]: E0314 08:30:39.420647 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:39 crc kubenswrapper[4886]: E0314 08:30:39.420805 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:39 crc kubenswrapper[4886]: E0314 08:30:39.420877 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:40 crc kubenswrapper[4886]: I0314 08:30:40.420258 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:40 crc kubenswrapper[4886]: E0314 08:30:40.421157 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:40 crc kubenswrapper[4886]: I0314 08:30:40.421693 4886 scope.go:117] "RemoveContainer" containerID="2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c" Mar 14 08:30:40 crc kubenswrapper[4886]: E0314 08:30:40.422065 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ms4h7_openshift-ovn-kubernetes(f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" Mar 14 08:30:40 crc kubenswrapper[4886]: E0314 08:30:40.515951 4886 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 08:30:41 crc kubenswrapper[4886]: I0314 08:30:41.420371 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:41 crc kubenswrapper[4886]: E0314 08:30:41.420618 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:41 crc kubenswrapper[4886]: I0314 08:30:41.420700 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:41 crc kubenswrapper[4886]: I0314 08:30:41.420707 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:41 crc kubenswrapper[4886]: E0314 08:30:41.421612 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:41 crc kubenswrapper[4886]: E0314 08:30:41.422096 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:41 crc kubenswrapper[4886]: I0314 08:30:41.448505 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 14 08:30:42 crc kubenswrapper[4886]: I0314 08:30:42.420282 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:42 crc kubenswrapper[4886]: E0314 08:30:42.420539 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:42 crc kubenswrapper[4886]: I0314 08:30:42.950990 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-metrics-certs\") pod \"network-metrics-daemon-hq6j4\" (UID: \"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\") " pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:42 crc kubenswrapper[4886]: E0314 08:30:42.951096 4886 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:30:42 crc kubenswrapper[4886]: E0314 08:30:42.951178 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-metrics-certs podName:842ea68a-b5ee-4b60-8e98-26e2ff72ae3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:46.951163241 +0000 UTC m=+242.199614878 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-metrics-certs") pod "network-metrics-daemon-hq6j4" (UID: "842ea68a-b5ee-4b60-8e98-26e2ff72ae3b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 14 08:30:43 crc kubenswrapper[4886]: I0314 08:30:43.420709 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:43 crc kubenswrapper[4886]: I0314 08:30:43.420782 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:43 crc kubenswrapper[4886]: I0314 08:30:43.420798 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:43 crc kubenswrapper[4886]: E0314 08:30:43.420912 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:43 crc kubenswrapper[4886]: E0314 08:30:43.421179 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:43 crc kubenswrapper[4886]: E0314 08:30:43.421343 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:44 crc kubenswrapper[4886]: I0314 08:30:44.420188 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:44 crc kubenswrapper[4886]: E0314 08:30:44.420334 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:45 crc kubenswrapper[4886]: I0314 08:30:45.420575 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:45 crc kubenswrapper[4886]: I0314 08:30:45.420676 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:45 crc kubenswrapper[4886]: E0314 08:30:45.420844 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:45 crc kubenswrapper[4886]: I0314 08:30:45.421098 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:45 crc kubenswrapper[4886]: E0314 08:30:45.421285 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:45 crc kubenswrapper[4886]: E0314 08:30:45.421403 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:45 crc kubenswrapper[4886]: I0314 08:30:45.431430 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30290a10-cfb2-4981-b885-384f20bea696\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2da7c75a54a62332bd7164f6ea728d36082c6621ae6fb25ef15118c54169d865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-
plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83839fa017d4176a126c208931a2b7f1aec20655ec9e3b3bd8ec0f86681d60a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8np2t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ftn4q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:45 crc kubenswrapper[4886]: I0314 08:30:45.444235 4886 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1be6cd320ec0bcc6345a2b8f3f6f875c41d3cd75a9eee497bb12381cdc16dad1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7acedf8b4107671f9f478f631d8fa4addaa837b26ecffffb887cae22a2c78295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:45 crc kubenswrapper[4886]: I0314 08:30:45.457629 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:45 crc kubenswrapper[4886]: I0314 08:30:45.481490 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7df37466e5586325c561414b674337b6090e6eee2ac3d5f25ff706b81c76512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-14T08:30:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:45 crc kubenswrapper[4886]: I0314 08:30:45.503159 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dl247" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9532c0a-d4bd-4454-b521-bf157bf3707c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d5b299aa9cb642a5c353afe205bf525208ef031049a534fb168134b80e37dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ddd44ed6067cbea50204d2f7f05db900f13287ce351864f83cdaf1c72e91d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f61a17c8085410220346c1946ca67e018f995e59be0509e2aed708a17d5e202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f38a34a2c85238444f1ccb83668fbdbe3648e947f9be4e52693bda7800ca974d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08b05fb63b0e5bf2daa4a7b122151c0fef1522ced6f0e847fce3b6f4cfc9a533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aa52855fb76fd5ef27b37d29f3b560e93164777860db52e42d092e3f0546967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60f09154987642414ea7d6bf58813eab77fd09d11768ff6234de2b0c621c5df6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sntl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dl247\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:45 crc kubenswrapper[4886]: I0314 08:30:45.514424 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bzkzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f24061e7-5b3a-4dad-8ae1-c1b8d92e1ce5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://317ae74e8ce3a76caaf0a657a02fcbfaab927eed54822f633b43421e5ee51d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cx6l8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bzkzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:45 crc kubenswrapper[4886]: E0314 08:30:45.516472 4886 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 14 08:30:45 crc kubenswrapper[4886]: I0314 08:30:45.532193 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e71a2a57-c243-43a0-acfb-5eb17f2b3ab7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25ad62f5c26cca64c63d0955b28f8aa8a76ea569ee64db70bc192aea76b1e871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7519ca7b24c6a8c7c3faa6393f0f02572f9b4ac417e262ddddd94f6297bc5f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22d9a1f47b0821cb4d452045a619d2fb867c15f70cd2093961a35f8989332473\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78f0b24729cadb444ece09a4f5e2bc7cc744cd77c9317e32fc4425848c79d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de39a08e4228994a97cd7fdd116b3c79fe5e7a4adf405cfd8103e7a9c37c1931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffd1d57d01e14b536e2f7e73749880714261d9b555a735783fbe83e3c916fa4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffd1d57d01e14b536e2f7e73749880714261d9b555a735783fbe83e3c916fa4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a0a4e940673ed3e084ff22ae46a7ce47eb7347e46128c22a2cd493f6ed06fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48a0a4e940673ed3e084ff22ae46a7ce47eb7347e46128c22a2cd493f6ed06fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c376b141f7ed448baed91333cedf7ad2f3f0128ab30b3dc13114b3846fbce3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c376b141f7ed448baed91333cedf7ad2f3f0128ab30b3dc13114b3846fbce3d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:45 crc kubenswrapper[4886]: I0314 08:30:45.543291 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:45 crc kubenswrapper[4886]: I0314 08:30:45.565056 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:30:25Z\\\",\\\"message\\\":\\\"bzkzj in node crc\\\\nI0314 08:30:25.557681 7442 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 08:30:25.557712 7442 ovnkube_controller.go:900] Cache entry expected pod with UID \\\\\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\\\\\" but failed to find it\\\\nI0314 08:30:25.557723 
7442 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0314 08:30:25.557472 7442 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0314 08:30:25.557742 7442 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0314 08:30:25.557759 7442 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0314 08:30:25.557776 7442 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-dl247\\\\nF0314 08:30:25.557800 7442 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:30:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ms4h7_openshift-ovn-kubernetes(f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2536cceee4f37c5c15
9a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jcw6x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ms4h7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:45 crc kubenswrapper[4886]: I0314 08:30:45.576563 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e02166ba-9748-4add-8980-e4d799092d18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf7a1c85d73654d4f738bf3a153bb99dd30e6a3fc651f22246d22f9d67054cd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:55Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0314 08:28:54.944711 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0314 08:28:54.944901 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0314 08:28:54.945598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3909543471/tls.crt::/tmp/serving-cert-3909543471/tls.key\\\\\\\"\\\\nI0314 08:28:55.367378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0314 08:28:55.368821 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0314 08:28:55.368839 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0314 08:28:55.368870 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0314 08:28:55.368877 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0314 08:28:55.372111 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0314 08:28:55.372156 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0314 08:28:55.372159 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372184 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0314 08:28:55.372189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0314 08:28:55.372193 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0314 08:28:55.372196 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0314 08:28:55.372200 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0314 08:28:55.373760 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:28:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b5eb7fabad8e17a1b1c53b2d1fb02f213d
1d6d3c7908d290b43104e03c2ae0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:45 crc kubenswrapper[4886]: I0314 08:30:45.587298 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9273211c-e768-4ec1-b7be-640ef1c15309\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3343e31045d55d6c0d95699ceab0e0f8b7ece28ad7d1939eff43bef31470393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e89881366a76a4f6a38e496a1a72befde33c6038d07e39140440872840fcf97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36c665461cd0fe36545b5dd6fe8c5f7200e45459d7d7b4928bf9f56121934340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://9a3d9ad062252428fa859befc2e4985d39ff19f047aaef19e0b32879348429f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:45 crc kubenswrapper[4886]: I0314 08:30:45.597803 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27240a1391f7ceb2925755d4b86eb0b9c918dffe10d66a2696c070936245fbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:45 crc kubenswrapper[4886]: I0314 08:30:45.608793 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5jrmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ed47238-6d20-4920-9162-695e6ddcb090\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec2e46fcb866a7dbf349bb2a83ad1fb6b6e85059612be44d8db4b0134c6f0143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-14T08:30:12Z\\\",\\\"message\\\":\\\"2026-03-14T08:29:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_85f780a2-fec1-41a7-ab40-9322f99ab1f6\\\\n2026-03-14T08:29:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_85f780a2-fec1-41a7-ab40-9322f99ab1f6 to /host/opt/cni/bin/\\\\n2026-03-14T08:29:27Z [verbose] multus-daemon started\\\\n2026-03-14T08:29:27Z [verbose] Readiness Indicator file check\\\\n2026-03-14T08:30:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:30:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lg8x7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5jrmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:45 crc kubenswrapper[4886]: I0314 08:30:45.618384 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hq6j4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2r5j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hq6j4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:45 crc 
kubenswrapper[4886]: I0314 08:30:45.627209 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6761517-2ad7-4b84-b403-e3ab09d308df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b07c0b768890c65b6c4e3f1d081c7c76930bc9be3aef51310096035232d1f01e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://6b87a94d5aca350491d8720123794c053c109e4966447f8a5836667a11c260fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b87a94d5aca350491d8720123794c053c109e4966447f8a5836667a11c260fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-14T08:27:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:45 crc kubenswrapper[4886]: I0314 08:30:45.637841 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a534d3e7-7d84-4bd6-90b5-58ae29c666cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:27:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef593fe562e593507911df078ec0ac9712485cacadd01fb9025daf79980bccf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c0cc373bc6498d0116c78eb0a9a7a08a6eca9c7d74f84e596595b9e40e1cde9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-14T08:28:13Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0314 08:27:47.400770 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0314 08:27:47.402713 1 observer_polling.go:159] Starting file observer\\\\nI0314 08:27:47.428430 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0314 08:27:47.434769 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0314 08:28:13.630244 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0314 08:28:13.630321 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43cd795006e2735c82e25d7f0b8ccfae0749e5d80cc96068969329b677904578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f455dac3583e73c1949aacca37b8249ba31fb4c339814ee3f47df6d60118c749\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a02f405b2541571b752e1dbe686e3aa7ff79d3e36211cc9a9876a92813737f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:27:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:45 crc kubenswrapper[4886]: I0314 08:30:45.649478 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:45 crc kubenswrapper[4886]: I0314 08:30:45.658438 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tzzd5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b55610ab-0fec-46ef-8233-8b0825013fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2836d64018cd0f1845971aaeca41a66e358279d4c64d6bac6e98202a248a12e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-szw6s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tzzd5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:45 crc kubenswrapper[4886]: I0314 08:30:45.667338 4886 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64517238-bfef-43e1-b543-1eea5b7f9c79\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-14T08:29:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f1a5e608eb0d29307ecf24378505fa017aa9905d9762f6f73751494abb60177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-14T08:29:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdn87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-14T08:29:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ddctv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:45Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:46 crc kubenswrapper[4886]: I0314 08:30:46.420295 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:46 crc kubenswrapper[4886]: E0314 08:30:46.420428 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:47 crc kubenswrapper[4886]: I0314 08:30:47.420230 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:47 crc kubenswrapper[4886]: I0314 08:30:47.420366 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:47 crc kubenswrapper[4886]: E0314 08:30:47.420602 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:47 crc kubenswrapper[4886]: E0314 08:30:47.420759 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:47 crc kubenswrapper[4886]: I0314 08:30:47.421309 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:47 crc kubenswrapper[4886]: E0314 08:30:47.421467 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:48 crc kubenswrapper[4886]: I0314 08:30:48.240068 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:30:48 crc kubenswrapper[4886]: I0314 08:30:48.240164 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:30:48 crc kubenswrapper[4886]: I0314 08:30:48.240184 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:30:48 crc kubenswrapper[4886]: I0314 08:30:48.240208 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:30:48 crc kubenswrapper[4886]: I0314 08:30:48.240225 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:30:48Z","lastTransitionTime":"2026-03-14T08:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:30:48 crc kubenswrapper[4886]: E0314 08:30:48.261048 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:48Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:48 crc kubenswrapper[4886]: I0314 08:30:48.266051 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:30:48 crc kubenswrapper[4886]: I0314 08:30:48.266099 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:30:48 crc kubenswrapper[4886]: I0314 08:30:48.266173 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:30:48 crc kubenswrapper[4886]: I0314 08:30:48.266201 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:30:48 crc kubenswrapper[4886]: I0314 08:30:48.266218 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:30:48Z","lastTransitionTime":"2026-03-14T08:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:30:48 crc kubenswrapper[4886]: E0314 08:30:48.287945 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:48Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:48 crc kubenswrapper[4886]: I0314 08:30:48.293743 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:30:48 crc kubenswrapper[4886]: I0314 08:30:48.293808 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:30:48 crc kubenswrapper[4886]: I0314 08:30:48.293830 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:30:48 crc kubenswrapper[4886]: I0314 08:30:48.293859 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:30:48 crc kubenswrapper[4886]: I0314 08:30:48.293886 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:30:48Z","lastTransitionTime":"2026-03-14T08:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:30:48 crc kubenswrapper[4886]: E0314 08:30:48.316211 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:48Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:48 crc kubenswrapper[4886]: I0314 08:30:48.321937 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:30:48 crc kubenswrapper[4886]: I0314 08:30:48.322221 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:30:48 crc kubenswrapper[4886]: I0314 08:30:48.322377 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:30:48 crc kubenswrapper[4886]: I0314 08:30:48.322525 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:30:48 crc kubenswrapper[4886]: I0314 08:30:48.322672 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:30:48Z","lastTransitionTime":"2026-03-14T08:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:30:48 crc kubenswrapper[4886]: E0314 08:30:48.344586 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:48Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:48 crc kubenswrapper[4886]: I0314 08:30:48.350872 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:30:48 crc kubenswrapper[4886]: I0314 08:30:48.350947 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:30:48 crc kubenswrapper[4886]: I0314 08:30:48.350965 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:30:48 crc kubenswrapper[4886]: I0314 08:30:48.350990 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:30:48 crc kubenswrapper[4886]: I0314 08:30:48.351009 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:30:48Z","lastTransitionTime":"2026-03-14T08:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 14 08:30:48 crc kubenswrapper[4886]: E0314 08:30:48.370021 4886 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-14T08:30:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5e41e15a-ef8c-4636-88a0-58cc60240a23\\\",\\\"systemUUID\\\":\\\"44063244-4752-49bf-ae05-9c5105dcb9bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-14T08:30:48Z is after 2025-08-24T17:21:41Z" Mar 14 08:30:48 crc kubenswrapper[4886]: E0314 08:30:48.370347 4886 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 14 08:30:48 crc kubenswrapper[4886]: I0314 08:30:48.420529 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:48 crc kubenswrapper[4886]: E0314 08:30:48.420698 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:49 crc kubenswrapper[4886]: I0314 08:30:49.420006 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:49 crc kubenswrapper[4886]: E0314 08:30:49.420170 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:49 crc kubenswrapper[4886]: I0314 08:30:49.420258 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:49 crc kubenswrapper[4886]: E0314 08:30:49.420410 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:49 crc kubenswrapper[4886]: I0314 08:30:49.420466 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:49 crc kubenswrapper[4886]: E0314 08:30:49.420653 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:50 crc kubenswrapper[4886]: I0314 08:30:50.420390 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:50 crc kubenswrapper[4886]: E0314 08:30:50.420551 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:50 crc kubenswrapper[4886]: E0314 08:30:50.517955 4886 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 08:30:51 crc kubenswrapper[4886]: I0314 08:30:51.420558 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:51 crc kubenswrapper[4886]: I0314 08:30:51.420588 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:51 crc kubenswrapper[4886]: E0314 08:30:51.420924 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:51 crc kubenswrapper[4886]: E0314 08:30:51.420733 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:51 crc kubenswrapper[4886]: I0314 08:30:51.420667 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:51 crc kubenswrapper[4886]: E0314 08:30:51.421009 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:52 crc kubenswrapper[4886]: I0314 08:30:52.420273 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:52 crc kubenswrapper[4886]: E0314 08:30:52.420745 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:53 crc kubenswrapper[4886]: I0314 08:30:53.420740 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:53 crc kubenswrapper[4886]: I0314 08:30:53.420773 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:53 crc kubenswrapper[4886]: E0314 08:30:53.421541 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:53 crc kubenswrapper[4886]: I0314 08:30:53.420785 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:53 crc kubenswrapper[4886]: E0314 08:30:53.421768 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:53 crc kubenswrapper[4886]: E0314 08:30:53.421661 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:54 crc kubenswrapper[4886]: I0314 08:30:54.420441 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:54 crc kubenswrapper[4886]: E0314 08:30:54.420746 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:55 crc kubenswrapper[4886]: I0314 08:30:55.419906 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:55 crc kubenswrapper[4886]: I0314 08:30:55.420020 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:55 crc kubenswrapper[4886]: I0314 08:30:55.421111 4886 scope.go:117] "RemoveContainer" containerID="2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c" Mar 14 08:30:55 crc kubenswrapper[4886]: E0314 08:30:55.421393 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ms4h7_openshift-ovn-kubernetes(f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" Mar 14 08:30:55 crc kubenswrapper[4886]: E0314 08:30:55.422407 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:55 crc kubenswrapper[4886]: E0314 08:30:55.422644 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:55 crc kubenswrapper[4886]: I0314 08:30:55.422833 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:55 crc kubenswrapper[4886]: E0314 08:30:55.422957 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:55 crc kubenswrapper[4886]: I0314 08:30:55.457439 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-bzkzj" podStartSLOduration=134.45741419 podStartE2EDuration="2m14.45741419s" podCreationTimestamp="2026-03-14 08:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:30:55.443047074 +0000 UTC m=+190.691498751" watchObservedRunningTime="2026-03-14 08:30:55.45741419 +0000 UTC m=+190.705865857" Mar 14 08:30:55 crc kubenswrapper[4886]: I0314 08:30:55.457692 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ftn4q" podStartSLOduration=133.457684448 podStartE2EDuration="2m13.457684448s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:30:55.456690918 +0000 UTC m=+190.705142595" watchObservedRunningTime="2026-03-14 08:30:55.457684448 +0000 UTC m=+190.706136115" Mar 14 08:30:55 
crc kubenswrapper[4886]: I0314 08:30:55.517873 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-dl247" podStartSLOduration=133.517856611 podStartE2EDuration="2m13.517856611s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:30:55.516882071 +0000 UTC m=+190.765333708" watchObservedRunningTime="2026-03-14 08:30:55.517856611 +0000 UTC m=+190.766308248" Mar 14 08:30:55 crc kubenswrapper[4886]: E0314 08:30:55.518337 4886 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 08:30:55 crc kubenswrapper[4886]: I0314 08:30:55.553260 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=14.553240263 podStartE2EDuration="14.553240263s" podCreationTimestamp="2026-03-14 08:30:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:30:55.541351343 +0000 UTC m=+190.789802980" watchObservedRunningTime="2026-03-14 08:30:55.553240263 +0000 UTC m=+190.801691900" Mar 14 08:30:55 crc kubenswrapper[4886]: I0314 08:30:55.607362 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=89.607348222 podStartE2EDuration="1m29.607348222s" podCreationTimestamp="2026-03-14 08:29:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:30:55.596293607 +0000 UTC m=+190.844745244" watchObservedRunningTime="2026-03-14 08:30:55.607348222 +0000 
UTC m=+190.855799859" Mar 14 08:30:55 crc kubenswrapper[4886]: I0314 08:30:55.618884 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=71.618876352 podStartE2EDuration="1m11.618876352s" podCreationTimestamp="2026-03-14 08:29:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:30:55.607743634 +0000 UTC m=+190.856195271" watchObservedRunningTime="2026-03-14 08:30:55.618876352 +0000 UTC m=+190.867327989" Mar 14 08:30:55 crc kubenswrapper[4886]: I0314 08:30:55.640343 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podStartSLOduration=133.640325471 podStartE2EDuration="2m13.640325471s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:30:55.63036073 +0000 UTC m=+190.878812357" watchObservedRunningTime="2026-03-14 08:30:55.640325471 +0000 UTC m=+190.888777108" Mar 14 08:30:55 crc kubenswrapper[4886]: I0314 08:30:55.649878 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-5jrmb" podStartSLOduration=133.649867481 podStartE2EDuration="2m13.649867481s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:30:55.640826777 +0000 UTC m=+190.889278414" watchObservedRunningTime="2026-03-14 08:30:55.649867481 +0000 UTC m=+190.898319118" Mar 14 08:30:55 crc kubenswrapper[4886]: I0314 08:30:55.672348 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
podStartSLOduration=34.672327331 podStartE2EDuration="34.672327331s" podCreationTimestamp="2026-03-14 08:30:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:30:55.658573754 +0000 UTC m=+190.907025391" watchObservedRunningTime="2026-03-14 08:30:55.672327331 +0000 UTC m=+190.920778978" Mar 14 08:30:55 crc kubenswrapper[4886]: I0314 08:30:55.686311 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=55.686291994 podStartE2EDuration="55.686291994s" podCreationTimestamp="2026-03-14 08:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:30:55.671790885 +0000 UTC m=+190.920242532" watchObservedRunningTime="2026-03-14 08:30:55.686291994 +0000 UTC m=+190.934743641" Mar 14 08:30:55 crc kubenswrapper[4886]: I0314 08:30:55.695579 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-tzzd5" podStartSLOduration=134.695557055 podStartE2EDuration="2m14.695557055s" podCreationTimestamp="2026-03-14 08:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:30:55.695103601 +0000 UTC m=+190.943555248" watchObservedRunningTime="2026-03-14 08:30:55.695557055 +0000 UTC m=+190.944008692" Mar 14 08:30:56 crc kubenswrapper[4886]: I0314 08:30:56.419640 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:56 crc kubenswrapper[4886]: E0314 08:30:56.419782 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:57 crc kubenswrapper[4886]: I0314 08:30:57.420422 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:57 crc kubenswrapper[4886]: I0314 08:30:57.420422 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:57 crc kubenswrapper[4886]: E0314 08:30:57.420992 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:57 crc kubenswrapper[4886]: I0314 08:30:57.420458 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:57 crc kubenswrapper[4886]: E0314 08:30:57.421163 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:57 crc kubenswrapper[4886]: E0314 08:30:57.421225 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:58 crc kubenswrapper[4886]: I0314 08:30:58.420300 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:30:58 crc kubenswrapper[4886]: E0314 08:30:58.420429 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:30:58 crc kubenswrapper[4886]: I0314 08:30:58.629092 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 14 08:30:58 crc kubenswrapper[4886]: I0314 08:30:58.629192 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 14 08:30:58 crc kubenswrapper[4886]: I0314 08:30:58.629211 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 14 08:30:58 crc kubenswrapper[4886]: I0314 08:30:58.629245 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 14 08:30:58 crc kubenswrapper[4886]: I0314 08:30:58.629266 4886 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-14T08:30:58Z","lastTransitionTime":"2026-03-14T08:30:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 14 08:30:58 crc kubenswrapper[4886]: I0314 08:30:58.688478 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gmnj"] Mar 14 08:30:58 crc kubenswrapper[4886]: I0314 08:30:58.689098 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gmnj" Mar 14 08:30:58 crc kubenswrapper[4886]: I0314 08:30:58.690883 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 14 08:30:58 crc kubenswrapper[4886]: I0314 08:30:58.690889 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 14 08:30:58 crc kubenswrapper[4886]: I0314 08:30:58.692325 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 14 08:30:58 crc kubenswrapper[4886]: I0314 08:30:58.692557 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 14 08:30:58 crc kubenswrapper[4886]: I0314 08:30:58.813823 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b1c9148-d752-4311-bde5-0ab7e96a71fa-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4gmnj\" (UID: \"5b1c9148-d752-4311-bde5-0ab7e96a71fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gmnj" Mar 14 08:30:58 crc kubenswrapper[4886]: I0314 08:30:58.813894 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/5b1c9148-d752-4311-bde5-0ab7e96a71fa-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4gmnj\" (UID: \"5b1c9148-d752-4311-bde5-0ab7e96a71fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gmnj" Mar 14 08:30:58 crc kubenswrapper[4886]: I0314 08:30:58.813924 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/5b1c9148-d752-4311-bde5-0ab7e96a71fa-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4gmnj\" (UID: \"5b1c9148-d752-4311-bde5-0ab7e96a71fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gmnj" Mar 14 08:30:58 crc kubenswrapper[4886]: I0314 08:30:58.813962 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/5b1c9148-d752-4311-bde5-0ab7e96a71fa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4gmnj\" (UID: \"5b1c9148-d752-4311-bde5-0ab7e96a71fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gmnj" Mar 14 08:30:58 crc kubenswrapper[4886]: I0314 08:30:58.813997 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b1c9148-d752-4311-bde5-0ab7e96a71fa-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4gmnj\" (UID: \"5b1c9148-d752-4311-bde5-0ab7e96a71fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gmnj" Mar 14 08:30:58 crc kubenswrapper[4886]: I0314 08:30:58.915516 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/5b1c9148-d752-4311-bde5-0ab7e96a71fa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4gmnj\" (UID: \"5b1c9148-d752-4311-bde5-0ab7e96a71fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gmnj" Mar 14 08:30:58 crc kubenswrapper[4886]: I0314 08:30:58.915557 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b1c9148-d752-4311-bde5-0ab7e96a71fa-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4gmnj\" (UID: \"5b1c9148-d752-4311-bde5-0ab7e96a71fa\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gmnj" Mar 14 08:30:58 crc kubenswrapper[4886]: I0314 08:30:58.915614 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b1c9148-d752-4311-bde5-0ab7e96a71fa-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4gmnj\" (UID: \"5b1c9148-d752-4311-bde5-0ab7e96a71fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gmnj" Mar 14 08:30:58 crc kubenswrapper[4886]: I0314 08:30:58.915630 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b1c9148-d752-4311-bde5-0ab7e96a71fa-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4gmnj\" (UID: \"5b1c9148-d752-4311-bde5-0ab7e96a71fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gmnj" Mar 14 08:30:58 crc kubenswrapper[4886]: I0314 08:30:58.915645 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/5b1c9148-d752-4311-bde5-0ab7e96a71fa-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4gmnj\" (UID: \"5b1c9148-d752-4311-bde5-0ab7e96a71fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gmnj" Mar 14 08:30:58 crc kubenswrapper[4886]: I0314 08:30:58.915642 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/5b1c9148-d752-4311-bde5-0ab7e96a71fa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4gmnj\" (UID: \"5b1c9148-d752-4311-bde5-0ab7e96a71fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gmnj" Mar 14 08:30:58 crc kubenswrapper[4886]: I0314 08:30:58.915687 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/5b1c9148-d752-4311-bde5-0ab7e96a71fa-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4gmnj\" (UID: \"5b1c9148-d752-4311-bde5-0ab7e96a71fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gmnj" Mar 14 08:30:58 crc kubenswrapper[4886]: I0314 08:30:58.917256 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b1c9148-d752-4311-bde5-0ab7e96a71fa-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4gmnj\" (UID: \"5b1c9148-d752-4311-bde5-0ab7e96a71fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gmnj" Mar 14 08:30:58 crc kubenswrapper[4886]: I0314 08:30:58.923337 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b1c9148-d752-4311-bde5-0ab7e96a71fa-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4gmnj\" (UID: \"5b1c9148-d752-4311-bde5-0ab7e96a71fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gmnj" Mar 14 08:30:58 crc kubenswrapper[4886]: I0314 08:30:58.947141 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b1c9148-d752-4311-bde5-0ab7e96a71fa-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4gmnj\" (UID: \"5b1c9148-d752-4311-bde5-0ab7e96a71fa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gmnj" Mar 14 08:30:59 crc kubenswrapper[4886]: I0314 08:30:59.004914 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gmnj" Mar 14 08:30:59 crc kubenswrapper[4886]: I0314 08:30:59.385034 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gmnj" event={"ID":"5b1c9148-d752-4311-bde5-0ab7e96a71fa","Type":"ContainerStarted","Data":"606ae7efa3c4c8411f8ed3dea75323a450e33367f69801c595512e7994676966"} Mar 14 08:30:59 crc kubenswrapper[4886]: I0314 08:30:59.385079 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gmnj" event={"ID":"5b1c9148-d752-4311-bde5-0ab7e96a71fa","Type":"ContainerStarted","Data":"26e2e74ec4f8149f5d13598f52bd6a58e3d0483c00f31713f5dc5abe625bdfa1"} Mar 14 08:30:59 crc kubenswrapper[4886]: I0314 08:30:59.388191 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5jrmb_7ed47238-6d20-4920-9162-695e6ddcb090/kube-multus/1.log" Mar 14 08:30:59 crc kubenswrapper[4886]: I0314 08:30:59.388785 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5jrmb_7ed47238-6d20-4920-9162-695e6ddcb090/kube-multus/0.log" Mar 14 08:30:59 crc kubenswrapper[4886]: I0314 08:30:59.388849 4886 generic.go:334] "Generic (PLEG): container finished" podID="7ed47238-6d20-4920-9162-695e6ddcb090" containerID="ec2e46fcb866a7dbf349bb2a83ad1fb6b6e85059612be44d8db4b0134c6f0143" exitCode=1 Mar 14 08:30:59 crc kubenswrapper[4886]: I0314 08:30:59.388894 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5jrmb" event={"ID":"7ed47238-6d20-4920-9162-695e6ddcb090","Type":"ContainerDied","Data":"ec2e46fcb866a7dbf349bb2a83ad1fb6b6e85059612be44d8db4b0134c6f0143"} Mar 14 08:30:59 crc kubenswrapper[4886]: I0314 08:30:59.388940 4886 scope.go:117] "RemoveContainer" containerID="d9e5c0855a70fdf2973053bd7cc9fc5d7bcda6c914f52a13ac9032d0bb7cc306" Mar 14 08:30:59 crc kubenswrapper[4886]: I0314 
08:30:59.390138 4886 scope.go:117] "RemoveContainer" containerID="ec2e46fcb866a7dbf349bb2a83ad1fb6b6e85059612be44d8db4b0134c6f0143" Mar 14 08:30:59 crc kubenswrapper[4886]: E0314 08:30:59.390657 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-5jrmb_openshift-multus(7ed47238-6d20-4920-9162-695e6ddcb090)\"" pod="openshift-multus/multus-5jrmb" podUID="7ed47238-6d20-4920-9162-695e6ddcb090" Mar 14 08:30:59 crc kubenswrapper[4886]: I0314 08:30:59.400622 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gmnj" podStartSLOduration=137.400594438 podStartE2EDuration="2m17.400594438s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:30:59.398458713 +0000 UTC m=+194.646910380" watchObservedRunningTime="2026-03-14 08:30:59.400594438 +0000 UTC m=+194.649046085" Mar 14 08:30:59 crc kubenswrapper[4886]: I0314 08:30:59.419883 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:30:59 crc kubenswrapper[4886]: I0314 08:30:59.419898 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:30:59 crc kubenswrapper[4886]: E0314 08:30:59.420013 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:30:59 crc kubenswrapper[4886]: I0314 08:30:59.420043 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:30:59 crc kubenswrapper[4886]: E0314 08:30:59.420158 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:30:59 crc kubenswrapper[4886]: E0314 08:30:59.420516 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:30:59 crc kubenswrapper[4886]: I0314 08:30:59.496611 4886 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 14 08:30:59 crc kubenswrapper[4886]: I0314 08:30:59.504086 4886 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 14 08:31:00 crc kubenswrapper[4886]: I0314 08:31:00.393256 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5jrmb_7ed47238-6d20-4920-9162-695e6ddcb090/kube-multus/1.log" Mar 14 08:31:00 crc kubenswrapper[4886]: I0314 08:31:00.420623 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:31:00 crc kubenswrapper[4886]: E0314 08:31:00.420836 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:31:00 crc kubenswrapper[4886]: E0314 08:31:00.519949 4886 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 08:31:01 crc kubenswrapper[4886]: I0314 08:31:01.420418 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:31:01 crc kubenswrapper[4886]: I0314 08:31:01.420475 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:31:01 crc kubenswrapper[4886]: I0314 08:31:01.420418 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:31:01 crc kubenswrapper[4886]: E0314 08:31:01.420538 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:31:01 crc kubenswrapper[4886]: E0314 08:31:01.420651 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:31:01 crc kubenswrapper[4886]: E0314 08:31:01.420754 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:31:02 crc kubenswrapper[4886]: I0314 08:31:02.420681 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:31:02 crc kubenswrapper[4886]: E0314 08:31:02.421456 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:31:03 crc kubenswrapper[4886]: I0314 08:31:03.420501 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:31:03 crc kubenswrapper[4886]: I0314 08:31:03.420547 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:31:03 crc kubenswrapper[4886]: I0314 08:31:03.420574 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:31:03 crc kubenswrapper[4886]: E0314 08:31:03.420699 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:31:03 crc kubenswrapper[4886]: E0314 08:31:03.420890 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:31:03 crc kubenswrapper[4886]: E0314 08:31:03.421015 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:31:04 crc kubenswrapper[4886]: I0314 08:31:04.420232 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:31:04 crc kubenswrapper[4886]: E0314 08:31:04.420431 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:31:05 crc kubenswrapper[4886]: I0314 08:31:05.420084 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:31:05 crc kubenswrapper[4886]: I0314 08:31:05.420160 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:31:05 crc kubenswrapper[4886]: I0314 08:31:05.420284 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:31:05 crc kubenswrapper[4886]: E0314 08:31:05.422614 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:31:05 crc kubenswrapper[4886]: E0314 08:31:05.422757 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:31:05 crc kubenswrapper[4886]: E0314 08:31:05.422904 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:31:05 crc kubenswrapper[4886]: E0314 08:31:05.520867 4886 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 08:31:06 crc kubenswrapper[4886]: I0314 08:31:06.420637 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:31:06 crc kubenswrapper[4886]: E0314 08:31:06.420822 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:31:07 crc kubenswrapper[4886]: I0314 08:31:07.419898 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:31:07 crc kubenswrapper[4886]: I0314 08:31:07.420064 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:31:07 crc kubenswrapper[4886]: E0314 08:31:07.420203 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:31:07 crc kubenswrapper[4886]: E0314 08:31:07.420355 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:31:07 crc kubenswrapper[4886]: I0314 08:31:07.420701 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:31:07 crc kubenswrapper[4886]: E0314 08:31:07.420881 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:31:08 crc kubenswrapper[4886]: I0314 08:31:08.419785 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:31:08 crc kubenswrapper[4886]: E0314 08:31:08.420050 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:31:09 crc kubenswrapper[4886]: I0314 08:31:09.420943 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:31:09 crc kubenswrapper[4886]: I0314 08:31:09.421055 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:31:09 crc kubenswrapper[4886]: E0314 08:31:09.421248 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:31:09 crc kubenswrapper[4886]: I0314 08:31:09.421433 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:31:09 crc kubenswrapper[4886]: E0314 08:31:09.421485 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:31:09 crc kubenswrapper[4886]: E0314 08:31:09.421524 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:31:10 crc kubenswrapper[4886]: I0314 08:31:10.419753 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:31:10 crc kubenswrapper[4886]: E0314 08:31:10.419972 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:31:10 crc kubenswrapper[4886]: I0314 08:31:10.420093 4886 scope.go:117] "RemoveContainer" containerID="ec2e46fcb866a7dbf349bb2a83ad1fb6b6e85059612be44d8db4b0134c6f0143" Mar 14 08:31:10 crc kubenswrapper[4886]: I0314 08:31:10.423505 4886 scope.go:117] "RemoveContainer" containerID="2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c" Mar 14 08:31:10 crc kubenswrapper[4886]: E0314 08:31:10.522061 4886 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 14 08:31:11 crc kubenswrapper[4886]: I0314 08:31:11.285909 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hq6j4"] Mar 14 08:31:11 crc kubenswrapper[4886]: I0314 08:31:11.286018 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:31:11 crc kubenswrapper[4886]: E0314 08:31:11.286155 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:31:11 crc kubenswrapper[4886]: I0314 08:31:11.420338 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:31:11 crc kubenswrapper[4886]: I0314 08:31:11.420395 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:31:11 crc kubenswrapper[4886]: I0314 08:31:11.420355 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:31:11 crc kubenswrapper[4886]: E0314 08:31:11.420529 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:31:11 crc kubenswrapper[4886]: E0314 08:31:11.420602 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:31:11 crc kubenswrapper[4886]: E0314 08:31:11.420668 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:31:11 crc kubenswrapper[4886]: I0314 08:31:11.438104 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5jrmb_7ed47238-6d20-4920-9162-695e6ddcb090/kube-multus/1.log" Mar 14 08:31:11 crc kubenswrapper[4886]: I0314 08:31:11.438328 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5jrmb" event={"ID":"7ed47238-6d20-4920-9162-695e6ddcb090","Type":"ContainerStarted","Data":"06df76c665731f9128bc5d02002f446380ee3b3057ff32d99a3164b686de1ae1"} Mar 14 08:31:11 crc kubenswrapper[4886]: I0314 08:31:11.441394 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ms4h7_f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea/ovnkube-controller/3.log" Mar 14 08:31:11 crc kubenswrapper[4886]: I0314 08:31:11.446595 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" event={"ID":"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea","Type":"ContainerStarted","Data":"73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c"} Mar 14 08:31:11 crc kubenswrapper[4886]: I0314 08:31:11.447286 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:31:11 crc kubenswrapper[4886]: I0314 08:31:11.483984 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" podStartSLOduration=149.483968695 podStartE2EDuration="2m29.483968695s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:11.483182623 +0000 UTC m=+206.731634270" watchObservedRunningTime="2026-03-14 08:31:11.483968695 +0000 UTC m=+206.732420332" Mar 14 08:31:13 crc kubenswrapper[4886]: 
I0314 08:31:13.421229 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:31:13 crc kubenswrapper[4886]: E0314 08:31:13.421634 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:31:13 crc kubenswrapper[4886]: I0314 08:31:13.421698 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:31:13 crc kubenswrapper[4886]: I0314 08:31:13.421808 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:31:13 crc kubenswrapper[4886]: I0314 08:31:13.421926 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:31:13 crc kubenswrapper[4886]: E0314 08:31:13.421945 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:31:13 crc kubenswrapper[4886]: E0314 08:31:13.422115 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:31:13 crc kubenswrapper[4886]: E0314 08:31:13.422323 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:31:15 crc kubenswrapper[4886]: I0314 08:31:15.420342 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:31:15 crc kubenswrapper[4886]: I0314 08:31:15.420375 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:31:15 crc kubenswrapper[4886]: I0314 08:31:15.420342 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:31:15 crc kubenswrapper[4886]: E0314 08:31:15.422445 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 14 08:31:15 crc kubenswrapper[4886]: I0314 08:31:15.422537 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:31:15 crc kubenswrapper[4886]: E0314 08:31:15.422583 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 14 08:31:15 crc kubenswrapper[4886]: E0314 08:31:15.422739 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hq6j4" podUID="842ea68a-b5ee-4b60-8e98-26e2ff72ae3b" Mar 14 08:31:15 crc kubenswrapper[4886]: E0314 08:31:15.422883 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 14 08:31:17 crc kubenswrapper[4886]: I0314 08:31:17.420771 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:31:17 crc kubenswrapper[4886]: I0314 08:31:17.420897 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:31:17 crc kubenswrapper[4886]: I0314 08:31:17.420919 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:31:17 crc kubenswrapper[4886]: I0314 08:31:17.420834 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:31:17 crc kubenswrapper[4886]: I0314 08:31:17.425497 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 14 08:31:17 crc kubenswrapper[4886]: I0314 08:31:17.427520 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 14 08:31:17 crc kubenswrapper[4886]: I0314 08:31:17.428010 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 14 08:31:17 crc kubenswrapper[4886]: I0314 08:31:17.428249 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 14 08:31:17 crc kubenswrapper[4886]: I0314 08:31:17.430911 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 14 08:31:17 crc kubenswrapper[4886]: I0314 08:31:17.436226 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.849872 4886 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.906357 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-dq2b2"] Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.907240 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gdqjc"] Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.907988 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.908482 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-dq2b2" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.911377 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-r4qzm"] Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.912281 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-r4qzm" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.913407 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg"] Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.914010 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.917687 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jm5tw"] Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.918708 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jm5tw" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.921547 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xv44"] Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.923324 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xv44" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.930863 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.931663 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.932211 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.938916 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-2dzmw"] Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.939629 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dzmw" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.940923 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.941180 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.941640 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-config\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.941702 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/8107785d-acc9-4fdf-8f93-21f2b4a62c61-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-dq2b2\" (UID: \"8107785d-acc9-4fdf-8f93-21f2b4a62c61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dq2b2" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.941750 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c1744255-6bdc-4c0d-9f1e-70119127e182-audit-policies\") pod \"apiserver-7bbb656c7d-j5zgg\" (UID: \"c1744255-6bdc-4c0d-9f1e-70119127e182\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.941780 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.941812 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-etcd-serving-ca\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.941848 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8107785d-acc9-4fdf-8f93-21f2b4a62c61-config\") pod \"machine-api-operator-5694c8668f-dq2b2\" (UID: \"8107785d-acc9-4fdf-8f93-21f2b4a62c61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dq2b2" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.941880 4886 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-serving-cert\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.941913 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14c55450-dcee-4aee-8153-9ea2ff49b659-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-r4qzm\" (UID: \"14c55450-dcee-4aee-8153-9ea2ff49b659\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r4qzm" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.941965 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c1744255-6bdc-4c0d-9f1e-70119127e182-audit-dir\") pod \"apiserver-7bbb656c7d-j5zgg\" (UID: \"c1744255-6bdc-4c0d-9f1e-70119127e182\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.941995 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14c55450-dcee-4aee-8153-9ea2ff49b659-serving-cert\") pod \"controller-manager-879f6c89f-r4qzm\" (UID: \"14c55450-dcee-4aee-8153-9ea2ff49b659\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r4qzm" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.942029 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1744255-6bdc-4c0d-9f1e-70119127e182-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-j5zgg\" (UID: \"c1744255-6bdc-4c0d-9f1e-70119127e182\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.942057 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c1744255-6bdc-4c0d-9f1e-70119127e182-encryption-config\") pod \"apiserver-7bbb656c7d-j5zgg\" (UID: \"c1744255-6bdc-4c0d-9f1e-70119127e182\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.942084 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14c55450-dcee-4aee-8153-9ea2ff49b659-client-ca\") pod \"controller-manager-879f6c89f-r4qzm\" (UID: \"14c55450-dcee-4aee-8153-9ea2ff49b659\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r4qzm" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.942154 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1744255-6bdc-4c0d-9f1e-70119127e182-serving-cert\") pod \"apiserver-7bbb656c7d-j5zgg\" (UID: \"c1744255-6bdc-4c0d-9f1e-70119127e182\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.942190 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c1744255-6bdc-4c0d-9f1e-70119127e182-etcd-client\") pod \"apiserver-7bbb656c7d-j5zgg\" (UID: \"c1744255-6bdc-4c0d-9f1e-70119127e182\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.942220 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzg52\" (UniqueName: 
\"kubernetes.io/projected/14c55450-dcee-4aee-8153-9ea2ff49b659-kube-api-access-pzg52\") pod \"controller-manager-879f6c89f-r4qzm\" (UID: \"14c55450-dcee-4aee-8153-9ea2ff49b659\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r4qzm" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.942254 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.942462 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.942253 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klqx4\" (UniqueName: \"kubernetes.io/projected/8107785d-acc9-4fdf-8f93-21f2b4a62c61-kube-api-access-klqx4\") pod \"machine-api-operator-5694c8668f-dq2b2\" (UID: \"8107785d-acc9-4fdf-8f93-21f2b4a62c61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dq2b2" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.942664 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c1744255-6bdc-4c0d-9f1e-70119127e182-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-j5zgg\" (UID: \"c1744255-6bdc-4c0d-9f1e-70119127e182\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.942716 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-image-import-ca\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.942753 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-audit-dir\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.942766 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.942793 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7b6w\" (UniqueName: \"kubernetes.io/projected/c1744255-6bdc-4c0d-9f1e-70119127e182-kube-api-access-h7b6w\") pod \"apiserver-7bbb656c7d-j5zgg\" (UID: \"c1744255-6bdc-4c0d-9f1e-70119127e182\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.942845 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfbxq\" (UniqueName: \"kubernetes.io/projected/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-kube-api-access-qfbxq\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.942890 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8107785d-acc9-4fdf-8f93-21f2b4a62c61-images\") pod \"machine-api-operator-5694c8668f-dq2b2\" (UID: \"8107785d-acc9-4fdf-8f93-21f2b4a62c61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dq2b2" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.942930 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-audit\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.942963 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-etcd-client\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.943004 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.943055 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.943003 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-node-pullsecrets\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.943172 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-encryption-config\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.943211 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/14c55450-dcee-4aee-8153-9ea2ff49b659-config\") pod \"controller-manager-879f6c89f-r4qzm\" (UID: \"14c55450-dcee-4aee-8153-9ea2ff49b659\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r4qzm" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.942719 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-t996r"] Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.943473 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.943616 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.943883 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.943924 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.943955 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.944064 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.944075 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.944184 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.950651 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx"] Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.951177 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.952245 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gm27b"] Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.965137 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gm27b" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.969395 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-lf5vf"] Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.970267 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-lf5vf" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.970274 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.971082 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.971889 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.973595 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.974079 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.976471 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.977687 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.979165 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.979390 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.979675 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 14 08:31:18 
crc kubenswrapper[4886]: I0314 08:31:18.980454 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.980760 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.979184 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.981325 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.981456 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.986434 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.986744 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.987329 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.987447 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.987463 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.987639 4886 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.987651 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.987718 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.987792 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.987843 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.992648 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.992809 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.992833 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.993192 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.993347 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.995685 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.995983 4886 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.996279 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.996459 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 14 08:31:18 crc kubenswrapper[4886]: I0314 08:31:18.996672 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.001022 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.001452 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.001595 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.001685 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.001869 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.002056 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.002215 4886 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.002828 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.002994 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.003234 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.003997 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.004088 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.004096 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.004143 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.004190 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.005175 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.009031 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4ph2z"] Mar 
14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.009432 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.009641 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.010210 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4ph2z" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.010409 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.010588 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.010629 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.010675 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.010702 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2sj7h"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.010590 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.010775 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 
08:31:19.010895 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.011088 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2sj7h" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.010959 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.013389 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-wmcc2"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.013848 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-wmcc2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.014595 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.014750 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-j46q2"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.015614 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-j46q2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.015706 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-x4qmh"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.016238 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x4qmh" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.017897 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zhsfs"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.018710 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zhsfs" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.019451 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.023260 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jldpx"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.023767 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jldpx" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.024273 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9brkw"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.024846 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9brkw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.036314 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.037741 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.037922 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.038090 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.038295 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.038390 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.038012 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.038633 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.038818 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.038922 4886 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.039185 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.041149 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.042835 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.043020 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.043190 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.043337 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.043507 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.044362 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.044382 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.044703 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.046975 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.047025 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a312fb44-823b-44ec-8312-0d83b990e9cd-console-oauth-config\") pod \"console-f9d7485db-wmcc2\" (UID: \"a312fb44-823b-44ec-8312-0d83b990e9cd\") " pod="openshift-console/console-f9d7485db-wmcc2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.047174 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-config\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.047277 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8107785d-acc9-4fdf-8f93-21f2b4a62c61-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-dq2b2\" (UID: \"8107785d-acc9-4fdf-8f93-21f2b4a62c61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dq2b2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.047313 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9270f20e-8365-43e2-9d4b-b067780c0804-config\") pod \"etcd-operator-b45778765-2sj7h\" (UID: \"9270f20e-8365-43e2-9d4b-b067780c0804\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-2sj7h" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.047344 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/488db7c4-ca9c-4229-9ac5-dc19ce5443e4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4ph2z\" (UID: \"488db7c4-ca9c-4229-9ac5-dc19ce5443e4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4ph2z" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.047443 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-8lqwb"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.047374 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a8def3c-80e8-4f81-8518-202af1613e6f-serving-cert\") pod \"route-controller-manager-6576b87f9c-dknmx\" (UID: \"5a8def3c-80e8-4f81-8518-202af1613e6f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.047741 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c1744255-6bdc-4c0d-9f1e-70119127e182-audit-policies\") pod \"apiserver-7bbb656c7d-j5zgg\" (UID: \"c1744255-6bdc-4c0d-9f1e-70119127e182\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.047768 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:19 crc kubenswrapper[4886]: 
I0314 08:31:19.047795 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a8def3c-80e8-4f81-8518-202af1613e6f-config\") pod \"route-controller-manager-6576b87f9c-dknmx\" (UID: \"5a8def3c-80e8-4f81-8518-202af1613e6f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.047823 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37d01daf-edd5-4dd2-8a4a-40165f8d0275-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jm5tw\" (UID: \"37d01daf-edd5-4dd2-8a4a-40165f8d0275\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm5tw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.049093 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.051612 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/488db7c4-ca9c-4229-9ac5-dc19ce5443e4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4ph2z\" (UID: \"488db7c4-ca9c-4229-9ac5-dc19ce5443e4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4ph2z" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.051863 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-config\") pod 
\"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.052080 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-etcd-serving-ca\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.052108 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c1744255-6bdc-4c0d-9f1e-70119127e182-audit-policies\") pod \"apiserver-7bbb656c7d-j5zgg\" (UID: \"c1744255-6bdc-4c0d-9f1e-70119127e182\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.052155 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8352432-8b6e-4a89-b830-379796727237-serving-cert\") pod \"openshift-config-operator-7777fb866f-x4qmh\" (UID: \"a8352432-8b6e-4a89-b830-379796727237\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x4qmh" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.052512 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.052637 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2778f200-cefa-4b41-9bc5-f600415f2387-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gm27b\" (UID: \"2778f200-cefa-4b41-9bc5-f600415f2387\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gm27b" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.052759 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54b85e17-fe41-4ae4-8e77-a3654421f751-config\") pod \"kube-controller-manager-operator-78b949d7b-zhsfs\" (UID: \"54b85e17-fe41-4ae4-8e77-a3654421f751\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zhsfs" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.052820 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8107785d-acc9-4fdf-8f93-21f2b4a62c61-config\") pod \"machine-api-operator-5694c8668f-dq2b2\" (UID: \"8107785d-acc9-4fdf-8f93-21f2b4a62c61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dq2b2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.052856 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kk4w\" (UniqueName: \"kubernetes.io/projected/37d01daf-edd5-4dd2-8a4a-40165f8d0275-kube-api-access-5kk4w\") pod \"authentication-operator-69f744f599-jm5tw\" (UID: \"37d01daf-edd5-4dd2-8a4a-40165f8d0275\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm5tw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.052931 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14c55450-dcee-4aee-8153-9ea2ff49b659-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-r4qzm\" (UID: 
\"14c55450-dcee-4aee-8153-9ea2ff49b659\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r4qzm" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.052975 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a312fb44-823b-44ec-8312-0d83b990e9cd-console-serving-cert\") pod \"console-f9d7485db-wmcc2\" (UID: \"a312fb44-823b-44ec-8312-0d83b990e9cd\") " pod="openshift-console/console-f9d7485db-wmcc2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.053001 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0228f0fd-9323-456c-9291-6150db291cf4-trusted-ca\") pod \"console-operator-58897d9998-lf5vf\" (UID: \"0228f0fd-9323-456c-9291-6150db291cf4\") " pod="openshift-console-operator/console-operator-58897d9998-lf5vf" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.054176 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-etcd-serving-ca\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.053440 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-serving-cert\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.054257 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e934f0a9-f87b-4f86-8aad-47fa6927a3a6-config\") pod 
\"machine-approver-56656f9798-2dzmw\" (UID: \"e934f0a9-f87b-4f86-8aad-47fa6927a3a6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dzmw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.055043 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8107785d-acc9-4fdf-8f93-21f2b4a62c61-config\") pod \"machine-api-operator-5694c8668f-dq2b2\" (UID: \"8107785d-acc9-4fdf-8f93-21f2b4a62c61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dq2b2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.055177 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8b226bf0-ae7d-435b-9470-70dfb371f38e-audit-dir\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.055835 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0228f0fd-9323-456c-9291-6150db291cf4-config\") pod \"console-operator-58897d9998-lf5vf\" (UID: \"0228f0fd-9323-456c-9291-6150db291cf4\") " pod="openshift-console-operator/console-operator-58897d9998-lf5vf" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.055906 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54b85e17-fe41-4ae4-8e77-a3654421f751-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zhsfs\" (UID: \"54b85e17-fe41-4ae4-8e77-a3654421f751\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zhsfs" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.056014 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c1744255-6bdc-4c0d-9f1e-70119127e182-audit-dir\") pod \"apiserver-7bbb656c7d-j5zgg\" (UID: \"c1744255-6bdc-4c0d-9f1e-70119127e182\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.056095 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14c55450-dcee-4aee-8153-9ea2ff49b659-serving-cert\") pod \"controller-manager-879f6c89f-r4qzm\" (UID: \"14c55450-dcee-4aee-8153-9ea2ff49b659\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r4qzm" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.056183 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km96w\" (UniqueName: \"kubernetes.io/projected/8b226bf0-ae7d-435b-9470-70dfb371f38e-kube-api-access-km96w\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.056213 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c1744255-6bdc-4c0d-9f1e-70119127e182-audit-dir\") pod \"apiserver-7bbb656c7d-j5zgg\" (UID: \"c1744255-6bdc-4c0d-9f1e-70119127e182\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.056869 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.071398 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w462w\" (UniqueName: \"kubernetes.io/projected/2778f200-cefa-4b41-9bc5-f600415f2387-kube-api-access-w462w\") 
pod \"cluster-samples-operator-665b6dd947-gm27b\" (UID: \"2778f200-cefa-4b41-9bc5-f600415f2387\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gm27b" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.071484 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1744255-6bdc-4c0d-9f1e-70119127e182-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-j5zgg\" (UID: \"c1744255-6bdc-4c0d-9f1e-70119127e182\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.071506 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1744255-6bdc-4c0d-9f1e-70119127e182-serving-cert\") pod \"apiserver-7bbb656c7d-j5zgg\" (UID: \"c1744255-6bdc-4c0d-9f1e-70119127e182\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.071522 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c1744255-6bdc-4c0d-9f1e-70119127e182-encryption-config\") pod \"apiserver-7bbb656c7d-j5zgg\" (UID: \"c1744255-6bdc-4c0d-9f1e-70119127e182\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.071539 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14c55450-dcee-4aee-8153-9ea2ff49b659-client-ca\") pod \"controller-manager-879f6c89f-r4qzm\" (UID: \"14c55450-dcee-4aee-8153-9ea2ff49b659\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r4qzm" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.071575 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/a312fb44-823b-44ec-8312-0d83b990e9cd-console-config\") pod \"console-f9d7485db-wmcc2\" (UID: \"a312fb44-823b-44ec-8312-0d83b990e9cd\") " pod="openshift-console/console-f9d7485db-wmcc2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.071591 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c1744255-6bdc-4c0d-9f1e-70119127e182-etcd-client\") pod \"apiserver-7bbb656c7d-j5zgg\" (UID: \"c1744255-6bdc-4c0d-9f1e-70119127e182\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.071607 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzg52\" (UniqueName: \"kubernetes.io/projected/14c55450-dcee-4aee-8153-9ea2ff49b659-kube-api-access-pzg52\") pod \"controller-manager-879f6c89f-r4qzm\" (UID: \"14c55450-dcee-4aee-8153-9ea2ff49b659\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r4qzm" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.071625 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a312fb44-823b-44ec-8312-0d83b990e9cd-oauth-serving-cert\") pod \"console-f9d7485db-wmcc2\" (UID: \"a312fb44-823b-44ec-8312-0d83b990e9cd\") " pod="openshift-console/console-f9d7485db-wmcc2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.071643 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9270f20e-8365-43e2-9d4b-b067780c0804-etcd-service-ca\") pod \"etcd-operator-b45778765-2sj7h\" (UID: \"9270f20e-8365-43e2-9d4b-b067780c0804\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2sj7h" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.071664 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8b226bf0-ae7d-435b-9470-70dfb371f38e-audit-policies\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.071681 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.071699 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9270f20e-8365-43e2-9d4b-b067780c0804-serving-cert\") pod \"etcd-operator-b45778765-2sj7h\" (UID: \"9270f20e-8365-43e2-9d4b-b067780c0804\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2sj7h" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.071719 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgg5n\" (UniqueName: \"kubernetes.io/projected/e934f0a9-f87b-4f86-8aad-47fa6927a3a6-kube-api-access-fgg5n\") pod \"machine-approver-56656f9798-2dzmw\" (UID: \"e934f0a9-f87b-4f86-8aad-47fa6927a3a6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dzmw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.071736 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kg7k\" (UniqueName: \"kubernetes.io/projected/5a8def3c-80e8-4f81-8518-202af1613e6f-kube-api-access-4kg7k\") pod 
\"route-controller-manager-6576b87f9c-dknmx\" (UID: \"5a8def3c-80e8-4f81-8518-202af1613e6f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.071761 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klqx4\" (UniqueName: \"kubernetes.io/projected/8107785d-acc9-4fdf-8f93-21f2b4a62c61-kube-api-access-klqx4\") pod \"machine-api-operator-5694c8668f-dq2b2\" (UID: \"8107785d-acc9-4fdf-8f93-21f2b4a62c61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dq2b2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.071780 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.071797 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.071818 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c1744255-6bdc-4c0d-9f1e-70119127e182-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-j5zgg\" (UID: \"c1744255-6bdc-4c0d-9f1e-70119127e182\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.071839 
4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-image-import-ca\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.071859 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.071875 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54b85e17-fe41-4ae4-8e77-a3654421f751-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zhsfs\" (UID: \"54b85e17-fe41-4ae4-8e77-a3654421f751\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zhsfs" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.071895 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwn94\" (UniqueName: \"kubernetes.io/projected/488db7c4-ca9c-4229-9ac5-dc19ce5443e4-kube-api-access-bwn94\") pod \"openshift-controller-manager-operator-756b6f6bc6-4ph2z\" (UID: \"488db7c4-ca9c-4229-9ac5-dc19ce5443e4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4ph2z" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.071914 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-audit-dir\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.071964 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072105 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a8def3c-80e8-4f81-8518-202af1613e6f-client-ca\") pod \"route-controller-manager-6576b87f9c-dknmx\" (UID: \"5a8def3c-80e8-4f81-8518-202af1613e6f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072165 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0228f0fd-9323-456c-9291-6150db291cf4-serving-cert\") pod \"console-operator-58897d9998-lf5vf\" (UID: \"0228f0fd-9323-456c-9291-6150db291cf4\") " pod="openshift-console-operator/console-operator-58897d9998-lf5vf" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072189 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl4k6\" (UniqueName: \"kubernetes.io/projected/b5fc2483-aab1-4487-a0ad-b6c3183826a5-kube-api-access-kl4k6\") pod \"openshift-apiserver-operator-796bbdcf4f-7xv44\" (UID: \"b5fc2483-aab1-4487-a0ad-b6c3183826a5\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xv44" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072213 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwzm9\" (UniqueName: \"kubernetes.io/projected/0228f0fd-9323-456c-9291-6150db291cf4-kube-api-access-fwzm9\") pod \"console-operator-58897d9998-lf5vf\" (UID: \"0228f0fd-9323-456c-9291-6150db291cf4\") " pod="openshift-console-operator/console-operator-58897d9998-lf5vf" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072242 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7b6w\" (UniqueName: \"kubernetes.io/projected/c1744255-6bdc-4c0d-9f1e-70119127e182-kube-api-access-h7b6w\") pod \"apiserver-7bbb656c7d-j5zgg\" (UID: \"c1744255-6bdc-4c0d-9f1e-70119127e182\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072266 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37d01daf-edd5-4dd2-8a4a-40165f8d0275-service-ca-bundle\") pod \"authentication-operator-69f744f599-jm5tw\" (UID: \"37d01daf-edd5-4dd2-8a4a-40165f8d0275\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm5tw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072288 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e934f0a9-f87b-4f86-8aad-47fa6927a3a6-machine-approver-tls\") pod \"machine-approver-56656f9798-2dzmw\" (UID: \"e934f0a9-f87b-4f86-8aad-47fa6927a3a6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dzmw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072303 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5fc2483-aab1-4487-a0ad-b6c3183826a5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-7xv44\" (UID: \"b5fc2483-aab1-4487-a0ad-b6c3183826a5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xv44" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072319 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a8352432-8b6e-4a89-b830-379796727237-available-featuregates\") pod \"openshift-config-operator-7777fb866f-x4qmh\" (UID: \"a8352432-8b6e-4a89-b830-379796727237\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x4qmh" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072338 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8mk2\" (UniqueName: \"kubernetes.io/projected/5d14f041-e1b8-4b93-a893-946dbecf44aa-kube-api-access-w8mk2\") pod \"dns-operator-744455d44c-j46q2\" (UID: \"5d14f041-e1b8-4b93-a893-946dbecf44aa\") " pod="openshift-dns-operator/dns-operator-744455d44c-j46q2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072361 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfbxq\" (UniqueName: \"kubernetes.io/projected/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-kube-api-access-qfbxq\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072377 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-cliconfig\") pod 
\"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072400 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8107785d-acc9-4fdf-8f93-21f2b4a62c61-images\") pod \"machine-api-operator-5694c8668f-dq2b2\" (UID: \"8107785d-acc9-4fdf-8f93-21f2b4a62c61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dq2b2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072416 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5fc2483-aab1-4487-a0ad-b6c3183826a5-config\") pod \"openshift-apiserver-operator-796bbdcf4f-7xv44\" (UID: \"b5fc2483-aab1-4487-a0ad-b6c3183826a5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xv44" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072435 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4jmr\" (UniqueName: \"kubernetes.io/projected/a312fb44-823b-44ec-8312-0d83b990e9cd-kube-api-access-t4jmr\") pod \"console-f9d7485db-wmcc2\" (UID: \"a312fb44-823b-44ec-8312-0d83b990e9cd\") " pod="openshift-console/console-f9d7485db-wmcc2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072450 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9270f20e-8365-43e2-9d4b-b067780c0804-etcd-client\") pod \"etcd-operator-b45778765-2sj7h\" (UID: \"9270f20e-8365-43e2-9d4b-b067780c0804\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2sj7h" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072469 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072487 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-audit\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072529 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37d01daf-edd5-4dd2-8a4a-40165f8d0275-serving-cert\") pod \"authentication-operator-69f744f599-jm5tw\" (UID: \"37d01daf-edd5-4dd2-8a4a-40165f8d0275\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm5tw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072545 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e934f0a9-f87b-4f86-8aad-47fa6927a3a6-auth-proxy-config\") pod \"machine-approver-56656f9798-2dzmw\" (UID: \"e934f0a9-f87b-4f86-8aad-47fa6927a3a6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dzmw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072563 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btr5r\" (UniqueName: \"kubernetes.io/projected/9270f20e-8365-43e2-9d4b-b067780c0804-kube-api-access-btr5r\") pod \"etcd-operator-b45778765-2sj7h\" (UID: \"9270f20e-8365-43e2-9d4b-b067780c0804\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-2sj7h" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072583 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bh6s\" (UniqueName: \"kubernetes.io/projected/a8352432-8b6e-4a89-b830-379796727237-kube-api-access-8bh6s\") pod \"openshift-config-operator-7777fb866f-x4qmh\" (UID: \"a8352432-8b6e-4a89-b830-379796727237\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x4qmh" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072620 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-etcd-client\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072636 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a312fb44-823b-44ec-8312-0d83b990e9cd-trusted-ca-bundle\") pod \"console-f9d7485db-wmcc2\" (UID: \"a312fb44-823b-44ec-8312-0d83b990e9cd\") " pod="openshift-console/console-f9d7485db-wmcc2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072652 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072669 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/5d14f041-e1b8-4b93-a893-946dbecf44aa-metrics-tls\") pod \"dns-operator-744455d44c-j46q2\" (UID: \"5d14f041-e1b8-4b93-a893-946dbecf44aa\") " pod="openshift-dns-operator/dns-operator-744455d44c-j46q2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072686 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37d01daf-edd5-4dd2-8a4a-40165f8d0275-config\") pod \"authentication-operator-69f744f599-jm5tw\" (UID: \"37d01daf-edd5-4dd2-8a4a-40165f8d0275\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm5tw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072703 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072721 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-node-pullsecrets\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072736 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14c55450-dcee-4aee-8153-9ea2ff49b659-client-ca\") pod \"controller-manager-879f6c89f-r4qzm\" (UID: \"14c55450-dcee-4aee-8153-9ea2ff49b659\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r4qzm" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072743 
4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14c55450-dcee-4aee-8153-9ea2ff49b659-config\") pod \"controller-manager-879f6c89f-r4qzm\" (UID: \"14c55450-dcee-4aee-8153-9ea2ff49b659\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r4qzm" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072764 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a312fb44-823b-44ec-8312-0d83b990e9cd-service-ca\") pod \"console-f9d7485db-wmcc2\" (UID: \"a312fb44-823b-44ec-8312-0d83b990e9cd\") " pod="openshift-console/console-f9d7485db-wmcc2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072791 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-encryption-config\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072812 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9270f20e-8365-43e2-9d4b-b067780c0804-etcd-ca\") pod \"etcd-operator-b45778765-2sj7h\" (UID: \"9270f20e-8365-43e2-9d4b-b067780c0804\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2sj7h" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072867 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.072959 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-audit-dir\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.073626 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c1744255-6bdc-4c0d-9f1e-70119127e182-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-j5zgg\" (UID: \"c1744255-6bdc-4c0d-9f1e-70119127e182\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.073903 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8107785d-acc9-4fdf-8f93-21f2b4a62c61-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-dq2b2\" (UID: \"8107785d-acc9-4fdf-8f93-21f2b4a62c61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dq2b2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.074210 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-serving-cert\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.074409 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-image-import-ca\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.074476 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-hznhw"] Mar 14 08:31:19 crc 
kubenswrapper[4886]: I0314 08:31:19.074897 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rjdjq"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.075856 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14c55450-dcee-4aee-8153-9ea2ff49b659-serving-cert\") pod \"controller-manager-879f6c89f-r4qzm\" (UID: \"14c55450-dcee-4aee-8153-9ea2ff49b659\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r4qzm" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.075863 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8107785d-acc9-4fdf-8f93-21f2b4a62c61-images\") pod \"machine-api-operator-5694c8668f-dq2b2\" (UID: \"8107785d-acc9-4fdf-8f93-21f2b4a62c61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dq2b2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.075964 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-8lqwb" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.076220 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1744255-6bdc-4c0d-9f1e-70119127e182-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-j5zgg\" (UID: \"c1744255-6bdc-4c0d-9f1e-70119127e182\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.076420 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9kl2c"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.076445 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-audit\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.076608 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-hznhw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.077153 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9kl2c" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.077302 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14c55450-dcee-4aee-8153-9ea2ff49b659-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-r4qzm\" (UID: \"14c55450-dcee-4aee-8153-9ea2ff49b659\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r4qzm" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.077409 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rjdjq" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.077683 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-node-pullsecrets\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.078147 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.078266 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14c55450-dcee-4aee-8153-9ea2ff49b659-config\") pod \"controller-manager-879f6c89f-r4qzm\" (UID: \"14c55450-dcee-4aee-8153-9ea2ff49b659\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r4qzm" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.078744 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.079785 4886 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.079875 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1744255-6bdc-4c0d-9f1e-70119127e182-serving-cert\") pod \"apiserver-7bbb656c7d-j5zgg\" (UID: \"c1744255-6bdc-4c0d-9f1e-70119127e182\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.079997 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kwc7b"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.080608 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kwc7b" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.081257 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ql2q5"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.081933 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ql2q5" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.082145 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5984c"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.082681 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5984c" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.083233 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lxn5q"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.083601 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lxn5q" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.083637 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c1744255-6bdc-4c0d-9f1e-70119127e182-etcd-client\") pod \"apiserver-7bbb656c7d-j5zgg\" (UID: \"c1744255-6bdc-4c0d-9f1e-70119127e182\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.084312 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wscrd"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.084319 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-etcd-client\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.084495 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c1744255-6bdc-4c0d-9f1e-70119127e182-encryption-config\") pod \"apiserver-7bbb656c7d-j5zgg\" (UID: \"c1744255-6bdc-4c0d-9f1e-70119127e182\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.084759 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.085582 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-bgwrn"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.086102 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.086485 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bgwrn" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.086829 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8lxpz"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.087455 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8lxpz" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.088067 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n62ck"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.088889 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n62ck" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.089567 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-tfh9h"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.089922 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.091529 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-encryption-config\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.096022 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhw5h"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.096164 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-tfh9h" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.096559 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knppr"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.096851 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557950-j78sv"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.097003 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhw5h" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.097077 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knppr" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.097218 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-j78sv" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.102773 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2bkdt"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.103719 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2bkdt" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.106772 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557950-fd7np"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.107586 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557950-fd7np" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.109871 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gm27b"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.110841 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.117381 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-k7qk8"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.117972 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-k7qk8" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.119104 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-d48xp"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.119722 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-d48xp" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.121755 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-t996r"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.124180 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-dq2b2"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.124510 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2sj7h"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.130447 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.134098 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-c9qzb"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.134965 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-c9qzb" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.136636 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-r4qzm"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.137811 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.141492 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wscrd"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.145176 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4ph2z"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.147085 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-j46q2"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.148804 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gdqjc"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.149905 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.150642 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rjdjq"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.152365 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-bgwrn"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.154523 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg"] Mar 14 08:31:19 crc 
kubenswrapper[4886]: I0314 08:31:19.156304 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-wmcc2"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.157452 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-x4qmh"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.161504 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kwc7b"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.168326 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5984c"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.170292 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.170976 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lxn5q"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.172267 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jldpx"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.173609 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e934f0a9-f87b-4f86-8aad-47fa6927a3a6-config\") pod \"machine-approver-56656f9798-2dzmw\" (UID: \"e934f0a9-f87b-4f86-8aad-47fa6927a3a6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dzmw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.173737 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/8b226bf0-ae7d-435b-9470-70dfb371f38e-audit-dir\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.173836 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0228f0fd-9323-456c-9291-6150db291cf4-config\") pod \"console-operator-58897d9998-lf5vf\" (UID: \"0228f0fd-9323-456c-9291-6150db291cf4\") " pod="openshift-console-operator/console-operator-58897d9998-lf5vf" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.173932 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54b85e17-fe41-4ae4-8e77-a3654421f751-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zhsfs\" (UID: \"54b85e17-fe41-4ae4-8e77-a3654421f751\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zhsfs" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.174040 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-794q4\" (UniqueName: \"kubernetes.io/projected/5fb33ac2-d4aa-49b0-9007-33af11834a96-kube-api-access-794q4\") pod \"router-default-5444994796-8lqwb\" (UID: \"5fb33ac2-d4aa-49b0-9007-33af11834a96\") " pod="openshift-ingress/router-default-5444994796-8lqwb" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.174137 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8b226bf0-ae7d-435b-9470-70dfb371f38e-audit-dir\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.174254 4886 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km96w\" (UniqueName: \"kubernetes.io/projected/8b226bf0-ae7d-435b-9470-70dfb371f38e-kube-api-access-km96w\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.174346 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w462w\" (UniqueName: \"kubernetes.io/projected/2778f200-cefa-4b41-9bc5-f600415f2387-kube-api-access-w462w\") pod \"cluster-samples-operator-665b6dd947-gm27b\" (UID: \"2778f200-cefa-4b41-9bc5-f600415f2387\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gm27b" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.174470 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5fb33ac2-d4aa-49b0-9007-33af11834a96-default-certificate\") pod \"router-default-5444994796-8lqwb\" (UID: \"5fb33ac2-d4aa-49b0-9007-33af11834a96\") " pod="openshift-ingress/router-default-5444994796-8lqwb" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.174581 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a312fb44-823b-44ec-8312-0d83b990e9cd-console-config\") pod \"console-f9d7485db-wmcc2\" (UID: \"a312fb44-823b-44ec-8312-0d83b990e9cd\") " pod="openshift-console/console-f9d7485db-wmcc2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.174670 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/094b2153-374f-4595-ae06-2655b16d69b9-webhook-cert\") pod \"packageserver-d55dfcdfc-knppr\" (UID: \"094b2153-374f-4595-ae06-2655b16d69b9\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knppr" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.174770 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a312fb44-823b-44ec-8312-0d83b990e9cd-oauth-serving-cert\") pod \"console-f9d7485db-wmcc2\" (UID: \"a312fb44-823b-44ec-8312-0d83b990e9cd\") " pod="openshift-console/console-f9d7485db-wmcc2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.174860 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9270f20e-8365-43e2-9d4b-b067780c0804-etcd-service-ca\") pod \"etcd-operator-b45778765-2sj7h\" (UID: \"9270f20e-8365-43e2-9d4b-b067780c0804\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2sj7h" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.174963 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgg5n\" (UniqueName: \"kubernetes.io/projected/e934f0a9-f87b-4f86-8aad-47fa6927a3a6-kube-api-access-fgg5n\") pod \"machine-approver-56656f9798-2dzmw\" (UID: \"e934f0a9-f87b-4f86-8aad-47fa6927a3a6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dzmw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.175052 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8b226bf0-ae7d-435b-9470-70dfb371f38e-audit-policies\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.175141 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.175240 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0228f0fd-9323-456c-9291-6150db291cf4-config\") pod \"console-operator-58897d9998-lf5vf\" (UID: \"0228f0fd-9323-456c-9291-6150db291cf4\") " pod="openshift-console-operator/console-operator-58897d9998-lf5vf" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.175340 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9270f20e-8365-43e2-9d4b-b067780c0804-serving-cert\") pod \"etcd-operator-b45778765-2sj7h\" (UID: \"9270f20e-8365-43e2-9d4b-b067780c0804\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2sj7h" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.175441 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.175535 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kg7k\" (UniqueName: \"kubernetes.io/projected/5a8def3c-80e8-4f81-8518-202af1613e6f-kube-api-access-4kg7k\") pod \"route-controller-manager-6576b87f9c-dknmx\" (UID: \"5a8def3c-80e8-4f81-8518-202af1613e6f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.175660 
4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.175772 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwn94\" (UniqueName: \"kubernetes.io/projected/488db7c4-ca9c-4229-9ac5-dc19ce5443e4-kube-api-access-bwn94\") pod \"openshift-controller-manager-operator-756b6f6bc6-4ph2z\" (UID: \"488db7c4-ca9c-4229-9ac5-dc19ce5443e4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4ph2z" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.175894 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.175970 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54b85e17-fe41-4ae4-8e77-a3654421f751-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zhsfs\" (UID: \"54b85e17-fe41-4ae4-8e77-a3654421f751\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zhsfs" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.176042 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/094b2153-374f-4595-ae06-2655b16d69b9-tmpfs\") pod \"packageserver-d55dfcdfc-knppr\" (UID: \"094b2153-374f-4595-ae06-2655b16d69b9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knppr" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.176135 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.176219 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a8def3c-80e8-4f81-8518-202af1613e6f-client-ca\") pod \"route-controller-manager-6576b87f9c-dknmx\" (UID: \"5a8def3c-80e8-4f81-8518-202af1613e6f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.176292 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0228f0fd-9323-456c-9291-6150db291cf4-serving-cert\") pod \"console-operator-58897d9998-lf5vf\" (UID: \"0228f0fd-9323-456c-9291-6150db291cf4\") " pod="openshift-console-operator/console-operator-58897d9998-lf5vf" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.176363 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37d01daf-edd5-4dd2-8a4a-40165f8d0275-service-ca-bundle\") pod \"authentication-operator-69f744f599-jm5tw\" (UID: \"37d01daf-edd5-4dd2-8a4a-40165f8d0275\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm5tw" Mar 14 08:31:19 crc kubenswrapper[4886]: 
I0314 08:31:19.176429 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl4k6\" (UniqueName: \"kubernetes.io/projected/b5fc2483-aab1-4487-a0ad-b6c3183826a5-kube-api-access-kl4k6\") pod \"openshift-apiserver-operator-796bbdcf4f-7xv44\" (UID: \"b5fc2483-aab1-4487-a0ad-b6c3183826a5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xv44" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.176494 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwzm9\" (UniqueName: \"kubernetes.io/projected/0228f0fd-9323-456c-9291-6150db291cf4-kube-api-access-fwzm9\") pod \"console-operator-58897d9998-lf5vf\" (UID: \"0228f0fd-9323-456c-9291-6150db291cf4\") " pod="openshift-console-operator/console-operator-58897d9998-lf5vf" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.176559 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fb33ac2-d4aa-49b0-9007-33af11834a96-metrics-certs\") pod \"router-default-5444994796-8lqwb\" (UID: \"5fb33ac2-d4aa-49b0-9007-33af11834a96\") " pod="openshift-ingress/router-default-5444994796-8lqwb" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.176736 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a8352432-8b6e-4a89-b830-379796727237-available-featuregates\") pod \"openshift-config-operator-7777fb866f-x4qmh\" (UID: \"a8352432-8b6e-4a89-b830-379796727237\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x4qmh" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.176811 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e934f0a9-f87b-4f86-8aad-47fa6927a3a6-machine-approver-tls\") pod 
\"machine-approver-56656f9798-2dzmw\" (UID: \"e934f0a9-f87b-4f86-8aad-47fa6927a3a6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dzmw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.176908 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5fc2483-aab1-4487-a0ad-b6c3183826a5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-7xv44\" (UID: \"b5fc2483-aab1-4487-a0ad-b6c3183826a5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xv44" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.177003 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8mk2\" (UniqueName: \"kubernetes.io/projected/5d14f041-e1b8-4b93-a893-946dbecf44aa-kube-api-access-w8mk2\") pod \"dns-operator-744455d44c-j46q2\" (UID: \"5d14f041-e1b8-4b93-a893-946dbecf44aa\") " pod="openshift-dns-operator/dns-operator-744455d44c-j46q2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.177112 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.177229 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5fc2483-aab1-4487-a0ad-b6c3183826a5-config\") pod \"openshift-apiserver-operator-796bbdcf4f-7xv44\" (UID: \"b5fc2483-aab1-4487-a0ad-b6c3183826a5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xv44" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.177323 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.177415 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4jmr\" (UniqueName: \"kubernetes.io/projected/a312fb44-823b-44ec-8312-0d83b990e9cd-kube-api-access-t4jmr\") pod \"console-f9d7485db-wmcc2\" (UID: \"a312fb44-823b-44ec-8312-0d83b990e9cd\") " pod="openshift-console/console-f9d7485db-wmcc2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.177509 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9270f20e-8365-43e2-9d4b-b067780c0804-etcd-client\") pod \"etcd-operator-b45778765-2sj7h\" (UID: \"9270f20e-8365-43e2-9d4b-b067780c0804\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2sj7h" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.177637 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/094b2153-374f-4595-ae06-2655b16d69b9-apiservice-cert\") pod \"packageserver-d55dfcdfc-knppr\" (UID: \"094b2153-374f-4595-ae06-2655b16d69b9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knppr" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.177802 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7d22\" (UniqueName: \"kubernetes.io/projected/094b2153-374f-4595-ae06-2655b16d69b9-kube-api-access-q7d22\") pod \"packageserver-d55dfcdfc-knppr\" (UID: \"094b2153-374f-4595-ae06-2655b16d69b9\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knppr" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.177913 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37d01daf-edd5-4dd2-8a4a-40165f8d0275-serving-cert\") pod \"authentication-operator-69f744f599-jm5tw\" (UID: \"37d01daf-edd5-4dd2-8a4a-40165f8d0275\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm5tw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178025 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e934f0a9-f87b-4f86-8aad-47fa6927a3a6-auth-proxy-config\") pod \"machine-approver-56656f9798-2dzmw\" (UID: \"e934f0a9-f87b-4f86-8aad-47fa6927a3a6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dzmw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178142 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btr5r\" (UniqueName: \"kubernetes.io/projected/9270f20e-8365-43e2-9d4b-b067780c0804-kube-api-access-btr5r\") pod \"etcd-operator-b45778765-2sj7h\" (UID: \"9270f20e-8365-43e2-9d4b-b067780c0804\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2sj7h" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178306 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bh6s\" (UniqueName: \"kubernetes.io/projected/a8352432-8b6e-4a89-b830-379796727237-kube-api-access-8bh6s\") pod \"openshift-config-operator-7777fb866f-x4qmh\" (UID: \"a8352432-8b6e-4a89-b830-379796727237\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x4qmh" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178369 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/5fb33ac2-d4aa-49b0-9007-33af11834a96-stats-auth\") pod \"router-default-5444994796-8lqwb\" (UID: \"5fb33ac2-d4aa-49b0-9007-33af11834a96\") " pod="openshift-ingress/router-default-5444994796-8lqwb" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178420 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a312fb44-823b-44ec-8312-0d83b990e9cd-trusted-ca-bundle\") pod \"console-f9d7485db-wmcc2\" (UID: \"a312fb44-823b-44ec-8312-0d83b990e9cd\") " pod="openshift-console/console-f9d7485db-wmcc2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178442 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178464 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178485 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5d14f041-e1b8-4b93-a893-946dbecf44aa-metrics-tls\") pod \"dns-operator-744455d44c-j46q2\" (UID: \"5d14f041-e1b8-4b93-a893-946dbecf44aa\") " pod="openshift-dns-operator/dns-operator-744455d44c-j46q2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178506 4886 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37d01daf-edd5-4dd2-8a4a-40165f8d0275-config\") pod \"authentication-operator-69f744f599-jm5tw\" (UID: \"37d01daf-edd5-4dd2-8a4a-40165f8d0275\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm5tw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178531 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a312fb44-823b-44ec-8312-0d83b990e9cd-service-ca\") pod \"console-f9d7485db-wmcc2\" (UID: \"a312fb44-823b-44ec-8312-0d83b990e9cd\") " pod="openshift-console/console-f9d7485db-wmcc2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178551 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5fb33ac2-d4aa-49b0-9007-33af11834a96-service-ca-bundle\") pod \"router-default-5444994796-8lqwb\" (UID: \"5fb33ac2-d4aa-49b0-9007-33af11834a96\") " pod="openshift-ingress/router-default-5444994796-8lqwb" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178581 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9270f20e-8365-43e2-9d4b-b067780c0804-etcd-ca\") pod \"etcd-operator-b45778765-2sj7h\" (UID: \"9270f20e-8365-43e2-9d4b-b067780c0804\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2sj7h" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178613 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a312fb44-823b-44ec-8312-0d83b990e9cd-console-oauth-config\") pod \"console-f9d7485db-wmcc2\" (UID: \"a312fb44-823b-44ec-8312-0d83b990e9cd\") " pod="openshift-console/console-f9d7485db-wmcc2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 
08:31:19.178622 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9270f20e-8365-43e2-9d4b-b067780c0804-etcd-service-ca\") pod \"etcd-operator-b45778765-2sj7h\" (UID: \"9270f20e-8365-43e2-9d4b-b067780c0804\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2sj7h" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178652 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178680 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9270f20e-8365-43e2-9d4b-b067780c0804-config\") pod \"etcd-operator-b45778765-2sj7h\" (UID: \"9270f20e-8365-43e2-9d4b-b067780c0804\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2sj7h" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178707 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/488db7c4-ca9c-4229-9ac5-dc19ce5443e4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4ph2z\" (UID: \"488db7c4-ca9c-4229-9ac5-dc19ce5443e4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4ph2z" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178727 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a8def3c-80e8-4f81-8518-202af1613e6f-serving-cert\") pod \"route-controller-manager-6576b87f9c-dknmx\" (UID: 
\"5a8def3c-80e8-4f81-8518-202af1613e6f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178747 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rlf7\" (UniqueName: \"kubernetes.io/projected/feeab5ae-f3ec-4590-8625-00e98fb5064b-kube-api-access-5rlf7\") pod \"downloads-7954f5f757-hznhw\" (UID: \"feeab5ae-f3ec-4590-8625-00e98fb5064b\") " pod="openshift-console/downloads-7954f5f757-hznhw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178767 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/488db7c4-ca9c-4229-9ac5-dc19ce5443e4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4ph2z\" (UID: \"488db7c4-ca9c-4229-9ac5-dc19ce5443e4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4ph2z" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178783 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a8def3c-80e8-4f81-8518-202af1613e6f-config\") pod \"route-controller-manager-6576b87f9c-dknmx\" (UID: \"5a8def3c-80e8-4f81-8518-202af1613e6f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178804 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37d01daf-edd5-4dd2-8a4a-40165f8d0275-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jm5tw\" (UID: \"37d01daf-edd5-4dd2-8a4a-40165f8d0275\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm5tw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178821 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8352432-8b6e-4a89-b830-379796727237-serving-cert\") pod \"openshift-config-operator-7777fb866f-x4qmh\" (UID: \"a8352432-8b6e-4a89-b830-379796727237\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x4qmh" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178840 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178861 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kk4w\" (UniqueName: \"kubernetes.io/projected/37d01daf-edd5-4dd2-8a4a-40165f8d0275-kube-api-access-5kk4w\") pod \"authentication-operator-69f744f599-jm5tw\" (UID: \"37d01daf-edd5-4dd2-8a4a-40165f8d0275\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm5tw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178878 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2778f200-cefa-4b41-9bc5-f600415f2387-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gm27b\" (UID: \"2778f200-cefa-4b41-9bc5-f600415f2387\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gm27b" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178896 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54b85e17-fe41-4ae4-8e77-a3654421f751-config\") pod \"kube-controller-manager-operator-78b949d7b-zhsfs\" (UID: 
\"54b85e17-fe41-4ae4-8e77-a3654421f751\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zhsfs" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178916 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a312fb44-823b-44ec-8312-0d83b990e9cd-console-serving-cert\") pod \"console-f9d7485db-wmcc2\" (UID: \"a312fb44-823b-44ec-8312-0d83b990e9cd\") " pod="openshift-console/console-f9d7485db-wmcc2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178933 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0228f0fd-9323-456c-9291-6150db291cf4-trusted-ca\") pod \"console-operator-58897d9998-lf5vf\" (UID: \"0228f0fd-9323-456c-9291-6150db291cf4\") " pod="openshift-console-operator/console-operator-58897d9998-lf5vf" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178966 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37d01daf-edd5-4dd2-8a4a-40165f8d0275-service-ca-bundle\") pod \"authentication-operator-69f744f599-jm5tw\" (UID: \"37d01daf-edd5-4dd2-8a4a-40165f8d0275\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm5tw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.179716 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a8def3c-80e8-4f81-8518-202af1613e6f-client-ca\") pod \"route-controller-manager-6576b87f9c-dknmx\" (UID: \"5a8def3c-80e8-4f81-8518-202af1613e6f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.180241 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/a8352432-8b6e-4a89-b830-379796727237-available-featuregates\") pod \"openshift-config-operator-7777fb866f-x4qmh\" (UID: \"a8352432-8b6e-4a89-b830-379796727237\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x4qmh" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.180305 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0228f0fd-9323-456c-9291-6150db291cf4-trusted-ca\") pod \"console-operator-58897d9998-lf5vf\" (UID: \"0228f0fd-9323-456c-9291-6150db291cf4\") " pod="openshift-console-operator/console-operator-58897d9998-lf5vf" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.181033 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.181260 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a312fb44-823b-44ec-8312-0d83b990e9cd-trusted-ca-bundle\") pod \"console-f9d7485db-wmcc2\" (UID: \"a312fb44-823b-44ec-8312-0d83b990e9cd\") " pod="openshift-console/console-f9d7485db-wmcc2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.181707 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9270f20e-8365-43e2-9d4b-b067780c0804-serving-cert\") pod \"etcd-operator-b45778765-2sj7h\" (UID: \"9270f20e-8365-43e2-9d4b-b067780c0804\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2sj7h" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.181710 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.182236 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8b226bf0-ae7d-435b-9470-70dfb371f38e-audit-policies\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.181768 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.177519 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a312fb44-823b-44ec-8312-0d83b990e9cd-console-config\") pod \"console-f9d7485db-wmcc2\" (UID: \"a312fb44-823b-44ec-8312-0d83b990e9cd\") " pod="openshift-console/console-f9d7485db-wmcc2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.182634 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5fc2483-aab1-4487-a0ad-b6c3183826a5-config\") pod \"openshift-apiserver-operator-796bbdcf4f-7xv44\" (UID: \"b5fc2483-aab1-4487-a0ad-b6c3183826a5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xv44" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.182671 4886 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.182745 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0228f0fd-9323-456c-9291-6150db291cf4-serving-cert\") pod \"console-operator-58897d9998-lf5vf\" (UID: \"0228f0fd-9323-456c-9291-6150db291cf4\") " pod="openshift-console-operator/console-operator-58897d9998-lf5vf" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.183091 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.183611 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.183623 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e934f0a9-f87b-4f86-8aad-47fa6927a3a6-machine-approver-tls\") pod \"machine-approver-56656f9798-2dzmw\" (UID: \"e934f0a9-f87b-4f86-8aad-47fa6927a3a6\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dzmw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.174615 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e934f0a9-f87b-4f86-8aad-47fa6927a3a6-config\") pod \"machine-approver-56656f9798-2dzmw\" (UID: \"e934f0a9-f87b-4f86-8aad-47fa6927a3a6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dzmw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.174071 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9kl2c"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.185429 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-vdd8t"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.185821 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37d01daf-edd5-4dd2-8a4a-40165f8d0275-config\") pod \"authentication-operator-69f744f599-jm5tw\" (UID: \"37d01daf-edd5-4dd2-8a4a-40165f8d0275\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm5tw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.185306 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.185854 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9270f20e-8365-43e2-9d4b-b067780c0804-etcd-ca\") pod \"etcd-operator-b45778765-2sj7h\" (UID: 
\"9270f20e-8365-43e2-9d4b-b067780c0804\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2sj7h" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.184108 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9270f20e-8365-43e2-9d4b-b067780c0804-config\") pod \"etcd-operator-b45778765-2sj7h\" (UID: \"9270f20e-8365-43e2-9d4b-b067780c0804\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2sj7h" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.178040 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a312fb44-823b-44ec-8312-0d83b990e9cd-oauth-serving-cert\") pod \"console-f9d7485db-wmcc2\" (UID: \"a312fb44-823b-44ec-8312-0d83b990e9cd\") " pod="openshift-console/console-f9d7485db-wmcc2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.186604 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qj8pq"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.186707 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a312fb44-823b-44ec-8312-0d83b990e9cd-service-ca\") pod \"console-f9d7485db-wmcc2\" (UID: \"a312fb44-823b-44ec-8312-0d83b990e9cd\") " pod="openshift-console/console-f9d7485db-wmcc2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.186825 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e934f0a9-f87b-4f86-8aad-47fa6927a3a6-auth-proxy-config\") pod \"machine-approver-56656f9798-2dzmw\" (UID: \"e934f0a9-f87b-4f86-8aad-47fa6927a3a6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dzmw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.184617 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/488db7c4-ca9c-4229-9ac5-dc19ce5443e4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4ph2z\" (UID: \"488db7c4-ca9c-4229-9ac5-dc19ce5443e4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4ph2z" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.186891 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a8def3c-80e8-4f81-8518-202af1613e6f-config\") pod \"route-controller-manager-6576b87f9c-dknmx\" (UID: \"5a8def3c-80e8-4f81-8518-202af1613e6f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.186963 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-lf5vf"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.186984 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xv44"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.186996 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ql2q5"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.187005 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jm5tw"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.187013 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9brkw"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.187071 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qj8pq" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.187095 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37d01daf-edd5-4dd2-8a4a-40165f8d0275-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jm5tw\" (UID: \"37d01daf-edd5-4dd2-8a4a-40165f8d0275\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm5tw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.187111 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-vdd8t" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.187365 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-tfh9h"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.187568 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5fc2483-aab1-4487-a0ad-b6c3183826a5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-7xv44\" (UID: \"b5fc2483-aab1-4487-a0ad-b6c3183826a5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xv44" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.187836 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a8def3c-80e8-4f81-8518-202af1613e6f-serving-cert\") pod \"route-controller-manager-6576b87f9c-dknmx\" (UID: \"5a8def3c-80e8-4f81-8518-202af1613e6f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.188464 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5d14f041-e1b8-4b93-a893-946dbecf44aa-metrics-tls\") pod 
\"dns-operator-744455d44c-j46q2\" (UID: \"5d14f041-e1b8-4b93-a893-946dbecf44aa\") " pod="openshift-dns-operator/dns-operator-744455d44c-j46q2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.188801 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a312fb44-823b-44ec-8312-0d83b990e9cd-console-oauth-config\") pod \"console-f9d7485db-wmcc2\" (UID: \"a312fb44-823b-44ec-8312-0d83b990e9cd\") " pod="openshift-console/console-f9d7485db-wmcc2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.189159 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/488db7c4-ca9c-4229-9ac5-dc19ce5443e4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4ph2z\" (UID: \"488db7c4-ca9c-4229-9ac5-dc19ce5443e4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4ph2z" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.189209 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qj8pq"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.189627 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.189756 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.189947 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.190351 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8352432-8b6e-4a89-b830-379796727237-serving-cert\") pod \"openshift-config-operator-7777fb866f-x4qmh\" (UID: \"a8352432-8b6e-4a89-b830-379796727237\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x4qmh" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.190770 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.190818 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.191070 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n62ck"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.191081 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a312fb44-823b-44ec-8312-0d83b990e9cd-console-serving-cert\") pod \"console-f9d7485db-wmcc2\" (UID: \"a312fb44-823b-44ec-8312-0d83b990e9cd\") " pod="openshift-console/console-f9d7485db-wmcc2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.191455 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9270f20e-8365-43e2-9d4b-b067780c0804-etcd-client\") pod \"etcd-operator-b45778765-2sj7h\" (UID: \"9270f20e-8365-43e2-9d4b-b067780c0804\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2sj7h" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.191555 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37d01daf-edd5-4dd2-8a4a-40165f8d0275-serving-cert\") pod \"authentication-operator-69f744f599-jm5tw\" (UID: \"37d01daf-edd5-4dd2-8a4a-40165f8d0275\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm5tw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.192920 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zhsfs"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.194033 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2778f200-cefa-4b41-9bc5-f600415f2387-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gm27b\" (UID: \"2778f200-cefa-4b41-9bc5-f600415f2387\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gm27b" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.194302 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-d48xp"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.196343 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-infra/auto-csr-approver-29557950-fd7np"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.197460 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-hznhw"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.198743 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-vdd8t"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.199755 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-c9qzb"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.200746 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8lxpz"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.201771 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhw5h"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.202792 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-k7qk8"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.203906 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knppr"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.205100 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2bkdt"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.206097 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557950-j78sv"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.207199 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-xcvj2"] Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.207718 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-xcvj2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.209906 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.230891 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.249674 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.257935 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54b85e17-fe41-4ae4-8e77-a3654421f751-config\") pod \"kube-controller-manager-operator-78b949d7b-zhsfs\" (UID: \"54b85e17-fe41-4ae4-8e77-a3654421f751\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zhsfs" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.270063 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.277551 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54b85e17-fe41-4ae4-8e77-a3654421f751-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zhsfs\" (UID: \"54b85e17-fe41-4ae4-8e77-a3654421f751\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zhsfs" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.279529 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fb33ac2-d4aa-49b0-9007-33af11834a96-metrics-certs\") pod \"router-default-5444994796-8lqwb\" (UID: \"5fb33ac2-d4aa-49b0-9007-33af11834a96\") " pod="openshift-ingress/router-default-5444994796-8lqwb" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.279596 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/094b2153-374f-4595-ae06-2655b16d69b9-apiservice-cert\") pod \"packageserver-d55dfcdfc-knppr\" (UID: \"094b2153-374f-4595-ae06-2655b16d69b9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knppr" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.279620 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7d22\" (UniqueName: \"kubernetes.io/projected/094b2153-374f-4595-ae06-2655b16d69b9-kube-api-access-q7d22\") pod \"packageserver-d55dfcdfc-knppr\" (UID: \"094b2153-374f-4595-ae06-2655b16d69b9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knppr" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.279664 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5fb33ac2-d4aa-49b0-9007-33af11834a96-stats-auth\") pod \"router-default-5444994796-8lqwb\" (UID: \"5fb33ac2-d4aa-49b0-9007-33af11834a96\") " pod="openshift-ingress/router-default-5444994796-8lqwb" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.279689 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5fb33ac2-d4aa-49b0-9007-33af11834a96-service-ca-bundle\") pod \"router-default-5444994796-8lqwb\" (UID: \"5fb33ac2-d4aa-49b0-9007-33af11834a96\") " pod="openshift-ingress/router-default-5444994796-8lqwb" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.279742 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5rlf7\" (UniqueName: \"kubernetes.io/projected/feeab5ae-f3ec-4590-8625-00e98fb5064b-kube-api-access-5rlf7\") pod \"downloads-7954f5f757-hznhw\" (UID: \"feeab5ae-f3ec-4590-8625-00e98fb5064b\") " pod="openshift-console/downloads-7954f5f757-hznhw" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.279782 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-794q4\" (UniqueName: \"kubernetes.io/projected/5fb33ac2-d4aa-49b0-9007-33af11834a96-kube-api-access-794q4\") pod \"router-default-5444994796-8lqwb\" (UID: \"5fb33ac2-d4aa-49b0-9007-33af11834a96\") " pod="openshift-ingress/router-default-5444994796-8lqwb" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.279832 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5fb33ac2-d4aa-49b0-9007-33af11834a96-default-certificate\") pod \"router-default-5444994796-8lqwb\" (UID: \"5fb33ac2-d4aa-49b0-9007-33af11834a96\") " pod="openshift-ingress/router-default-5444994796-8lqwb" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.279873 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/094b2153-374f-4595-ae06-2655b16d69b9-webhook-cert\") pod \"packageserver-d55dfcdfc-knppr\" (UID: \"094b2153-374f-4595-ae06-2655b16d69b9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knppr" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.279944 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/094b2153-374f-4595-ae06-2655b16d69b9-tmpfs\") pod \"packageserver-d55dfcdfc-knppr\" (UID: \"094b2153-374f-4595-ae06-2655b16d69b9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knppr" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 
08:31:19.280512 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/094b2153-374f-4595-ae06-2655b16d69b9-tmpfs\") pod \"packageserver-d55dfcdfc-knppr\" (UID: \"094b2153-374f-4595-ae06-2655b16d69b9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knppr" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.290712 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.309832 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.330387 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.350474 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.375616 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.389966 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.410828 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.444229 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzg52\" (UniqueName: \"kubernetes.io/projected/14c55450-dcee-4aee-8153-9ea2ff49b659-kube-api-access-pzg52\") pod 
\"controller-manager-879f6c89f-r4qzm\" (UID: \"14c55450-dcee-4aee-8153-9ea2ff49b659\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r4qzm" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.465329 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klqx4\" (UniqueName: \"kubernetes.io/projected/8107785d-acc9-4fdf-8f93-21f2b4a62c61-kube-api-access-klqx4\") pod \"machine-api-operator-5694c8668f-dq2b2\" (UID: \"8107785d-acc9-4fdf-8f93-21f2b4a62c61\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dq2b2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.484257 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7b6w\" (UniqueName: \"kubernetes.io/projected/c1744255-6bdc-4c0d-9f1e-70119127e182-kube-api-access-h7b6w\") pod \"apiserver-7bbb656c7d-j5zgg\" (UID: \"c1744255-6bdc-4c0d-9f1e-70119127e182\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.503753 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfbxq\" (UniqueName: \"kubernetes.io/projected/ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119-kube-api-access-qfbxq\") pod \"apiserver-76f77b778f-gdqjc\" (UID: \"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119\") " pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.509901 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.530374 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.550310 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.564683 4886 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5fb33ac2-d4aa-49b0-9007-33af11834a96-default-certificate\") pod \"router-default-5444994796-8lqwb\" (UID: \"5fb33ac2-d4aa-49b0-9007-33af11834a96\") " pod="openshift-ingress/router-default-5444994796-8lqwb" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.570642 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.580741 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.585398 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5fb33ac2-d4aa-49b0-9007-33af11834a96-stats-auth\") pod \"router-default-5444994796-8lqwb\" (UID: \"5fb33ac2-d4aa-49b0-9007-33af11834a96\") " pod="openshift-ingress/router-default-5444994796-8lqwb" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.593145 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.603328 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fb33ac2-d4aa-49b0-9007-33af11834a96-metrics-certs\") pod \"router-default-5444994796-8lqwb\" (UID: \"5fb33ac2-d4aa-49b0-9007-33af11834a96\") " pod="openshift-ingress/router-default-5444994796-8lqwb" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.611287 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.613334 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-dq2b2" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.622764 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5fb33ac2-d4aa-49b0-9007-33af11834a96-service-ca-bundle\") pod \"router-default-5444994796-8lqwb\" (UID: \"5fb33ac2-d4aa-49b0-9007-33af11834a96\") " pod="openshift-ingress/router-default-5444994796-8lqwb" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.631048 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.639670 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-r4qzm" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.666461 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.668237 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.670815 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.690713 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.711417 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.731452 4886 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-operator"/"metrics-tls" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.758233 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.771091 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.792005 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.811291 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.831301 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.870246 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.890696 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.910024 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.932928 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.937889 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-dq2b2"] 
Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.951365 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.970590 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 14 08:31:19 crc kubenswrapper[4886]: I0314 08:31:19.990914 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.010375 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.028699 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gdqjc"] Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.030520 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.050616 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.072680 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.089298 4886 request.go:700] Waited for 1.005525542s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/secrets?fieldSelector=metadata.name%3Dcontrol-plane-machine-set-operator-tls&limit=500&resourceVersion=0 Mar 14 08:31:20 crc kubenswrapper[4886]: 
I0314 08:31:20.090755 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.100196 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg"] Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.110367 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.111244 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-r4qzm"] Mar 14 08:31:20 crc kubenswrapper[4886]: W0314 08:31:20.121575 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1744255_6bdc_4c0d_9f1e_70119127e182.slice/crio-e20edddd188e80f887ff0afc9866462d0930d6f276657c59f37f1e8da1b7b82d WatchSource:0}: Error finding container e20edddd188e80f887ff0afc9866462d0930d6f276657c59f37f1e8da1b7b82d: Status 404 returned error can't find the container with id e20edddd188e80f887ff0afc9866462d0930d6f276657c59f37f1e8da1b7b82d Mar 14 08:31:20 crc kubenswrapper[4886]: W0314 08:31:20.122998 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14c55450_dcee_4aee_8153_9ea2ff49b659.slice/crio-f5f8d949c25ff10ebe26e5d396230a24ae728c8917ef0f16f67b40064b20aabf WatchSource:0}: Error finding container f5f8d949c25ff10ebe26e5d396230a24ae728c8917ef0f16f67b40064b20aabf: Status 404 returned error can't find the container with id f5f8d949c25ff10ebe26e5d396230a24ae728c8917ef0f16f67b40064b20aabf Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.129758 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 14 08:31:20 crc 
kubenswrapper[4886]: I0314 08:31:20.150737 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.171172 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.191890 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.210152 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.230582 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.249778 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.271191 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 14 08:31:20 crc kubenswrapper[4886]: E0314 08:31:20.280397 4886 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Mar 14 08:31:20 crc kubenswrapper[4886]: E0314 08:31:20.280423 4886 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Mar 14 08:31:20 crc kubenswrapper[4886]: E0314 08:31:20.280461 4886 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/094b2153-374f-4595-ae06-2655b16d69b9-webhook-cert podName:094b2153-374f-4595-ae06-2655b16d69b9 nodeName:}" failed. No retries permitted until 2026-03-14 08:31:20.780442924 +0000 UTC m=+216.028894561 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/094b2153-374f-4595-ae06-2655b16d69b9-webhook-cert") pod "packageserver-d55dfcdfc-knppr" (UID: "094b2153-374f-4595-ae06-2655b16d69b9") : failed to sync secret cache: timed out waiting for the condition Mar 14 08:31:20 crc kubenswrapper[4886]: E0314 08:31:20.280496 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/094b2153-374f-4595-ae06-2655b16d69b9-apiservice-cert podName:094b2153-374f-4595-ae06-2655b16d69b9 nodeName:}" failed. No retries permitted until 2026-03-14 08:31:20.780475385 +0000 UTC m=+216.028927022 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/094b2153-374f-4595-ae06-2655b16d69b9-apiservice-cert") pod "packageserver-d55dfcdfc-knppr" (UID: "094b2153-374f-4595-ae06-2655b16d69b9") : failed to sync secret cache: timed out waiting for the condition Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.289768 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.315075 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.330139 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.350639 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 14 08:31:20 crc 
kubenswrapper[4886]: I0314 08:31:20.371446 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.390290 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.410967 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.433903 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.450910 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.470672 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.489981 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.510640 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.517722 4886 generic.go:334] "Generic (PLEG): container finished" podID="ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119" containerID="4cd369edfb7c722478fde144b2a6e3297ad22073c43fb7c16d2085ac7afb96bc" exitCode=0 Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.517799 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" 
event={"ID":"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119","Type":"ContainerDied","Data":"4cd369edfb7c722478fde144b2a6e3297ad22073c43fb7c16d2085ac7afb96bc"} Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.517852 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" event={"ID":"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119","Type":"ContainerStarted","Data":"ee8c3a76bb4676931908c7d3c08a091f6e66aa901e11550083e90192809dba6e"} Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.518983 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-r4qzm" event={"ID":"14c55450-dcee-4aee-8153-9ea2ff49b659","Type":"ContainerStarted","Data":"765006db73285273cab077f8ba3ecf43432c0a444fde9b6e1a8b11a52d394487"} Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.519015 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-r4qzm" event={"ID":"14c55450-dcee-4aee-8153-9ea2ff49b659","Type":"ContainerStarted","Data":"f5f8d949c25ff10ebe26e5d396230a24ae728c8917ef0f16f67b40064b20aabf"} Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.519309 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-r4qzm" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.521274 4886 generic.go:334] "Generic (PLEG): container finished" podID="c1744255-6bdc-4c0d-9f1e-70119127e182" containerID="bf8952d5b2d2abe99ca729e2c993d69528cbfb105b3ff209175fecab2d781d0c" exitCode=0 Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.521322 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" event={"ID":"c1744255-6bdc-4c0d-9f1e-70119127e182","Type":"ContainerDied","Data":"bf8952d5b2d2abe99ca729e2c993d69528cbfb105b3ff209175fecab2d781d0c"} Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.521341 
4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" event={"ID":"c1744255-6bdc-4c0d-9f1e-70119127e182","Type":"ContainerStarted","Data":"e20edddd188e80f887ff0afc9866462d0930d6f276657c59f37f1e8da1b7b82d"} Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.523326 4886 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-r4qzm container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.523379 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-r4qzm" podUID="14c55450-dcee-4aee-8153-9ea2ff49b659" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.524346 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-dq2b2" event={"ID":"8107785d-acc9-4fdf-8f93-21f2b4a62c61","Type":"ContainerStarted","Data":"17b9deea77db82c9dba8888ddfb8a4705f0372a44fc61bc88f6172c2198170c6"} Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.524382 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-dq2b2" event={"ID":"8107785d-acc9-4fdf-8f93-21f2b4a62c61","Type":"ContainerStarted","Data":"2def71ed5b9a28dd32fb13ec0bf62a7ea5d6406bcec42352348a01b3e58f6481"} Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.524394 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-dq2b2" 
event={"ID":"8107785d-acc9-4fdf-8f93-21f2b4a62c61","Type":"ContainerStarted","Data":"8427aca7ba06bb91ebec0298b9809450389dff0c4b64545c2c96aad5cd313167"} Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.530101 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.552752 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.571231 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.590738 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.610742 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.630347 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.650782 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.670655 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.690780 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.710347 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 14 08:31:20 crc 
kubenswrapper[4886]: I0314 08:31:20.730297 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.749827 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.770016 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.790434 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.807874 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/094b2153-374f-4595-ae06-2655b16d69b9-webhook-cert\") pod \"packageserver-d55dfcdfc-knppr\" (UID: \"094b2153-374f-4595-ae06-2655b16d69b9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knppr" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.807971 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/094b2153-374f-4595-ae06-2655b16d69b9-apiservice-cert\") pod \"packageserver-d55dfcdfc-knppr\" (UID: \"094b2153-374f-4595-ae06-2655b16d69b9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knppr" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.814114 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/094b2153-374f-4595-ae06-2655b16d69b9-webhook-cert\") pod \"packageserver-d55dfcdfc-knppr\" (UID: \"094b2153-374f-4595-ae06-2655b16d69b9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knppr" Mar 14 08:31:20 crc 
kubenswrapper[4886]: I0314 08:31:20.814742 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/094b2153-374f-4595-ae06-2655b16d69b9-apiservice-cert\") pod \"packageserver-d55dfcdfc-knppr\" (UID: \"094b2153-374f-4595-ae06-2655b16d69b9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knppr" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.830588 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.850557 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.870785 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.903636 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/54b85e17-fe41-4ae4-8e77-a3654421f751-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zhsfs\" (UID: \"54b85e17-fe41-4ae4-8e77-a3654421f751\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zhsfs" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.926919 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgg5n\" (UniqueName: \"kubernetes.io/projected/e934f0a9-f87b-4f86-8aad-47fa6927a3a6-kube-api-access-fgg5n\") pod \"machine-approver-56656f9798-2dzmw\" (UID: \"e934f0a9-f87b-4f86-8aad-47fa6927a3a6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dzmw" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.934776 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dzmw" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.944897 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km96w\" (UniqueName: \"kubernetes.io/projected/8b226bf0-ae7d-435b-9470-70dfb371f38e-kube-api-access-km96w\") pod \"oauth-openshift-558db77b4-t996r\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:20 crc kubenswrapper[4886]: W0314 08:31:20.964879 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode934f0a9_f87b_4f86_8aad_47fa6927a3a6.slice/crio-ccce6e5a54700b10189604d4fdd0a4d553e7f3a2b25f2ff87b554fcb417464bc WatchSource:0}: Error finding container ccce6e5a54700b10189604d4fdd0a4d553e7f3a2b25f2ff87b554fcb417464bc: Status 404 returned error can't find the container with id ccce6e5a54700b10189604d4fdd0a4d553e7f3a2b25f2ff87b554fcb417464bc Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.965787 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w462w\" (UniqueName: \"kubernetes.io/projected/2778f200-cefa-4b41-9bc5-f600415f2387-kube-api-access-w462w\") pod \"cluster-samples-operator-665b6dd947-gm27b\" (UID: \"2778f200-cefa-4b41-9bc5-f600415f2387\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gm27b" Mar 14 08:31:20 crc kubenswrapper[4886]: I0314 08:31:20.988020 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btr5r\" (UniqueName: \"kubernetes.io/projected/9270f20e-8365-43e2-9d4b-b067780c0804-kube-api-access-btr5r\") pod \"etcd-operator-b45778765-2sj7h\" (UID: \"9270f20e-8365-43e2-9d4b-b067780c0804\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2sj7h" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.008089 4886 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zhsfs" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.038243 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl4k6\" (UniqueName: \"kubernetes.io/projected/b5fc2483-aab1-4487-a0ad-b6c3183826a5-kube-api-access-kl4k6\") pod \"openshift-apiserver-operator-796bbdcf4f-7xv44\" (UID: \"b5fc2483-aab1-4487-a0ad-b6c3183826a5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xv44" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.041865 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwzm9\" (UniqueName: \"kubernetes.io/projected/0228f0fd-9323-456c-9291-6150db291cf4-kube-api-access-fwzm9\") pod \"console-operator-58897d9998-lf5vf\" (UID: \"0228f0fd-9323-456c-9291-6150db291cf4\") " pod="openshift-console-operator/console-operator-58897d9998-lf5vf" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.057880 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bh6s\" (UniqueName: \"kubernetes.io/projected/a8352432-8b6e-4a89-b830-379796727237-kube-api-access-8bh6s\") pod \"openshift-config-operator-7777fb866f-x4qmh\" (UID: \"a8352432-8b6e-4a89-b830-379796727237\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x4qmh" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.064398 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kg7k\" (UniqueName: \"kubernetes.io/projected/5a8def3c-80e8-4f81-8518-202af1613e6f-kube-api-access-4kg7k\") pod \"route-controller-manager-6576b87f9c-dknmx\" (UID: \"5a8def3c-80e8-4f81-8518-202af1613e6f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.088829 4886 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8mk2\" (UniqueName: \"kubernetes.io/projected/5d14f041-e1b8-4b93-a893-946dbecf44aa-kube-api-access-w8mk2\") pod \"dns-operator-744455d44c-j46q2\" (UID: \"5d14f041-e1b8-4b93-a893-946dbecf44aa\") " pod="openshift-dns-operator/dns-operator-744455d44c-j46q2" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.108052 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwn94\" (UniqueName: \"kubernetes.io/projected/488db7c4-ca9c-4229-9ac5-dc19ce5443e4-kube-api-access-bwn94\") pod \"openshift-controller-manager-operator-756b6f6bc6-4ph2z\" (UID: \"488db7c4-ca9c-4229-9ac5-dc19ce5443e4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4ph2z" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.108750 4886 request.go:700] Waited for 1.926154808s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/serviceaccounts/console/token Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.130759 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4jmr\" (UniqueName: \"kubernetes.io/projected/a312fb44-823b-44ec-8312-0d83b990e9cd-kube-api-access-t4jmr\") pod \"console-f9d7485db-wmcc2\" (UID: \"a312fb44-823b-44ec-8312-0d83b990e9cd\") " pod="openshift-console/console-f9d7485db-wmcc2" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.144234 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kk4w\" (UniqueName: \"kubernetes.io/projected/37d01daf-edd5-4dd2-8a4a-40165f8d0275-kube-api-access-5kk4w\") pod \"authentication-operator-69f744f599-jm5tw\" (UID: \"37d01daf-edd5-4dd2-8a4a-40165f8d0275\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jm5tw" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.150682 4886 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.172611 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.174905 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jm5tw" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.190834 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.211917 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.222949 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zhsfs"] Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.227304 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xv44" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.230048 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 14 08:31:21 crc kubenswrapper[4886]: W0314 08:31:21.240678 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54b85e17_fe41_4ae4_8e77_a3654421f751.slice/crio-85b28f75f06fc13fb45061657cc85e69bdcc13abb67b852fc41bfd39ab9651bb WatchSource:0}: Error finding container 85b28f75f06fc13fb45061657cc85e69bdcc13abb67b852fc41bfd39ab9651bb: Status 404 returned error can't find the container with id 85b28f75f06fc13fb45061657cc85e69bdcc13abb67b852fc41bfd39ab9651bb Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.244258 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.250727 4886 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.251162 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.259172 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gm27b" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.267753 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-lf5vf" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.270054 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.274138 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4ph2z" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.283053 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2sj7h" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.288633 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-wmcc2" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.291558 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.294987 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-j46q2" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.302845 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x4qmh" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.328532 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.338700 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.381270 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7d22\" (UniqueName: \"kubernetes.io/projected/094b2153-374f-4595-ae06-2655b16d69b9-kube-api-access-q7d22\") pod \"packageserver-d55dfcdfc-knppr\" (UID: \"094b2153-374f-4595-ae06-2655b16d69b9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knppr" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.394162 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-794q4\" (UniqueName: \"kubernetes.io/projected/5fb33ac2-d4aa-49b0-9007-33af11834a96-kube-api-access-794q4\") pod \"router-default-5444994796-8lqwb\" (UID: \"5fb33ac2-d4aa-49b0-9007-33af11834a96\") " pod="openshift-ingress/router-default-5444994796-8lqwb" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.421140 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rlf7\" (UniqueName: \"kubernetes.io/projected/feeab5ae-f3ec-4590-8625-00e98fb5064b-kube-api-access-5rlf7\") pod \"downloads-7954f5f757-hznhw\" (UID: \"feeab5ae-f3ec-4590-8625-00e98fb5064b\") " pod="openshift-console/downloads-7954f5f757-hznhw" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.445080 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knppr" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.451735 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jm5tw"] Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.523691 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f276fd1b-0a21-47c7-95a4-7ccc355773ab-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9brkw\" (UID: \"f276fd1b-0a21-47c7-95a4-7ccc355773ab\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9brkw" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.523741 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2befe321-cfe9-4032-b949-3de718efbf7c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-tfh9h\" (UID: \"2befe321-cfe9-4032-b949-3de718efbf7c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-tfh9h" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.523761 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6047b878-24b2-43a1-8afb-3321319d2a1b-serving-cert\") pod \"service-ca-operator-777779d784-d48xp\" (UID: \"6047b878-24b2-43a1-8afb-3321319d2a1b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-d48xp" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.523795 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnxrt\" (UniqueName: \"kubernetes.io/projected/df316737-4efd-4f41-a5ff-46740ee48d48-kube-api-access-mnxrt\") pod \"machine-config-operator-74547568cd-5984c\" (UID: 
\"df316737-4efd-4f41-a5ff-46740ee48d48\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5984c" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.523847 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df316737-4efd-4f41-a5ff-46740ee48d48-proxy-tls\") pod \"machine-config-operator-74547568cd-5984c\" (UID: \"df316737-4efd-4f41-a5ff-46740ee48d48\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5984c" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.523863 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ea7323d6-f41b-4251-ae88-aa34a5714182-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8lxpz\" (UID: \"ea7323d6-f41b-4251-ae88-aa34a5714182\") " pod="openshift-marketplace/marketplace-operator-79b997595-8lxpz" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.523905 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cfbeb8db-2612-468d-8354-32ee6373f57e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.523922 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f276fd1b-0a21-47c7-95a4-7ccc355773ab-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9brkw\" (UID: \"f276fd1b-0a21-47c7-95a4-7ccc355773ab\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9brkw" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.523937 
4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3cb8a7d5-4437-460f-aad6-cf3b1df5a6fa-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-jldpx\" (UID: \"3cb8a7d5-4437-460f-aad6-cf3b1df5a6fa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jldpx" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.523959 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfbeb8db-2612-468d-8354-32ee6373f57e-trusted-ca\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.523974 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz89g\" (UniqueName: \"kubernetes.io/projected/a109bed2-b994-4808-acac-56741af98cca-kube-api-access-pz89g\") pod \"kube-storage-version-migrator-operator-b67b599dd-kwc7b\" (UID: \"a109bed2-b994-4808-acac-56741af98cca\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kwc7b" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.523989 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4bc7\" (UniqueName: \"kubernetes.io/projected/3c3e4726-bb4a-45be-9c3a-a791c4a42380-kube-api-access-b4bc7\") pod \"control-plane-machine-set-operator-78cbb6b69f-lxn5q\" (UID: \"3c3e4726-bb4a-45be-9c3a-a791c4a42380\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lxn5q" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.524021 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/0822fc90-2e55-414e-8381-00d89382a00f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rjdjq\" (UID: \"0822fc90-2e55-414e-8381-00d89382a00f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rjdjq" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.524040 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/39a6cfec-d809-46d0-8ef2-bde4b2d99c62-profile-collector-cert\") pod \"catalog-operator-68c6474976-n62ck\" (UID: \"39a6cfec-d809-46d0-8ef2-bde4b2d99c62\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n62ck" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.524056 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f276fd1b-0a21-47c7-95a4-7ccc355773ab-config\") pod \"kube-apiserver-operator-766d6c64bb-9brkw\" (UID: \"f276fd1b-0a21-47c7-95a4-7ccc355773ab\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9brkw" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.524078 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/39a6cfec-d809-46d0-8ef2-bde4b2d99c62-srv-cert\") pod \"catalog-operator-68c6474976-n62ck\" (UID: \"39a6cfec-d809-46d0-8ef2-bde4b2d99c62\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n62ck" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.524151 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.524186 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7ad8d4f-d958-43b7-b84d-c8672642d21b-secret-volume\") pod \"collect-profiles-29557950-j78sv\" (UID: \"d7ad8d4f-d958-43b7-b84d-c8672642d21b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-j78sv" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.524201 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnbgs\" (UniqueName: \"kubernetes.io/projected/d7ad8d4f-d958-43b7-b84d-c8672642d21b-kube-api-access-lnbgs\") pod \"collect-profiles-29557950-j78sv\" (UID: \"d7ad8d4f-d958-43b7-b84d-c8672642d21b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-j78sv" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.524235 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9e008272-f83f-420d-848c-a05f8dfef580-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ql2q5\" (UID: \"9e008272-f83f-420d-848c-a05f8dfef580\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ql2q5" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.524251 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8476cbd2-082f-4766-94ed-8cf08a01c98a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9kl2c\" (UID: \"8476cbd2-082f-4766-94ed-8cf08a01c98a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9kl2c" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.524267 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgz97\" (UniqueName: \"kubernetes.io/projected/7fd40608-be63-40ae-9e2f-d0969c399390-kube-api-access-xgz97\") pod \"olm-operator-6b444d44fb-2bkdt\" (UID: \"7fd40608-be63-40ae-9e2f-d0969c399390\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2bkdt" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.524336 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0822fc90-2e55-414e-8381-00d89382a00f-metrics-tls\") pod \"ingress-operator-5b745b69d9-rjdjq\" (UID: \"0822fc90-2e55-414e-8381-00d89382a00f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rjdjq" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.524436 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/3c3e4726-bb4a-45be-9c3a-a791c4a42380-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-lxn5q\" (UID: \"3c3e4726-bb4a-45be-9c3a-a791c4a42380\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lxn5q" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.524518 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t59z2\" (UniqueName: \"kubernetes.io/projected/e10181e4-ca00-44ad-8153-7a2b1d8c7897-kube-api-access-t59z2\") pod \"service-ca-9c57cc56f-k7qk8\" (UID: \"e10181e4-ca00-44ad-8153-7a2b1d8c7897\") " pod="openshift-service-ca/service-ca-9c57cc56f-k7qk8" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.524537 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l7bn\" (UniqueName: 
\"kubernetes.io/projected/dd360541-a4f3-4d2f-8085-e467feebb007-kube-api-access-9l7bn\") pod \"package-server-manager-789f6589d5-rhw5h\" (UID: \"dd360541-a4f3-4d2f-8085-e467feebb007\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhw5h" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.524563 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84pcr\" (UniqueName: \"kubernetes.io/projected/9e008272-f83f-420d-848c-a05f8dfef580-kube-api-access-84pcr\") pod \"machine-config-controller-84d6567774-ql2q5\" (UID: \"9e008272-f83f-420d-848c-a05f8dfef580\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ql2q5" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.524579 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktvcj\" (UniqueName: \"kubernetes.io/projected/3cb8a7d5-4437-460f-aad6-cf3b1df5a6fa-kube-api-access-ktvcj\") pod \"cluster-image-registry-operator-dc59b4c8b-jldpx\" (UID: \"3cb8a7d5-4437-460f-aad6-cf3b1df5a6fa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jldpx" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.524633 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cfbeb8db-2612-468d-8354-32ee6373f57e-registry-tls\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.524654 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7ad8d4f-d958-43b7-b84d-c8672642d21b-config-volume\") pod \"collect-profiles-29557950-j78sv\" (UID: 
\"d7ad8d4f-d958-43b7-b84d-c8672642d21b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-j78sv" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.524669 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47ww6\" (UniqueName: \"kubernetes.io/projected/6047b878-24b2-43a1-8afb-3321319d2a1b-kube-api-access-47ww6\") pod \"service-ca-operator-777779d784-d48xp\" (UID: \"6047b878-24b2-43a1-8afb-3321319d2a1b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-d48xp" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.524701 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfztb\" (UniqueName: \"kubernetes.io/projected/266a4a60-8fb5-4685-b4ac-621f93829611-kube-api-access-wfztb\") pod \"auto-csr-approver-29557950-fd7np\" (UID: \"266a4a60-8fb5-4685-b4ac-621f93829611\") " pod="openshift-infra/auto-csr-approver-29557950-fd7np" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.524719 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3cb8a7d5-4437-460f-aad6-cf3b1df5a6fa-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-jldpx\" (UID: \"3cb8a7d5-4437-460f-aad6-cf3b1df5a6fa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jldpx" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.524765 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3cb8a7d5-4437-460f-aad6-cf3b1df5a6fa-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-jldpx\" (UID: \"3cb8a7d5-4437-460f-aad6-cf3b1df5a6fa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jldpx" Mar 14 08:31:21 crc kubenswrapper[4886]: 
I0314 08:31:21.524782 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7fd40608-be63-40ae-9e2f-d0969c399390-srv-cert\") pod \"olm-operator-6b444d44fb-2bkdt\" (UID: \"7fd40608-be63-40ae-9e2f-d0969c399390\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2bkdt" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.524840 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8476cbd2-082f-4766-94ed-8cf08a01c98a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9kl2c\" (UID: \"8476cbd2-082f-4766-94ed-8cf08a01c98a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9kl2c" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.524871 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a109bed2-b994-4808-acac-56741af98cca-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kwc7b\" (UID: \"a109bed2-b994-4808-acac-56741af98cca\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kwc7b" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.524886 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea7323d6-f41b-4251-ae88-aa34a5714182-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8lxpz\" (UID: \"ea7323d6-f41b-4251-ae88-aa34a5714182\") " pod="openshift-marketplace/marketplace-operator-79b997595-8lxpz" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.524917 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/cfbeb8db-2612-468d-8354-32ee6373f57e-registry-certificates\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.524932 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6047b878-24b2-43a1-8afb-3321319d2a1b-config\") pod \"service-ca-operator-777779d784-d48xp\" (UID: \"6047b878-24b2-43a1-8afb-3321319d2a1b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-d48xp" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.524952 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbhj5\" (UniqueName: \"kubernetes.io/projected/0822fc90-2e55-414e-8381-00d89382a00f-kube-api-access-vbhj5\") pod \"ingress-operator-5b745b69d9-rjdjq\" (UID: \"0822fc90-2e55-414e-8381-00d89382a00f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rjdjq" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.524979 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxgf4\" (UniqueName: \"kubernetes.io/projected/39a6cfec-d809-46d0-8ef2-bde4b2d99c62-kube-api-access-mxgf4\") pod \"catalog-operator-68c6474976-n62ck\" (UID: \"39a6cfec-d809-46d0-8ef2-bde4b2d99c62\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n62ck" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.527003 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8476cbd2-082f-4766-94ed-8cf08a01c98a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9kl2c\" (UID: \"8476cbd2-082f-4766-94ed-8cf08a01c98a\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9kl2c" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.527057 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgrdx\" (UniqueName: \"kubernetes.io/projected/6cf72daf-2f75-41d3-b94b-51479ba7d2cf-kube-api-access-cgrdx\") pod \"migrator-59844c95c7-bgwrn\" (UID: \"6cf72daf-2f75-41d3-b94b-51479ba7d2cf\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bgwrn" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.527204 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/df316737-4efd-4f41-a5ff-46740ee48d48-images\") pod \"machine-config-operator-74547568cd-5984c\" (UID: \"df316737-4efd-4f41-a5ff-46740ee48d48\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5984c" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.527221 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd360541-a4f3-4d2f-8085-e467feebb007-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rhw5h\" (UID: \"dd360541-a4f3-4d2f-8085-e467feebb007\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhw5h" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.527243 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9e008272-f83f-420d-848c-a05f8dfef580-proxy-tls\") pod \"machine-config-controller-84d6567774-ql2q5\" (UID: \"9e008272-f83f-420d-848c-a05f8dfef580\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ql2q5" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.527322 4886 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cfbeb8db-2612-468d-8354-32ee6373f57e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.527411 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt57x\" (UniqueName: \"kubernetes.io/projected/2befe321-cfe9-4032-b949-3de718efbf7c-kube-api-access-vt57x\") pod \"multus-admission-controller-857f4d67dd-tfh9h\" (UID: \"2befe321-cfe9-4032-b949-3de718efbf7c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-tfh9h" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.527470 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7fd40608-be63-40ae-9e2f-d0969c399390-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2bkdt\" (UID: \"7fd40608-be63-40ae-9e2f-d0969c399390\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2bkdt" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.527485 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85wdd\" (UniqueName: \"kubernetes.io/projected/ea7323d6-f41b-4251-ae88-aa34a5714182-kube-api-access-85wdd\") pod \"marketplace-operator-79b997595-8lxpz\" (UID: \"ea7323d6-f41b-4251-ae88-aa34a5714182\") " pod="openshift-marketplace/marketplace-operator-79b997595-8lxpz" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.527500 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/cfbeb8db-2612-468d-8354-32ee6373f57e-bound-sa-token\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.527516 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/df316737-4efd-4f41-a5ff-46740ee48d48-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5984c\" (UID: \"df316737-4efd-4f41-a5ff-46740ee48d48\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5984c" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.527553 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e10181e4-ca00-44ad-8153-7a2b1d8c7897-signing-cabundle\") pod \"service-ca-9c57cc56f-k7qk8\" (UID: \"e10181e4-ca00-44ad-8153-7a2b1d8c7897\") " pod="openshift-service-ca/service-ca-9c57cc56f-k7qk8" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.527594 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h2dz\" (UniqueName: \"kubernetes.io/projected/cfbeb8db-2612-468d-8354-32ee6373f57e-kube-api-access-7h2dz\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.527624 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0822fc90-2e55-414e-8381-00d89382a00f-trusted-ca\") pod \"ingress-operator-5b745b69d9-rjdjq\" (UID: \"0822fc90-2e55-414e-8381-00d89382a00f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rjdjq" Mar 14 
08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.527639 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a109bed2-b994-4808-acac-56741af98cca-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kwc7b\" (UID: \"a109bed2-b994-4808-acac-56741af98cca\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kwc7b" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.527655 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e10181e4-ca00-44ad-8153-7a2b1d8c7897-signing-key\") pod \"service-ca-9c57cc56f-k7qk8\" (UID: \"e10181e4-ca00-44ad-8153-7a2b1d8c7897\") " pod="openshift-service-ca/service-ca-9c57cc56f-k7qk8" Mar 14 08:31:21 crc kubenswrapper[4886]: E0314 08:31:21.529580 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:22.029557469 +0000 UTC m=+217.278009206 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.562161 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jm5tw" event={"ID":"37d01daf-edd5-4dd2-8a4a-40165f8d0275","Type":"ContainerStarted","Data":"07f006450c23ceb432c9b1cbfd958f9ef77a8c20552019effa9492e50a71627e"} Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.577771 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zhsfs" event={"ID":"54b85e17-fe41-4ae4-8e77-a3654421f751","Type":"ContainerStarted","Data":"85b28f75f06fc13fb45061657cc85e69bdcc13abb67b852fc41bfd39ab9651bb"} Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.580741 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xv44"] Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.630521 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" event={"ID":"c1744255-6bdc-4c0d-9f1e-70119127e182","Type":"ContainerStarted","Data":"2acaf841a6e11affa0ca1c629cbb1d17e604a071520ea4d96c3a8dbea467ee65"} Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.630946 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-8lqwb" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.633860 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.634099 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a93d506-f295-45af-9692-27b4da556007-metrics-tls\") pod \"dns-default-c9qzb\" (UID: \"5a93d506-f295-45af-9692-27b4da556007\") " pod="openshift-dns/dns-default-c9qzb" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.634158 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8476cbd2-082f-4766-94ed-8cf08a01c98a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9kl2c\" (UID: \"8476cbd2-082f-4766-94ed-8cf08a01c98a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9kl2c" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.634181 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgrdx\" (UniqueName: \"kubernetes.io/projected/6cf72daf-2f75-41d3-b94b-51479ba7d2cf-kube-api-access-cgrdx\") pod \"migrator-59844c95c7-bgwrn\" (UID: \"6cf72daf-2f75-41d3-b94b-51479ba7d2cf\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bgwrn" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.634207 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/df316737-4efd-4f41-a5ff-46740ee48d48-images\") pod 
\"machine-config-operator-74547568cd-5984c\" (UID: \"df316737-4efd-4f41-a5ff-46740ee48d48\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5984c" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.634226 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd360541-a4f3-4d2f-8085-e467feebb007-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rhw5h\" (UID: \"dd360541-a4f3-4d2f-8085-e467feebb007\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhw5h" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.634257 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9e008272-f83f-420d-848c-a05f8dfef580-proxy-tls\") pod \"machine-config-controller-84d6567774-ql2q5\" (UID: \"9e008272-f83f-420d-848c-a05f8dfef580\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ql2q5" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.634290 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cfbeb8db-2612-468d-8354-32ee6373f57e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.634306 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5a93d506-f295-45af-9692-27b4da556007-config-volume\") pod \"dns-default-c9qzb\" (UID: \"5a93d506-f295-45af-9692-27b4da556007\") " pod="openshift-dns/dns-default-c9qzb" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.634327 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnb6p\" (UniqueName: \"kubernetes.io/projected/c559a474-d30d-435f-8e7e-5f9d56884bac-kube-api-access-wnb6p\") pod \"machine-config-server-xcvj2\" (UID: \"c559a474-d30d-435f-8e7e-5f9d56884bac\") " pod="openshift-machine-config-operator/machine-config-server-xcvj2" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.634353 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt57x\" (UniqueName: \"kubernetes.io/projected/2befe321-cfe9-4032-b949-3de718efbf7c-kube-api-access-vt57x\") pod \"multus-admission-controller-857f4d67dd-tfh9h\" (UID: \"2befe321-cfe9-4032-b949-3de718efbf7c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-tfh9h" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.634902 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7fd40608-be63-40ae-9e2f-d0969c399390-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2bkdt\" (UID: \"7fd40608-be63-40ae-9e2f-d0969c399390\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2bkdt" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.634926 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85wdd\" (UniqueName: \"kubernetes.io/projected/ea7323d6-f41b-4251-ae88-aa34a5714182-kube-api-access-85wdd\") pod \"marketplace-operator-79b997595-8lxpz\" (UID: \"ea7323d6-f41b-4251-ae88-aa34a5714182\") " pod="openshift-marketplace/marketplace-operator-79b997595-8lxpz" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.634941 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfbeb8db-2612-468d-8354-32ee6373f57e-bound-sa-token\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") 
" pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.634955 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/df316737-4efd-4f41-a5ff-46740ee48d48-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5984c\" (UID: \"df316737-4efd-4f41-a5ff-46740ee48d48\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5984c" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.634974 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e10181e4-ca00-44ad-8153-7a2b1d8c7897-signing-cabundle\") pod \"service-ca-9c57cc56f-k7qk8\" (UID: \"e10181e4-ca00-44ad-8153-7a2b1d8c7897\") " pod="openshift-service-ca/service-ca-9c57cc56f-k7qk8" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.634998 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsz24\" (UniqueName: \"kubernetes.io/projected/6b322a91-a747-474d-9111-883781a4f012-kube-api-access-zsz24\") pod \"ingress-canary-qj8pq\" (UID: \"6b322a91-a747-474d-9111-883781a4f012\") " pod="openshift-ingress-canary/ingress-canary-qj8pq" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.635016 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h2dz\" (UniqueName: \"kubernetes.io/projected/cfbeb8db-2612-468d-8354-32ee6373f57e-kube-api-access-7h2dz\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.635036 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/0822fc90-2e55-414e-8381-00d89382a00f-trusted-ca\") pod \"ingress-operator-5b745b69d9-rjdjq\" (UID: \"0822fc90-2e55-414e-8381-00d89382a00f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rjdjq" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.635053 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a109bed2-b994-4808-acac-56741af98cca-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kwc7b\" (UID: \"a109bed2-b994-4808-acac-56741af98cca\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kwc7b" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.635068 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e10181e4-ca00-44ad-8153-7a2b1d8c7897-signing-key\") pod \"service-ca-9c57cc56f-k7qk8\" (UID: \"e10181e4-ca00-44ad-8153-7a2b1d8c7897\") " pod="openshift-service-ca/service-ca-9c57cc56f-k7qk8" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.635146 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/86449941-3f6f-4c02-a717-ae9d48a7c464-plugins-dir\") pod \"csi-hostpathplugin-vdd8t\" (UID: \"86449941-3f6f-4c02-a717-ae9d48a7c464\") " pod="hostpath-provisioner/csi-hostpathplugin-vdd8t" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.635166 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f276fd1b-0a21-47c7-95a4-7ccc355773ab-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9brkw\" (UID: \"f276fd1b-0a21-47c7-95a4-7ccc355773ab\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9brkw" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.635184 4886 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2befe321-cfe9-4032-b949-3de718efbf7c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-tfh9h\" (UID: \"2befe321-cfe9-4032-b949-3de718efbf7c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-tfh9h" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.635199 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6047b878-24b2-43a1-8afb-3321319d2a1b-serving-cert\") pod \"service-ca-operator-777779d784-d48xp\" (UID: \"6047b878-24b2-43a1-8afb-3321319d2a1b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-d48xp" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.635222 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnxrt\" (UniqueName: \"kubernetes.io/projected/df316737-4efd-4f41-a5ff-46740ee48d48-kube-api-access-mnxrt\") pod \"machine-config-operator-74547568cd-5984c\" (UID: \"df316737-4efd-4f41-a5ff-46740ee48d48\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5984c" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.635248 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df316737-4efd-4f41-a5ff-46740ee48d48-proxy-tls\") pod \"machine-config-operator-74547568cd-5984c\" (UID: \"df316737-4efd-4f41-a5ff-46740ee48d48\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5984c" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.635264 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ea7323d6-f41b-4251-ae88-aa34a5714182-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8lxpz\" (UID: 
\"ea7323d6-f41b-4251-ae88-aa34a5714182\") " pod="openshift-marketplace/marketplace-operator-79b997595-8lxpz" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.635292 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cfbeb8db-2612-468d-8354-32ee6373f57e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.635307 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f276fd1b-0a21-47c7-95a4-7ccc355773ab-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9brkw\" (UID: \"f276fd1b-0a21-47c7-95a4-7ccc355773ab\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9brkw" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.635324 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3cb8a7d5-4437-460f-aad6-cf3b1df5a6fa-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-jldpx\" (UID: \"3cb8a7d5-4437-460f-aad6-cf3b1df5a6fa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jldpx" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.635341 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c559a474-d30d-435f-8e7e-5f9d56884bac-certs\") pod \"machine-config-server-xcvj2\" (UID: \"c559a474-d30d-435f-8e7e-5f9d56884bac\") " pod="openshift-machine-config-operator/machine-config-server-xcvj2" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.635358 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/cfbeb8db-2612-468d-8354-32ee6373f57e-trusted-ca\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.635375 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz89g\" (UniqueName: \"kubernetes.io/projected/a109bed2-b994-4808-acac-56741af98cca-kube-api-access-pz89g\") pod \"kube-storage-version-migrator-operator-b67b599dd-kwc7b\" (UID: \"a109bed2-b994-4808-acac-56741af98cca\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kwc7b" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.635391 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4bc7\" (UniqueName: \"kubernetes.io/projected/3c3e4726-bb4a-45be-9c3a-a791c4a42380-kube-api-access-b4bc7\") pod \"control-plane-machine-set-operator-78cbb6b69f-lxn5q\" (UID: \"3c3e4726-bb4a-45be-9c3a-a791c4a42380\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lxn5q" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.635410 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0822fc90-2e55-414e-8381-00d89382a00f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rjdjq\" (UID: \"0822fc90-2e55-414e-8381-00d89382a00f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rjdjq" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636162 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cfbeb8db-2612-468d-8354-32ee6373f57e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.635427 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/39a6cfec-d809-46d0-8ef2-bde4b2d99c62-profile-collector-cert\") pod \"catalog-operator-68c6474976-n62ck\" (UID: \"39a6cfec-d809-46d0-8ef2-bde4b2d99c62\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n62ck" Mar 14 08:31:21 crc kubenswrapper[4886]: E0314 08:31:21.636258 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:22.136239808 +0000 UTC m=+217.384691445 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636284 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b322a91-a747-474d-9111-883781a4f012-cert\") pod \"ingress-canary-qj8pq\" (UID: \"6b322a91-a747-474d-9111-883781a4f012\") " pod="openshift-ingress-canary/ingress-canary-qj8pq" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636320 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f276fd1b-0a21-47c7-95a4-7ccc355773ab-config\") pod 
\"kube-apiserver-operator-766d6c64bb-9brkw\" (UID: \"f276fd1b-0a21-47c7-95a4-7ccc355773ab\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9brkw" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636343 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/39a6cfec-d809-46d0-8ef2-bde4b2d99c62-srv-cert\") pod \"catalog-operator-68c6474976-n62ck\" (UID: \"39a6cfec-d809-46d0-8ef2-bde4b2d99c62\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n62ck" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636386 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636428 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7ad8d4f-d958-43b7-b84d-c8672642d21b-secret-volume\") pod \"collect-profiles-29557950-j78sv\" (UID: \"d7ad8d4f-d958-43b7-b84d-c8672642d21b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-j78sv" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636445 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnbgs\" (UniqueName: \"kubernetes.io/projected/d7ad8d4f-d958-43b7-b84d-c8672642d21b-kube-api-access-lnbgs\") pod \"collect-profiles-29557950-j78sv\" (UID: \"d7ad8d4f-d958-43b7-b84d-c8672642d21b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-j78sv" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636464 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/86449941-3f6f-4c02-a717-ae9d48a7c464-socket-dir\") pod \"csi-hostpathplugin-vdd8t\" (UID: \"86449941-3f6f-4c02-a717-ae9d48a7c464\") " pod="hostpath-provisioner/csi-hostpathplugin-vdd8t" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636481 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49nj9\" (UniqueName: \"kubernetes.io/projected/86449941-3f6f-4c02-a717-ae9d48a7c464-kube-api-access-49nj9\") pod \"csi-hostpathplugin-vdd8t\" (UID: \"86449941-3f6f-4c02-a717-ae9d48a7c464\") " pod="hostpath-provisioner/csi-hostpathplugin-vdd8t" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636506 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9e008272-f83f-420d-848c-a05f8dfef580-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ql2q5\" (UID: \"9e008272-f83f-420d-848c-a05f8dfef580\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ql2q5" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636523 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8476cbd2-082f-4766-94ed-8cf08a01c98a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9kl2c\" (UID: \"8476cbd2-082f-4766-94ed-8cf08a01c98a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9kl2c" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636542 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgz97\" (UniqueName: \"kubernetes.io/projected/7fd40608-be63-40ae-9e2f-d0969c399390-kube-api-access-xgz97\") pod \"olm-operator-6b444d44fb-2bkdt\" (UID: \"7fd40608-be63-40ae-9e2f-d0969c399390\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2bkdt" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636565 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/86449941-3f6f-4c02-a717-ae9d48a7c464-registration-dir\") pod \"csi-hostpathplugin-vdd8t\" (UID: \"86449941-3f6f-4c02-a717-ae9d48a7c464\") " pod="hostpath-provisioner/csi-hostpathplugin-vdd8t" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636584 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/86449941-3f6f-4c02-a717-ae9d48a7c464-csi-data-dir\") pod \"csi-hostpathplugin-vdd8t\" (UID: \"86449941-3f6f-4c02-a717-ae9d48a7c464\") " pod="hostpath-provisioner/csi-hostpathplugin-vdd8t" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636612 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0822fc90-2e55-414e-8381-00d89382a00f-metrics-tls\") pod \"ingress-operator-5b745b69d9-rjdjq\" (UID: \"0822fc90-2e55-414e-8381-00d89382a00f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rjdjq" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636630 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/86449941-3f6f-4c02-a717-ae9d48a7c464-mountpoint-dir\") pod \"csi-hostpathplugin-vdd8t\" (UID: \"86449941-3f6f-4c02-a717-ae9d48a7c464\") " pod="hostpath-provisioner/csi-hostpathplugin-vdd8t" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636651 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/3c3e4726-bb4a-45be-9c3a-a791c4a42380-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-lxn5q\" (UID: \"3c3e4726-bb4a-45be-9c3a-a791c4a42380\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lxn5q" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636673 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr7gd\" (UniqueName: \"kubernetes.io/projected/5a93d506-f295-45af-9692-27b4da556007-kube-api-access-jr7gd\") pod \"dns-default-c9qzb\" (UID: \"5a93d506-f295-45af-9692-27b4da556007\") " pod="openshift-dns/dns-default-c9qzb" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636691 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c559a474-d30d-435f-8e7e-5f9d56884bac-node-bootstrap-token\") pod \"machine-config-server-xcvj2\" (UID: \"c559a474-d30d-435f-8e7e-5f9d56884bac\") " pod="openshift-machine-config-operator/machine-config-server-xcvj2" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636725 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t59z2\" (UniqueName: \"kubernetes.io/projected/e10181e4-ca00-44ad-8153-7a2b1d8c7897-kube-api-access-t59z2\") pod \"service-ca-9c57cc56f-k7qk8\" (UID: \"e10181e4-ca00-44ad-8153-7a2b1d8c7897\") " pod="openshift-service-ca/service-ca-9c57cc56f-k7qk8" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636743 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l7bn\" (UniqueName: \"kubernetes.io/projected/dd360541-a4f3-4d2f-8085-e467feebb007-kube-api-access-9l7bn\") pod \"package-server-manager-789f6589d5-rhw5h\" (UID: \"dd360541-a4f3-4d2f-8085-e467feebb007\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhw5h" 
Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636760 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84pcr\" (UniqueName: \"kubernetes.io/projected/9e008272-f83f-420d-848c-a05f8dfef580-kube-api-access-84pcr\") pod \"machine-config-controller-84d6567774-ql2q5\" (UID: \"9e008272-f83f-420d-848c-a05f8dfef580\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ql2q5" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636778 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktvcj\" (UniqueName: \"kubernetes.io/projected/3cb8a7d5-4437-460f-aad6-cf3b1df5a6fa-kube-api-access-ktvcj\") pod \"cluster-image-registry-operator-dc59b4c8b-jldpx\" (UID: \"3cb8a7d5-4437-460f-aad6-cf3b1df5a6fa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jldpx" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636811 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cfbeb8db-2612-468d-8354-32ee6373f57e-registry-tls\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636827 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7ad8d4f-d958-43b7-b84d-c8672642d21b-config-volume\") pod \"collect-profiles-29557950-j78sv\" (UID: \"d7ad8d4f-d958-43b7-b84d-c8672642d21b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-j78sv" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636844 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47ww6\" (UniqueName: 
\"kubernetes.io/projected/6047b878-24b2-43a1-8afb-3321319d2a1b-kube-api-access-47ww6\") pod \"service-ca-operator-777779d784-d48xp\" (UID: \"6047b878-24b2-43a1-8afb-3321319d2a1b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-d48xp" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636863 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfztb\" (UniqueName: \"kubernetes.io/projected/266a4a60-8fb5-4685-b4ac-621f93829611-kube-api-access-wfztb\") pod \"auto-csr-approver-29557950-fd7np\" (UID: \"266a4a60-8fb5-4685-b4ac-621f93829611\") " pod="openshift-infra/auto-csr-approver-29557950-fd7np" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636881 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3cb8a7d5-4437-460f-aad6-cf3b1df5a6fa-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-jldpx\" (UID: \"3cb8a7d5-4437-460f-aad6-cf3b1df5a6fa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jldpx" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636902 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3cb8a7d5-4437-460f-aad6-cf3b1df5a6fa-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-jldpx\" (UID: \"3cb8a7d5-4437-460f-aad6-cf3b1df5a6fa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jldpx" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636918 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7fd40608-be63-40ae-9e2f-d0969c399390-srv-cert\") pod \"olm-operator-6b444d44fb-2bkdt\" (UID: \"7fd40608-be63-40ae-9e2f-d0969c399390\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2bkdt" Mar 14 08:31:21 
crc kubenswrapper[4886]: I0314 08:31:21.636940 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8476cbd2-082f-4766-94ed-8cf08a01c98a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9kl2c\" (UID: \"8476cbd2-082f-4766-94ed-8cf08a01c98a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9kl2c" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636967 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a109bed2-b994-4808-acac-56741af98cca-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kwc7b\" (UID: \"a109bed2-b994-4808-acac-56741af98cca\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kwc7b" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.636985 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea7323d6-f41b-4251-ae88-aa34a5714182-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8lxpz\" (UID: \"ea7323d6-f41b-4251-ae88-aa34a5714182\") " pod="openshift-marketplace/marketplace-operator-79b997595-8lxpz" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.637025 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cfbeb8db-2612-468d-8354-32ee6373f57e-registry-certificates\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.637041 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6047b878-24b2-43a1-8afb-3321319d2a1b-config\") pod \"service-ca-operator-777779d784-d48xp\" (UID: \"6047b878-24b2-43a1-8afb-3321319d2a1b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-d48xp" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.637060 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbhj5\" (UniqueName: \"kubernetes.io/projected/0822fc90-2e55-414e-8381-00d89382a00f-kube-api-access-vbhj5\") pod \"ingress-operator-5b745b69d9-rjdjq\" (UID: \"0822fc90-2e55-414e-8381-00d89382a00f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rjdjq" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.637084 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxgf4\" (UniqueName: \"kubernetes.io/projected/39a6cfec-d809-46d0-8ef2-bde4b2d99c62-kube-api-access-mxgf4\") pod \"catalog-operator-68c6474976-n62ck\" (UID: \"39a6cfec-d809-46d0-8ef2-bde4b2d99c62\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n62ck" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.640099 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0822fc90-2e55-414e-8381-00d89382a00f-trusted-ca\") pod \"ingress-operator-5b745b69d9-rjdjq\" (UID: \"0822fc90-2e55-414e-8381-00d89382a00f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rjdjq" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.641462 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/df316737-4efd-4f41-a5ff-46740ee48d48-images\") pod \"machine-config-operator-74547568cd-5984c\" (UID: \"df316737-4efd-4f41-a5ff-46740ee48d48\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5984c" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.642942 4886 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-hznhw" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.647855 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/39a6cfec-d809-46d0-8ef2-bde4b2d99c62-profile-collector-cert\") pod \"catalog-operator-68c6474976-n62ck\" (UID: \"39a6cfec-d809-46d0-8ef2-bde4b2d99c62\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n62ck" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.648535 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3cb8a7d5-4437-460f-aad6-cf3b1df5a6fa-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-jldpx\" (UID: \"3cb8a7d5-4437-460f-aad6-cf3b1df5a6fa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jldpx" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.649409 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" event={"ID":"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119","Type":"ContainerStarted","Data":"495d70215f57e9fc4e7fe393cc61efe3358d357882dd3fac034744d66ad2a588"} Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.649450 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" event={"ID":"ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119","Type":"ContainerStarted","Data":"c07d8d5dbe98a876411a491d00271a48174b0327844776d719e487a370a7ec27"} Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.650181 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfbeb8db-2612-468d-8354-32ee6373f57e-trusted-ca\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.652309 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df316737-4efd-4f41-a5ff-46740ee48d48-proxy-tls\") pod \"machine-config-operator-74547568cd-5984c\" (UID: \"df316737-4efd-4f41-a5ff-46740ee48d48\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5984c" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.653054 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/3c3e4726-bb4a-45be-9c3a-a791c4a42380-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-lxn5q\" (UID: \"3c3e4726-bb4a-45be-9c3a-a791c4a42380\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lxn5q" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.654014 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/df316737-4efd-4f41-a5ff-46740ee48d48-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5984c\" (UID: \"df316737-4efd-4f41-a5ff-46740ee48d48\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5984c" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.656510 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9e008272-f83f-420d-848c-a05f8dfef580-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ql2q5\" (UID: \"9e008272-f83f-420d-848c-a05f8dfef580\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ql2q5" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.658694 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/9e008272-f83f-420d-848c-a05f8dfef580-proxy-tls\") pod \"machine-config-controller-84d6567774-ql2q5\" (UID: \"9e008272-f83f-420d-848c-a05f8dfef580\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ql2q5" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.659390 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e10181e4-ca00-44ad-8153-7a2b1d8c7897-signing-key\") pod \"service-ca-9c57cc56f-k7qk8\" (UID: \"e10181e4-ca00-44ad-8153-7a2b1d8c7897\") " pod="openshift-service-ca/service-ca-9c57cc56f-k7qk8" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.660388 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cfbeb8db-2612-468d-8354-32ee6373f57e-registry-certificates\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.661409 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea7323d6-f41b-4251-ae88-aa34a5714182-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8lxpz\" (UID: \"ea7323d6-f41b-4251-ae88-aa34a5714182\") " pod="openshift-marketplace/marketplace-operator-79b997595-8lxpz" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.662236 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dzmw" event={"ID":"e934f0a9-f87b-4f86-8aad-47fa6927a3a6","Type":"ContainerStarted","Data":"4c96a8de06c863204db7ca8783923866bf242312c72eec21a1e4a558dd688788"} Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.662262 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dzmw" event={"ID":"e934f0a9-f87b-4f86-8aad-47fa6927a3a6","Type":"ContainerStarted","Data":"ccce6e5a54700b10189604d4fdd0a4d553e7f3a2b25f2ff87b554fcb417464bc"} Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.662971 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cfbeb8db-2612-468d-8354-32ee6373f57e-registry-tls\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.663561 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7fd40608-be63-40ae-9e2f-d0969c399390-srv-cert\") pod \"olm-operator-6b444d44fb-2bkdt\" (UID: \"7fd40608-be63-40ae-9e2f-d0969c399390\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2bkdt" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.664108 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2befe321-cfe9-4032-b949-3de718efbf7c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-tfh9h\" (UID: \"2befe321-cfe9-4032-b949-3de718efbf7c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-tfh9h" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.664921 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f276fd1b-0a21-47c7-95a4-7ccc355773ab-config\") pod \"kube-apiserver-operator-766d6c64bb-9brkw\" (UID: \"f276fd1b-0a21-47c7-95a4-7ccc355773ab\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9brkw" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.665012 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/6047b878-24b2-43a1-8afb-3321319d2a1b-config\") pod \"service-ca-operator-777779d784-d48xp\" (UID: \"6047b878-24b2-43a1-8afb-3321319d2a1b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-d48xp" Mar 14 08:31:21 crc kubenswrapper[4886]: E0314 08:31:21.670386 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:22.170372206 +0000 UTC m=+217.418823843 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.672989 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3cb8a7d5-4437-460f-aad6-cf3b1df5a6fa-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-jldpx\" (UID: \"3cb8a7d5-4437-460f-aad6-cf3b1df5a6fa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jldpx" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.673450 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd360541-a4f3-4d2f-8085-e467feebb007-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rhw5h\" (UID: \"dd360541-a4f3-4d2f-8085-e467feebb007\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhw5h" 
Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.673715 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-r4qzm" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.673771 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e10181e4-ca00-44ad-8153-7a2b1d8c7897-signing-cabundle\") pod \"service-ca-9c57cc56f-k7qk8\" (UID: \"e10181e4-ca00-44ad-8153-7a2b1d8c7897\") " pod="openshift-service-ca/service-ca-9c57cc56f-k7qk8" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.688341 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0822fc90-2e55-414e-8381-00d89382a00f-metrics-tls\") pod \"ingress-operator-5b745b69d9-rjdjq\" (UID: \"0822fc90-2e55-414e-8381-00d89382a00f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rjdjq" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.681642 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7fd40608-be63-40ae-9e2f-d0969c399390-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2bkdt\" (UID: \"7fd40608-be63-40ae-9e2f-d0969c399390\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2bkdt" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.679921 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6047b878-24b2-43a1-8afb-3321319d2a1b-serving-cert\") pod \"service-ca-operator-777779d784-d48xp\" (UID: \"6047b878-24b2-43a1-8afb-3321319d2a1b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-d48xp" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.694212 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f276fd1b-0a21-47c7-95a4-7ccc355773ab-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9brkw\" (UID: \"f276fd1b-0a21-47c7-95a4-7ccc355773ab\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9brkw" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.694544 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cfbeb8db-2612-468d-8354-32ee6373f57e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.694856 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ea7323d6-f41b-4251-ae88-aa34a5714182-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8lxpz\" (UID: \"ea7323d6-f41b-4251-ae88-aa34a5714182\") " pod="openshift-marketplace/marketplace-operator-79b997595-8lxpz" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.737537 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.737747 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/86449941-3f6f-4c02-a717-ae9d48a7c464-plugins-dir\") pod \"csi-hostpathplugin-vdd8t\" (UID: \"86449941-3f6f-4c02-a717-ae9d48a7c464\") " pod="hostpath-provisioner/csi-hostpathplugin-vdd8t" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.737820 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c559a474-d30d-435f-8e7e-5f9d56884bac-certs\") pod \"machine-config-server-xcvj2\" (UID: \"c559a474-d30d-435f-8e7e-5f9d56884bac\") " pod="openshift-machine-config-operator/machine-config-server-xcvj2" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.737862 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b322a91-a747-474d-9111-883781a4f012-cert\") pod \"ingress-canary-qj8pq\" (UID: \"6b322a91-a747-474d-9111-883781a4f012\") " pod="openshift-ingress-canary/ingress-canary-qj8pq" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.737918 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/86449941-3f6f-4c02-a717-ae9d48a7c464-socket-dir\") pod \"csi-hostpathplugin-vdd8t\" (UID: \"86449941-3f6f-4c02-a717-ae9d48a7c464\") " pod="hostpath-provisioner/csi-hostpathplugin-vdd8t" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.737939 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49nj9\" (UniqueName: \"kubernetes.io/projected/86449941-3f6f-4c02-a717-ae9d48a7c464-kube-api-access-49nj9\") pod \"csi-hostpathplugin-vdd8t\" (UID: \"86449941-3f6f-4c02-a717-ae9d48a7c464\") " pod="hostpath-provisioner/csi-hostpathplugin-vdd8t" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.737991 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/86449941-3f6f-4c02-a717-ae9d48a7c464-registration-dir\") pod \"csi-hostpathplugin-vdd8t\" (UID: \"86449941-3f6f-4c02-a717-ae9d48a7c464\") " pod="hostpath-provisioner/csi-hostpathplugin-vdd8t" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.738007 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/86449941-3f6f-4c02-a717-ae9d48a7c464-csi-data-dir\") pod \"csi-hostpathplugin-vdd8t\" (UID: \"86449941-3f6f-4c02-a717-ae9d48a7c464\") " pod="hostpath-provisioner/csi-hostpathplugin-vdd8t" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.738041 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/86449941-3f6f-4c02-a717-ae9d48a7c464-mountpoint-dir\") pod \"csi-hostpathplugin-vdd8t\" (UID: \"86449941-3f6f-4c02-a717-ae9d48a7c464\") " pod="hostpath-provisioner/csi-hostpathplugin-vdd8t" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.738058 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c559a474-d30d-435f-8e7e-5f9d56884bac-node-bootstrap-token\") pod \"machine-config-server-xcvj2\" (UID: \"c559a474-d30d-435f-8e7e-5f9d56884bac\") " pod="openshift-machine-config-operator/machine-config-server-xcvj2" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.738077 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr7gd\" (UniqueName: \"kubernetes.io/projected/5a93d506-f295-45af-9692-27b4da556007-kube-api-access-jr7gd\") pod \"dns-default-c9qzb\" (UID: \"5a93d506-f295-45af-9692-27b4da556007\") " pod="openshift-dns/dns-default-c9qzb" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.738246 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a93d506-f295-45af-9692-27b4da556007-metrics-tls\") pod \"dns-default-c9qzb\" (UID: \"5a93d506-f295-45af-9692-27b4da556007\") " pod="openshift-dns/dns-default-c9qzb" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.738340 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/5a93d506-f295-45af-9692-27b4da556007-config-volume\") pod \"dns-default-c9qzb\" (UID: \"5a93d506-f295-45af-9692-27b4da556007\") " pod="openshift-dns/dns-default-c9qzb" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.738360 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnb6p\" (UniqueName: \"kubernetes.io/projected/c559a474-d30d-435f-8e7e-5f9d56884bac-kube-api-access-wnb6p\") pod \"machine-config-server-xcvj2\" (UID: \"c559a474-d30d-435f-8e7e-5f9d56884bac\") " pod="openshift-machine-config-operator/machine-config-server-xcvj2" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.738412 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsz24\" (UniqueName: \"kubernetes.io/projected/6b322a91-a747-474d-9111-883781a4f012-kube-api-access-zsz24\") pod \"ingress-canary-qj8pq\" (UID: \"6b322a91-a747-474d-9111-883781a4f012\") " pod="openshift-ingress-canary/ingress-canary-qj8pq" Mar 14 08:31:21 crc kubenswrapper[4886]: E0314 08:31:21.739002 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:22.238987184 +0000 UTC m=+217.487438821 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.739095 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/86449941-3f6f-4c02-a717-ae9d48a7c464-plugins-dir\") pod \"csi-hostpathplugin-vdd8t\" (UID: \"86449941-3f6f-4c02-a717-ae9d48a7c464\") " pod="hostpath-provisioner/csi-hostpathplugin-vdd8t" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.744023 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/86449941-3f6f-4c02-a717-ae9d48a7c464-csi-data-dir\") pod \"csi-hostpathplugin-vdd8t\" (UID: \"86449941-3f6f-4c02-a717-ae9d48a7c464\") " pod="hostpath-provisioner/csi-hostpathplugin-vdd8t" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.744457 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/86449941-3f6f-4c02-a717-ae9d48a7c464-registration-dir\") pod \"csi-hostpathplugin-vdd8t\" (UID: \"86449941-3f6f-4c02-a717-ae9d48a7c464\") " pod="hostpath-provisioner/csi-hostpathplugin-vdd8t" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.744605 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/86449941-3f6f-4c02-a717-ae9d48a7c464-mountpoint-dir\") pod \"csi-hostpathplugin-vdd8t\" (UID: \"86449941-3f6f-4c02-a717-ae9d48a7c464\") " pod="hostpath-provisioner/csi-hostpathplugin-vdd8t" Mar 14 08:31:21 crc 
kubenswrapper[4886]: I0314 08:31:21.746733 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/86449941-3f6f-4c02-a717-ae9d48a7c464-socket-dir\") pod \"csi-hostpathplugin-vdd8t\" (UID: \"86449941-3f6f-4c02-a717-ae9d48a7c464\") " pod="hostpath-provisioner/csi-hostpathplugin-vdd8t" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.750006 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/39a6cfec-d809-46d0-8ef2-bde4b2d99c62-srv-cert\") pod \"catalog-operator-68c6474976-n62ck\" (UID: \"39a6cfec-d809-46d0-8ef2-bde4b2d99c62\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n62ck" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.750677 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8476cbd2-082f-4766-94ed-8cf08a01c98a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9kl2c\" (UID: \"8476cbd2-082f-4766-94ed-8cf08a01c98a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9kl2c" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.760031 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxgf4\" (UniqueName: \"kubernetes.io/projected/39a6cfec-d809-46d0-8ef2-bde4b2d99c62-kube-api-access-mxgf4\") pod \"catalog-operator-68c6474976-n62ck\" (UID: \"39a6cfec-d809-46d0-8ef2-bde4b2d99c62\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n62ck" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.760570 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a109bed2-b994-4808-acac-56741af98cca-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kwc7b\" (UID: \"a109bed2-b994-4808-acac-56741af98cca\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kwc7b" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.762520 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5a93d506-f295-45af-9692-27b4da556007-config-volume\") pod \"dns-default-c9qzb\" (UID: \"5a93d506-f295-45af-9692-27b4da556007\") " pod="openshift-dns/dns-default-c9qzb" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.765887 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx"] Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.790202 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b322a91-a747-474d-9111-883781a4f012-cert\") pod \"ingress-canary-qj8pq\" (UID: \"6b322a91-a747-474d-9111-883781a4f012\") " pod="openshift-ingress-canary/ingress-canary-qj8pq" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.790601 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7ad8d4f-d958-43b7-b84d-c8672642d21b-secret-volume\") pod \"collect-profiles-29557950-j78sv\" (UID: \"d7ad8d4f-d958-43b7-b84d-c8672642d21b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-j78sv" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.793525 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt57x\" (UniqueName: \"kubernetes.io/projected/2befe321-cfe9-4032-b949-3de718efbf7c-kube-api-access-vt57x\") pod \"multus-admission-controller-857f4d67dd-tfh9h\" (UID: \"2befe321-cfe9-4032-b949-3de718efbf7c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-tfh9h" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.794644 4886 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-cgrdx\" (UniqueName: \"kubernetes.io/projected/6cf72daf-2f75-41d3-b94b-51479ba7d2cf-kube-api-access-cgrdx\") pod \"migrator-59844c95c7-bgwrn\" (UID: \"6cf72daf-2f75-41d3-b94b-51479ba7d2cf\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bgwrn" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.839378 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:21 crc kubenswrapper[4886]: E0314 08:31:21.839744 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:22.339728632 +0000 UTC m=+217.588180269 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.843333 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7ad8d4f-d958-43b7-b84d-c8672642d21b-config-volume\") pod \"collect-profiles-29557950-j78sv\" (UID: \"d7ad8d4f-d958-43b7-b84d-c8672642d21b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-j78sv" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.843333 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8476cbd2-082f-4766-94ed-8cf08a01c98a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9kl2c\" (UID: \"8476cbd2-082f-4766-94ed-8cf08a01c98a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9kl2c" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.880465 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a93d506-f295-45af-9692-27b4da556007-metrics-tls\") pod \"dns-default-c9qzb\" (UID: \"5a93d506-f295-45af-9692-27b4da556007\") " pod="openshift-dns/dns-default-c9qzb" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.880654 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a109bed2-b994-4808-acac-56741af98cca-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kwc7b\" (UID: 
\"a109bed2-b994-4808-acac-56741af98cca\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kwc7b" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.881170 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c559a474-d30d-435f-8e7e-5f9d56884bac-node-bootstrap-token\") pod \"machine-config-server-xcvj2\" (UID: \"c559a474-d30d-435f-8e7e-5f9d56884bac\") " pod="openshift-machine-config-operator/machine-config-server-xcvj2" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.883257 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c559a474-d30d-435f-8e7e-5f9d56884bac-certs\") pod \"machine-config-server-xcvj2\" (UID: \"c559a474-d30d-435f-8e7e-5f9d56884bac\") " pod="openshift-machine-config-operator/machine-config-server-xcvj2" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.883498 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnxrt\" (UniqueName: \"kubernetes.io/projected/df316737-4efd-4f41-a5ff-46740ee48d48-kube-api-access-mnxrt\") pod \"machine-config-operator-74547568cd-5984c\" (UID: \"df316737-4efd-4f41-a5ff-46740ee48d48\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5984c" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.884143 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfztb\" (UniqueName: \"kubernetes.io/projected/266a4a60-8fb5-4685-b4ac-621f93829611-kube-api-access-wfztb\") pod \"auto-csr-approver-29557950-fd7np\" (UID: \"266a4a60-8fb5-4685-b4ac-621f93829611\") " pod="openshift-infra/auto-csr-approver-29557950-fd7np" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.886717 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l7bn\" (UniqueName: 
\"kubernetes.io/projected/dd360541-a4f3-4d2f-8085-e467feebb007-kube-api-access-9l7bn\") pod \"package-server-manager-789f6589d5-rhw5h\" (UID: \"dd360541-a4f3-4d2f-8085-e467feebb007\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhw5h" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.887359 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84pcr\" (UniqueName: \"kubernetes.io/projected/9e008272-f83f-420d-848c-a05f8dfef580-kube-api-access-84pcr\") pod \"machine-config-controller-84d6567774-ql2q5\" (UID: \"9e008272-f83f-420d-848c-a05f8dfef580\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ql2q5" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.887628 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnbgs\" (UniqueName: \"kubernetes.io/projected/d7ad8d4f-d958-43b7-b84d-c8672642d21b-kube-api-access-lnbgs\") pod \"collect-profiles-29557950-j78sv\" (UID: \"d7ad8d4f-d958-43b7-b84d-c8672642d21b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-j78sv" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.897648 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t59z2\" (UniqueName: \"kubernetes.io/projected/e10181e4-ca00-44ad-8153-7a2b1d8c7897-kube-api-access-t59z2\") pod \"service-ca-9c57cc56f-k7qk8\" (UID: \"e10181e4-ca00-44ad-8153-7a2b1d8c7897\") " pod="openshift-service-ca/service-ca-9c57cc56f-k7qk8" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.897806 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktvcj\" (UniqueName: \"kubernetes.io/projected/3cb8a7d5-4437-460f-aad6-cf3b1df5a6fa-kube-api-access-ktvcj\") pod \"cluster-image-registry-operator-dc59b4c8b-jldpx\" (UID: \"3cb8a7d5-4437-460f-aad6-cf3b1df5a6fa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jldpx" Mar 
14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.914007 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85wdd\" (UniqueName: \"kubernetes.io/projected/ea7323d6-f41b-4251-ae88-aa34a5714182-kube-api-access-85wdd\") pod \"marketplace-operator-79b997595-8lxpz\" (UID: \"ea7323d6-f41b-4251-ae88-aa34a5714182\") " pod="openshift-marketplace/marketplace-operator-79b997595-8lxpz" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.924869 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfbeb8db-2612-468d-8354-32ee6373f57e-bound-sa-token\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.941530 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:21 crc kubenswrapper[4886]: E0314 08:31:21.942025 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:22.441996345 +0000 UTC m=+217.690447982 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.942219 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:21 crc kubenswrapper[4886]: E0314 08:31:21.942589 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:22.442582281 +0000 UTC m=+217.691033918 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.944291 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgz97\" (UniqueName: \"kubernetes.io/projected/7fd40608-be63-40ae-9e2f-d0969c399390-kube-api-access-xgz97\") pod \"olm-operator-6b444d44fb-2bkdt\" (UID: \"7fd40608-be63-40ae-9e2f-d0969c399390\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2bkdt" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.962884 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ql2q5" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.966429 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0822fc90-2e55-414e-8381-00d89382a00f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rjdjq\" (UID: \"0822fc90-2e55-414e-8381-00d89382a00f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rjdjq" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.971767 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5984c" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.978077 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbhj5\" (UniqueName: \"kubernetes.io/projected/0822fc90-2e55-414e-8381-00d89382a00f-kube-api-access-vbhj5\") pod \"ingress-operator-5b745b69d9-rjdjq\" (UID: \"0822fc90-2e55-414e-8381-00d89382a00f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rjdjq" Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.993715 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2sj7h"] Mar 14 08:31:21 crc kubenswrapper[4886]: I0314 08:31:21.996985 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bgwrn" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.002388 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8lxpz" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.011368 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n62ck" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.014804 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3cb8a7d5-4437-460f-aad6-cf3b1df5a6fa-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-jldpx\" (UID: \"3cb8a7d5-4437-460f-aad6-cf3b1df5a6fa\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jldpx" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.022079 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-tfh9h" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.025679 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz89g\" (UniqueName: \"kubernetes.io/projected/a109bed2-b994-4808-acac-56741af98cca-kube-api-access-pz89g\") pod \"kube-storage-version-migrator-operator-b67b599dd-kwc7b\" (UID: \"a109bed2-b994-4808-acac-56741af98cca\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kwc7b" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.026508 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhw5h" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.043860 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.043907 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8476cbd2-082f-4766-94ed-8cf08a01c98a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9kl2c\" (UID: \"8476cbd2-082f-4766-94ed-8cf08a01c98a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9kl2c" Mar 14 08:31:22 crc kubenswrapper[4886]: E0314 08:31:22.044113 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-14 08:31:22.544096362 +0000 UTC m=+217.792547999 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.044302 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:22 crc kubenswrapper[4886]: E0314 08:31:22.044688 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:22.544675469 +0000 UTC m=+217.793127106 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.050639 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47ww6\" (UniqueName: \"kubernetes.io/projected/6047b878-24b2-43a1-8afb-3321319d2a1b-kube-api-access-47ww6\") pod \"service-ca-operator-777779d784-d48xp\" (UID: \"6047b878-24b2-43a1-8afb-3321319d2a1b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-d48xp" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.052170 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-j78sv" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.053864 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gm27b"] Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.059571 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2bkdt" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.077831 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557950-fd7np" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.090833 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-k7qk8" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.092913 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4bc7\" (UniqueName: \"kubernetes.io/projected/3c3e4726-bb4a-45be-9c3a-a791c4a42380-kube-api-access-b4bc7\") pod \"control-plane-machine-set-operator-78cbb6b69f-lxn5q\" (UID: \"3c3e4726-bb4a-45be-9c3a-a791c4a42380\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lxn5q" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.099240 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-d48xp" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.112019 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f276fd1b-0a21-47c7-95a4-7ccc355773ab-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9brkw\" (UID: \"f276fd1b-0a21-47c7-95a4-7ccc355773ab\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9brkw" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.121889 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h2dz\" (UniqueName: \"kubernetes.io/projected/cfbeb8db-2612-468d-8354-32ee6373f57e-kube-api-access-7h2dz\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.139484 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49nj9\" (UniqueName: \"kubernetes.io/projected/86449941-3f6f-4c02-a717-ae9d48a7c464-kube-api-access-49nj9\") pod \"csi-hostpathplugin-vdd8t\" (UID: \"86449941-3f6f-4c02-a717-ae9d48a7c464\") " pod="hostpath-provisioner/csi-hostpathplugin-vdd8t" Mar 14 
08:31:22 crc kubenswrapper[4886]: E0314 08:31:22.145243 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:22.645215652 +0000 UTC m=+217.893667279 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.145285 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.145648 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:22 crc kubenswrapper[4886]: E0314 08:31:22.145985 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-14 08:31:22.645976353 +0000 UTC m=+217.894427990 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.167732 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr7gd\" (UniqueName: \"kubernetes.io/projected/5a93d506-f295-45af-9692-27b4da556007-kube-api-access-jr7gd\") pod \"dns-default-c9qzb\" (UID: \"5a93d506-f295-45af-9692-27b4da556007\") " pod="openshift-dns/dns-default-c9qzb" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.176093 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnb6p\" (UniqueName: \"kubernetes.io/projected/c559a474-d30d-435f-8e7e-5f9d56884bac-kube-api-access-wnb6p\") pod \"machine-config-server-xcvj2\" (UID: \"c559a474-d30d-435f-8e7e-5f9d56884bac\") " pod="openshift-machine-config-operator/machine-config-server-xcvj2" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.184325 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4ph2z"] Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.186616 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-t996r"] Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.196846 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsz24\" (UniqueName: 
\"kubernetes.io/projected/6b322a91-a747-474d-9111-883781a4f012-kube-api-access-zsz24\") pod \"ingress-canary-qj8pq\" (UID: \"6b322a91-a747-474d-9111-883781a4f012\") " pod="openshift-ingress-canary/ingress-canary-qj8pq" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.215862 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jldpx" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.223489 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9brkw" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.233714 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knppr"] Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.242364 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9kl2c" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.250074 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rjdjq" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.250103 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:22 crc kubenswrapper[4886]: E0314 08:31:22.250470 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-14 08:31:22.750453179 +0000 UTC m=+217.998904816 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.256645 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kwc7b" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.274076 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-x4qmh"] Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.283343 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lxn5q" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.304585 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-lf5vf"] Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.335353 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-wmcc2"] Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.354203 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:22 crc kubenswrapper[4886]: E0314 08:31:22.354450 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:22.854440661 +0000 UTC m=+218.102892298 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.354917 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-j46q2"] Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.365491 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-hznhw"] Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.403328 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-c9qzb" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.408888 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qj8pq" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.416155 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-vdd8t" Mar 14 08:31:22 crc kubenswrapper[4886]: W0314 08:31:22.423332 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0228f0fd_9323_456c_9291_6150db291cf4.slice/crio-112a8c23a53c5589fdbf7e9e8c99bc956d2ae086ff4755652949d8eca5997eb1 WatchSource:0}: Error finding container 112a8c23a53c5589fdbf7e9e8c99bc956d2ae086ff4755652949d8eca5997eb1: Status 404 returned error can't find the container with id 112a8c23a53c5589fdbf7e9e8c99bc956d2ae086ff4755652949d8eca5997eb1 Mar 14 08:31:22 crc kubenswrapper[4886]: W0314 08:31:22.430280 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda312fb44_823b_44ec_8312_0d83b990e9cd.slice/crio-204c2720ac702bd4600711fcadaaaed3db50ef7659ecebbad48ab4ca3b27e444 WatchSource:0}: Error finding container 204c2720ac702bd4600711fcadaaaed3db50ef7659ecebbad48ab4ca3b27e444: Status 404 returned error can't find the container with id 204c2720ac702bd4600711fcadaaaed3db50ef7659ecebbad48ab4ca3b27e444 Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.432493 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-xcvj2" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.460266 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:22 crc kubenswrapper[4886]: E0314 08:31:22.460828 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:22.960813451 +0000 UTC m=+218.209265088 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.565619 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:22 crc kubenswrapper[4886]: E0314 08:31:22.565985 4886 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:23.065965066 +0000 UTC m=+218.314416793 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.648387 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-k7qk8"] Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.669835 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:22 crc kubenswrapper[4886]: E0314 08:31:22.670029 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:23.170002169 +0000 UTC m=+218.418453806 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.670302 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:22 crc kubenswrapper[4886]: E0314 08:31:22.670629 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:23.170617267 +0000 UTC m=+218.419068904 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.771778 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557950-fd7np"] Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.772589 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:22 crc kubenswrapper[4886]: E0314 08:31:22.773862 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:23.273836006 +0000 UTC m=+218.522287643 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.792062 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x4qmh" event={"ID":"a8352432-8b6e-4a89-b830-379796727237","Type":"ContainerStarted","Data":"34936b656cd2dfd575629018da45bb6c6eec3702a6edfd9943463e1be76a82ce"} Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.804740 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx" event={"ID":"5a8def3c-80e8-4f81-8518-202af1613e6f","Type":"ContainerStarted","Data":"c0cfe7744c88a113dbadf53a337b50e1e245c439ff304176198462c0dc2ef35b"} Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.824076 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557950-j78sv"] Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.877684 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.878546 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wmcc2" 
event={"ID":"a312fb44-823b-44ec-8312-0d83b990e9cd","Type":"ContainerStarted","Data":"204c2720ac702bd4600711fcadaaaed3db50ef7659ecebbad48ab4ca3b27e444"} Mar 14 08:31:22 crc kubenswrapper[4886]: E0314 08:31:22.878783 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:23.378758335 +0000 UTC m=+218.627209972 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.899446 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-8lqwb" event={"ID":"5fb33ac2-d4aa-49b0-9007-33af11834a96","Type":"ContainerStarted","Data":"0c14e7aea499c1246723e28f0824b1b9c7590af2e9c106a640eac90b8fc63ed9"} Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.899496 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-8lqwb" event={"ID":"5fb33ac2-d4aa-49b0-9007-33af11834a96","Type":"ContainerStarted","Data":"e63ba5845489edb1b41a26c27203cc65161bc9786b63636e86b6c27107b7f91b"} Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.908694 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2sj7h" event={"ID":"9270f20e-8365-43e2-9d4b-b067780c0804","Type":"ContainerStarted","Data":"30d4b3828cc2627c214e8ef3be00555f6f609aa897579d7eede6a55de1f1e0eb"} Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 
08:31:22.916630 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gm27b" event={"ID":"2778f200-cefa-4b41-9bc5-f600415f2387","Type":"ContainerStarted","Data":"682e9a6f488fba1cfb79bea59f05130d4241b5c9244c1033f6d424ff92c0658d"} Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.917459 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-j46q2" event={"ID":"5d14f041-e1b8-4b93-a893-946dbecf44aa","Type":"ContainerStarted","Data":"895efdee965580cdbfc4ea62e23e22c1e42646fda69ecf3cfa09941bd8c3f864"} Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.918083 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4ph2z" event={"ID":"488db7c4-ca9c-4229-9ac5-dc19ce5443e4","Type":"ContainerStarted","Data":"bc283d378892a60af8b97a14e8bca07db8dd4466e454fc92f103143e4e244537"} Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.919388 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dzmw" event={"ID":"e934f0a9-f87b-4f86-8aad-47fa6927a3a6","Type":"ContainerStarted","Data":"5568f0bee060e53b55025d71c464b7e2a2a230df6eabc5f6778474668f514709"} Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.930971 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-t996r" event={"ID":"8b226bf0-ae7d-435b-9470-70dfb371f38e","Type":"ContainerStarted","Data":"e44852c656ca2743103ecf45b0d067d5dd9cdffa3433bc552951cdd4d80b4990"} Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.933746 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jm5tw" 
event={"ID":"37d01daf-edd5-4dd2-8a4a-40165f8d0275","Type":"ContainerStarted","Data":"cee6d848f298c2d7da6d6a9348bf2751baae586b1f50f5c8fbdafececed142f4"} Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.934614 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.934790 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-tfh9h"] Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.952065 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zhsfs" event={"ID":"54b85e17-fe41-4ae4-8e77-a3654421f751","Type":"ContainerStarted","Data":"3d9adbb932ef9e78b033f6f36e1521f48f3763d7958e55fa35fc6948add2263a"} Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.967207 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" podStartSLOduration=160.96718694 podStartE2EDuration="2m40.96718694s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:22.950639416 +0000 UTC m=+218.199091053" watchObservedRunningTime="2026-03-14 08:31:22.96718694 +0000 UTC m=+218.215638567" Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.975826 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhw5h"] Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.981009 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:22 crc kubenswrapper[4886]: E0314 08:31:22.982591 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:23.482558691 +0000 UTC m=+218.731010328 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:22 crc kubenswrapper[4886]: I0314 08:31:22.997622 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" podStartSLOduration=160.997604882 podStartE2EDuration="2m40.997604882s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:22.996459599 +0000 UTC m=+218.244911256" watchObservedRunningTime="2026-03-14 08:31:22.997604882 +0000 UTC m=+218.246056519" Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.009249 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-lf5vf" event={"ID":"0228f0fd-9323-456c-9291-6150db291cf4","Type":"ContainerStarted","Data":"112a8c23a53c5589fdbf7e9e8c99bc956d2ae086ff4755652949d8eca5997eb1"} Mar 14 08:31:23 crc kubenswrapper[4886]: W0314 08:31:23.024632 4886 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2befe321_cfe9_4032_b949_3de718efbf7c.slice/crio-4aa2c91844589845a9d451f1af5b9a4f7da09088607caa5005cdfbe896805bd9 WatchSource:0}: Error finding container 4aa2c91844589845a9d451f1af5b9a4f7da09088607caa5005cdfbe896805bd9: Status 404 returned error can't find the container with id 4aa2c91844589845a9d451f1af5b9a4f7da09088607caa5005cdfbe896805bd9 Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.025174 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hznhw" event={"ID":"feeab5ae-f3ec-4590-8625-00e98fb5064b","Type":"ContainerStarted","Data":"58fcb1c396b82febdb7223b2e60fd7e072ab7043f770dae2e0b2087cda63408b"} Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.079920 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-dq2b2" podStartSLOduration=161.079902782 podStartE2EDuration="2m41.079902782s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:23.077530974 +0000 UTC m=+218.325982611" watchObservedRunningTime="2026-03-14 08:31:23.079902782 +0000 UTC m=+218.328354419" Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.093812 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:23 crc kubenswrapper[4886]: E0314 08:31:23.095733 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:23.595719416 +0000 UTC m=+218.844171053 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.109097 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knppr" event={"ID":"094b2153-374f-4595-ae06-2655b16d69b9","Type":"ContainerStarted","Data":"521c68ad2f9648fe950a23c3646096d996a72da8a9edf4c3e6a9f8dffaa08ffb"} Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.166882 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xv44" event={"ID":"b5fc2483-aab1-4487-a0ad-b6c3183826a5","Type":"ContainerStarted","Data":"ee0ee534dd28b57b2553fd8b2f34979b85dc582b259874ed2b887e04543ae0b4"} Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.166935 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xv44" event={"ID":"b5fc2483-aab1-4487-a0ad-b6c3183826a5","Type":"ContainerStarted","Data":"c1ebf76870a55df586fe9758f5b3ce8af7e5c5ef9ecfbc72920613970af81df4"} Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.198493 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:23 crc kubenswrapper[4886]: E0314 08:31:23.199052 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:23.699029948 +0000 UTC m=+218.947481575 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.224632 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-d48xp"] Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.301417 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:23 crc kubenswrapper[4886]: E0314 08:31:23.305432 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:23.805411268 +0000 UTC m=+219.053862895 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.364471 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-r4qzm" podStartSLOduration=161.364450721 podStartE2EDuration="2m41.364450721s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:23.35850631 +0000 UTC m=+218.606957947" watchObservedRunningTime="2026-03-14 08:31:23.364450721 +0000 UTC m=+218.612902348" Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.402411 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:23 crc kubenswrapper[4886]: E0314 08:31:23.402854 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:23.902838622 +0000 UTC m=+219.151290249 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.514291 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:23 crc kubenswrapper[4886]: E0314 08:31:23.522148 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:24.022091731 +0000 UTC m=+219.270543368 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.572362 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n62ck"] Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.577348 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-bgwrn"] Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.617956 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:23 crc kubenswrapper[4886]: E0314 08:31:23.618484 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:24.118465624 +0000 UTC m=+219.366917261 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.635058 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-8lqwb" Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.659256 4886 patch_prober.go:28] interesting pod/router-default-5444994796-8lqwb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 08:31:23 crc kubenswrapper[4886]: [-]has-synced failed: reason withheld Mar 14 08:31:23 crc kubenswrapper[4886]: [+]process-running ok Mar 14 08:31:23 crc kubenswrapper[4886]: healthz check failed Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.659744 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8lqwb" podUID="5fb33ac2-d4aa-49b0-9007-33af11834a96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.724455 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:23 crc kubenswrapper[4886]: E0314 08:31:23.724855 4886 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:24.224840574 +0000 UTC m=+219.473292211 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.779443 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9brkw"] Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.816831 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2dzmw" podStartSLOduration=161.816815492 podStartE2EDuration="2m41.816815492s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:23.813929529 +0000 UTC m=+219.062381156" watchObservedRunningTime="2026-03-14 08:31:23.816815492 +0000 UTC m=+219.065267129" Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.824991 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:23 crc kubenswrapper[4886]: E0314 
08:31:23.825267 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:24.325218033 +0000 UTC m=+219.573669660 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.825813 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:23 crc kubenswrapper[4886]: E0314 08:31:23.826311 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:24.326296193 +0000 UTC m=+219.574747830 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.847227 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xv44" podStartSLOduration=161.847212873 podStartE2EDuration="2m41.847212873s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:23.84641636 +0000 UTC m=+219.094867987" watchObservedRunningTime="2026-03-14 08:31:23.847212873 +0000 UTC m=+219.095664510" Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.887254 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zhsfs" podStartSLOduration=161.88685223 podStartE2EDuration="2m41.88685223s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:23.88407593 +0000 UTC m=+219.132527567" watchObservedRunningTime="2026-03-14 08:31:23.88685223 +0000 UTC m=+219.135303867" Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.924976 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-jm5tw" podStartSLOduration=161.924954532 podStartE2EDuration="2m41.924954532s" 
podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:23.924804168 +0000 UTC m=+219.173255805" watchObservedRunningTime="2026-03-14 08:31:23.924954532 +0000 UTC m=+219.173406169" Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.929767 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:23 crc kubenswrapper[4886]: E0314 08:31:23.930138 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:24.43010621 +0000 UTC m=+219.678557847 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.939992 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2bkdt"] Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.952710 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5984c"] Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.963290 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ql2q5"] Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.965961 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8lxpz"] Mar 14 08:31:23 crc kubenswrapper[4886]: I0314 08:31:23.969331 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-8lqwb" podStartSLOduration=161.969316273 podStartE2EDuration="2m41.969316273s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:23.967692837 +0000 UTC m=+219.216144474" watchObservedRunningTime="2026-03-14 08:31:23.969316273 +0000 UTC m=+219.217767910" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.031880 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:24 crc kubenswrapper[4886]: E0314 08:31:24.032288 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:24.532276619 +0000 UTC m=+219.780728256 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:24 crc kubenswrapper[4886]: W0314 08:31:24.077342 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea7323d6_f41b_4251_ae88_aa34a5714182.slice/crio-57310ba521abc6995588c4e414e207667a42d6880f3f7e7d8ca7771a71e498aa WatchSource:0}: Error finding container 57310ba521abc6995588c4e414e207667a42d6880f3f7e7d8ca7771a71e498aa: Status 404 returned error can't find the container with id 57310ba521abc6995588c4e414e207667a42d6880f3f7e7d8ca7771a71e498aa Mar 14 08:31:24 crc kubenswrapper[4886]: W0314 08:31:24.097258 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e008272_f83f_420d_848c_a05f8dfef580.slice/crio-2ecebd6f7bd2c61d7848b6d3d67a24c177f3340b84f214c9967df001cef781a3 WatchSource:0}: Error finding container 
2ecebd6f7bd2c61d7848b6d3d67a24c177f3340b84f214c9967df001cef781a3: Status 404 returned error can't find the container with id 2ecebd6f7bd2c61d7848b6d3d67a24c177f3340b84f214c9967df001cef781a3 Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.124832 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kwc7b"] Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.134885 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:24 crc kubenswrapper[4886]: E0314 08:31:24.135666 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:24.635644892 +0000 UTC m=+219.884096529 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.164797 4886 ???:1] "http: TLS handshake error from 192.168.126.11:47854: no serving certificate available for the kubelet" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.177352 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9kl2c"] Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.217773 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qj8pq"] Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.219762 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-tfh9h" event={"ID":"2befe321-cfe9-4032-b949-3de718efbf7c","Type":"ContainerStarted","Data":"4aa2c91844589845a9d451f1af5b9a4f7da09088607caa5005cdfbe896805bd9"} Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.220037 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jldpx"] Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.242154 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" 
Mar 14 08:31:24 crc kubenswrapper[4886]: E0314 08:31:24.242840 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:24.742826435 +0000 UTC m=+219.991278072 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:24 crc kubenswrapper[4886]: W0314 08:31:24.242943 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8476cbd2_082f_4766_94ed_8cf08a01c98a.slice/crio-489746a1ff581c2313eb1860dea390e92e0c3c892849d968841096941f9ef584 WatchSource:0}: Error finding container 489746a1ff581c2313eb1860dea390e92e0c3c892849d968841096941f9ef584: Status 404 returned error can't find the container with id 489746a1ff581c2313eb1860dea390e92e0c3c892849d968841096941f9ef584 Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.272986 4886 ???:1] "http: TLS handshake error from 192.168.126.11:47856: no serving certificate available for the kubelet" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.274367 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-j46q2" event={"ID":"5d14f041-e1b8-4b93-a893-946dbecf44aa","Type":"ContainerStarted","Data":"3c7f1979744c4ec9228e3890dcd76f24091edf6b0c10c3062e1156a6993616d6"} Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.277397 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-server-xcvj2" event={"ID":"c559a474-d30d-435f-8e7e-5f9d56884bac","Type":"ContainerStarted","Data":"50507f11ef2b5afdfc6809f12ffcd3b892cb8bf02707437edcfbbf5c42ba76ab"} Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.277425 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-xcvj2" event={"ID":"c559a474-d30d-435f-8e7e-5f9d56884bac","Type":"ContainerStarted","Data":"0b1f2742dca1572d1c5361107797d2bc30d5689bba2abb3f3034dd6ba3927b00"} Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.284880 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bgwrn" event={"ID":"6cf72daf-2f75-41d3-b94b-51479ba7d2cf","Type":"ContainerStarted","Data":"921edc56c8120f8ffdc6ae2528e056b46a53abc2cd4c287776d8a6fd9af313c5"} Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.287654 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hznhw" event={"ID":"feeab5ae-f3ec-4590-8625-00e98fb5064b","Type":"ContainerStarted","Data":"3ac5090267e9f4bac8a321a315d23d86b14beeb5d99be3e24ee8d4348f13623b"} Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.289468 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-hznhw" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.302482 4886 patch_prober.go:28] interesting pod/downloads-7954f5f757-hznhw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.302535 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hznhw" podUID="feeab5ae-f3ec-4590-8625-00e98fb5064b" containerName="download-server" 
probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.304653 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-xcvj2" podStartSLOduration=5.304637438 podStartE2EDuration="5.304637438s" podCreationTimestamp="2026-03-14 08:31:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:24.301841348 +0000 UTC m=+219.550292985" watchObservedRunningTime="2026-03-14 08:31:24.304637438 +0000 UTC m=+219.553089065" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.313061 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-d48xp" event={"ID":"6047b878-24b2-43a1-8afb-3321319d2a1b","Type":"ContainerStarted","Data":"06e0e539d921136e4255620291c645038e841a08f62db57c91d5bf1b3b3260c0"} Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.313113 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-d48xp" event={"ID":"6047b878-24b2-43a1-8afb-3321319d2a1b","Type":"ContainerStarted","Data":"684e5f043639b68cd7b7685b42f6854e38b56944d92432d29b2b5c12bba3f479"} Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.320074 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-hznhw" podStartSLOduration=162.32005783 podStartE2EDuration="2m42.32005783s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:24.318736592 +0000 UTC m=+219.567188229" watchObservedRunningTime="2026-03-14 08:31:24.32005783 +0000 UTC m=+219.568509467" Mar 14 08:31:24 crc 
kubenswrapper[4886]: I0314 08:31:24.343692 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:24 crc kubenswrapper[4886]: E0314 08:31:24.343779 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:24.843757149 +0000 UTC m=+220.092208776 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.343894 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:24 crc kubenswrapper[4886]: E0314 08:31:24.344257 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-14 08:31:24.844244093 +0000 UTC m=+220.092695730 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.344573 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9brkw" event={"ID":"f276fd1b-0a21-47c7-95a4-7ccc355773ab","Type":"ContainerStarted","Data":"97bf90f55969884f38e4c817c78556e30b686f654d1fa6546aaaebdd095f8157"} Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.357996 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-k7qk8" event={"ID":"e10181e4-ca00-44ad-8153-7a2b1d8c7897","Type":"ContainerStarted","Data":"56d137970e84a49cc91aa95d7e68d73bdde3929daf114853becda964eb3dab98"} Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.358403 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-k7qk8" event={"ID":"e10181e4-ca00-44ad-8153-7a2b1d8c7897","Type":"ContainerStarted","Data":"93ae8edd4705280d436712c96b5da153f3e96f875d00938a8bb175c16737e882"} Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.382546 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-c9qzb"] Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.391531 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhw5h" 
event={"ID":"dd360541-a4f3-4d2f-8085-e467feebb007","Type":"ContainerStarted","Data":"2356f3165793dd970f00f19fbd2b34e02836cfc39352fa0f6897c170e4fe9b6f"} Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.395301 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2bkdt" event={"ID":"7fd40608-be63-40ae-9e2f-d0969c399390","Type":"ContainerStarted","Data":"5dcd79c052f4ce8aca3e199b9d985ba744a2865cd9b88d5d039f316fdc5eada8"} Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.397468 4886 generic.go:334] "Generic (PLEG): container finished" podID="a8352432-8b6e-4a89-b830-379796727237" containerID="146680225f427ddcbbc41aac4f49da903f37c674729514955c5b576f58d51349" exitCode=0 Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.397517 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x4qmh" event={"ID":"a8352432-8b6e-4a89-b830-379796727237","Type":"ContainerDied","Data":"146680225f427ddcbbc41aac4f49da903f37c674729514955c5b576f58d51349"} Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.399829 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-d48xp" podStartSLOduration=162.399810757 podStartE2EDuration="2m42.399810757s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:24.383435787 +0000 UTC m=+219.631887434" watchObservedRunningTime="2026-03-14 08:31:24.399810757 +0000 UTC m=+219.648262384" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.402748 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8lxpz" 
event={"ID":"ea7323d6-f41b-4251-ae88-aa34a5714182","Type":"ContainerStarted","Data":"57310ba521abc6995588c4e414e207667a42d6880f3f7e7d8ca7771a71e498aa"} Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.407464 4886 ???:1] "http: TLS handshake error from 192.168.126.11:47868: no serving certificate available for the kubelet" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.416373 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-k7qk8" podStartSLOduration=162.416358311 podStartE2EDuration="2m42.416358311s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:24.412639255 +0000 UTC m=+219.661090892" watchObservedRunningTime="2026-03-14 08:31:24.416358311 +0000 UTC m=+219.664809948" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.419471 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx" event={"ID":"5a8def3c-80e8-4f81-8518-202af1613e6f","Type":"ContainerStarted","Data":"22469caae6babc158c55600720b441215a82899befab2c19780ee5c7b4fc5c98"} Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.420209 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.430566 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-vdd8t"] Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.437477 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-lf5vf" event={"ID":"0228f0fd-9323-456c-9291-6150db291cf4","Type":"ContainerStarted","Data":"7c50080ba4319f59a7457f85e893d2e9577ea72e8abf770d561f5333152fb91b"} Mar 14 
08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.438269 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-lf5vf" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.444724 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:24 crc kubenswrapper[4886]: E0314 08:31:24.446222 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:24.946203347 +0000 UTC m=+220.194654994 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.460992 4886 ???:1] "http: TLS handshake error from 192.168.126.11:47876: no serving certificate available for the kubelet" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.462509 4886 patch_prober.go:28] interesting pod/console-operator-58897d9998-lf5vf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/readyz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Mar 14 08:31:24 crc 
kubenswrapper[4886]: I0314 08:31:24.462556 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-lf5vf" podUID="0228f0fd-9323-456c-9291-6150db291cf4" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/readyz\": dial tcp 10.217.0.28:8443: connect: connection refused" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.468974 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rjdjq"] Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.469028 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lxn5q"] Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.486641 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx" podStartSLOduration=162.486626886 podStartE2EDuration="2m42.486626886s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:24.486333928 +0000 UTC m=+219.734785565" watchObservedRunningTime="2026-03-14 08:31:24.486626886 +0000 UTC m=+219.735078523" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.506318 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4ph2z" event={"ID":"488db7c4-ca9c-4229-9ac5-dc19ce5443e4","Type":"ContainerStarted","Data":"a9cfbed9c83b9b48bbef9927c438b6af985bba55b6f09ae8423e48e5a07464cf"} Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.507933 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-lf5vf" podStartSLOduration=162.507909066 
podStartE2EDuration="2m42.507909066s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:24.505882958 +0000 UTC m=+219.754334595" watchObservedRunningTime="2026-03-14 08:31:24.507909066 +0000 UTC m=+219.756360703" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.511787 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557950-fd7np" event={"ID":"266a4a60-8fb5-4685-b4ac-621f93829611","Type":"ContainerStarted","Data":"e67e89d72e48304ef57b68d4a8664c1c86a1ca0731de077d4c22abe7ca353497"} Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.523616 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4ph2z" podStartSLOduration=162.523601026 podStartE2EDuration="2m42.523601026s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:24.521716202 +0000 UTC m=+219.770167839" watchObservedRunningTime="2026-03-14 08:31:24.523601026 +0000 UTC m=+219.772052663" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.529368 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kwc7b" event={"ID":"a109bed2-b994-4808-acac-56741af98cca","Type":"ContainerStarted","Data":"2a9792dc59831a1c99e42fd3aed9d5656bf55080cbe006bcf5050cc6030139a1"} Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.543522 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5984c" 
event={"ID":"df316737-4efd-4f41-a5ff-46740ee48d48","Type":"ContainerStarted","Data":"0b09a44534eca57c18e4a58f6a978b2a3202bd767242a751892f9048ae09a95d"} Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.546065 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:24 crc kubenswrapper[4886]: E0314 08:31:24.548079 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:25.048064798 +0000 UTC m=+220.296516435 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.554990 4886 ???:1] "http: TLS handshake error from 192.168.126.11:47884: no serving certificate available for the kubelet" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.566786 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gm27b" event={"ID":"2778f200-cefa-4b41-9bc5-f600415f2387","Type":"ContainerStarted","Data":"29627265da7f3e7422e5a8670285d5c254aceeaad8567fad06af967b3adbaf69"} Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 
08:31:24.569067 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ql2q5" event={"ID":"9e008272-f83f-420d-848c-a05f8dfef580","Type":"ContainerStarted","Data":"2ecebd6f7bd2c61d7848b6d3d67a24c177f3340b84f214c9967df001cef781a3"} Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.573171 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-j78sv" event={"ID":"d7ad8d4f-d958-43b7-b84d-c8672642d21b","Type":"ContainerStarted","Data":"39570c129f1650fa07b216c5e683d1c44ac48e12801ca1f49ef7eb2a7fe8a733"} Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.573209 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-j78sv" event={"ID":"d7ad8d4f-d958-43b7-b84d-c8672642d21b","Type":"ContainerStarted","Data":"c2d7e6b22f698da72d700619883b349767df8c34f2fb3dac0b8cc69d71ee2c99"} Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.592035 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.592767 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.595099 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n62ck" event={"ID":"39a6cfec-d809-46d0-8ef2-bde4b2d99c62","Type":"ContainerStarted","Data":"bfae6ab9732e5b9c520221ac12a9bfd94160d95333e09fafea1433df24276301"} Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.595427 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n62ck" 
event={"ID":"39a6cfec-d809-46d0-8ef2-bde4b2d99c62","Type":"ContainerStarted","Data":"60f8e2306d57ad0c61b677a2788224a35750322ea391ffe3ed3c47cfb06e2713"} Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.599285 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n62ck" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.611530 4886 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-n62ck container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.611687 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n62ck" podUID="39a6cfec-d809-46d0-8ef2-bde4b2d99c62" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.620993 4886 patch_prober.go:28] interesting pod/apiserver-76f77b778f-gdqjc container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 14 08:31:24 crc kubenswrapper[4886]: [+]log ok Mar 14 08:31:24 crc kubenswrapper[4886]: [+]etcd ok Mar 14 08:31:24 crc kubenswrapper[4886]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 14 08:31:24 crc kubenswrapper[4886]: [+]poststarthook/generic-apiserver-start-informers ok Mar 14 08:31:24 crc kubenswrapper[4886]: [+]poststarthook/max-in-flight-filter ok Mar 14 08:31:24 crc kubenswrapper[4886]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 14 08:31:24 crc kubenswrapper[4886]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 
14 08:31:24 crc kubenswrapper[4886]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 14 08:31:24 crc kubenswrapper[4886]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 14 08:31:24 crc kubenswrapper[4886]: [+]poststarthook/project.openshift.io-projectcache ok Mar 14 08:31:24 crc kubenswrapper[4886]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 14 08:31:24 crc kubenswrapper[4886]: [+]poststarthook/openshift.io-startinformers ok Mar 14 08:31:24 crc kubenswrapper[4886]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 14 08:31:24 crc kubenswrapper[4886]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 14 08:31:24 crc kubenswrapper[4886]: livez check failed Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.621087 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" podUID="ac520ac7-a6d3-4f1b-a99f-8d9af7ed1119" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.631342 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gm27b" podStartSLOduration=162.631322475 podStartE2EDuration="2m42.631322475s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:24.592477881 +0000 UTC m=+219.840929518" watchObservedRunningTime="2026-03-14 08:31:24.631322475 +0000 UTC m=+219.879774112" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.641523 4886 patch_prober.go:28] interesting pod/router-default-5444994796-8lqwb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 14 08:31:24 crc kubenswrapper[4886]: [-]has-synced failed: reason withheld Mar 14 08:31:24 crc kubenswrapper[4886]: [+]process-running ok Mar 14 08:31:24 crc kubenswrapper[4886]: healthz check failed Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.641802 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8lqwb" podUID="5fb33ac2-d4aa-49b0-9007-33af11834a96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.647349 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:24 crc kubenswrapper[4886]: E0314 08:31:24.648882 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:25.148853308 +0000 UTC m=+220.397304945 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.650513 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n62ck" podStartSLOduration=162.650499665 podStartE2EDuration="2m42.650499665s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:24.648365684 +0000 UTC m=+219.896817321" watchObservedRunningTime="2026-03-14 08:31:24.650499665 +0000 UTC m=+219.898951302" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.652823 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-j78sv" podStartSLOduration=84.652815821 podStartE2EDuration="1m24.652815821s" podCreationTimestamp="2026-03-14 08:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:24.632033555 +0000 UTC m=+219.880485192" watchObservedRunningTime="2026-03-14 08:31:24.652815821 +0000 UTC m=+219.901267448" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.656229 4886 ???:1] "http: TLS handshake error from 192.168.126.11:47898: no serving certificate available for the kubelet" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.667321 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.669220 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.678975 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.750509 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:24 crc kubenswrapper[4886]: E0314 08:31:24.753220 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:25.253191669 +0000 UTC m=+220.501643506 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.762273 4886 ???:1] "http: TLS handshake error from 192.168.126.11:47902: no serving certificate available for the kubelet" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.857538 4886 ???:1] "http: TLS handshake error from 192.168.126.11:47912: no serving certificate available for the kubelet" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.858359 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:24 crc kubenswrapper[4886]: E0314 08:31:24.859039 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:25.359017874 +0000 UTC m=+220.607469511 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.942084 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx" Mar 14 08:31:24 crc kubenswrapper[4886]: I0314 08:31:24.963055 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:24 crc kubenswrapper[4886]: E0314 08:31:24.963693 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:25.463678315 +0000 UTC m=+220.712129952 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.064084 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:25 crc kubenswrapper[4886]: E0314 08:31:25.064447 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:25.564432033 +0000 UTC m=+220.812883670 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.166432 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:25 crc kubenswrapper[4886]: E0314 08:31:25.167309 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:25.667285663 +0000 UTC m=+220.915737300 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.269696 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:25 crc kubenswrapper[4886]: E0314 08:31:25.270104 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:25.770062239 +0000 UTC m=+221.018513876 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.270242 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:25 crc kubenswrapper[4886]: E0314 08:31:25.270645 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:25.770634306 +0000 UTC m=+221.019085933 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.372167 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:25 crc kubenswrapper[4886]: E0314 08:31:25.372382 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:25.872356093 +0000 UTC m=+221.120807730 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.372634 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:25 crc kubenswrapper[4886]: E0314 08:31:25.372982 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:25.87297442 +0000 UTC m=+221.121426057 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.474046 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:25 crc kubenswrapper[4886]: E0314 08:31:25.474509 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:25.97445283 +0000 UTC m=+221.222904467 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.474899 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.474967 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.474990 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:31:25 crc kubenswrapper[4886]: E0314 08:31:25.475317 4886 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:25.975299884 +0000 UTC m=+221.223751521 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.476888 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.488278 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.532819 4886 ???:1] "http: TLS handshake error from 192.168.126.11:47914: no serving certificate available for the kubelet" Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.575604 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.575930 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.575984 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:31:25 crc kubenswrapper[4886]: E0314 08:31:25.576221 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:26.076182607 +0000 UTC m=+221.324634244 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.586843 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.594553 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.608312 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.644184 4886 patch_prober.go:28] interesting pod/router-default-5444994796-8lqwb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 08:31:25 crc kubenswrapper[4886]: [-]has-synced failed: reason withheld Mar 14 08:31:25 crc kubenswrapper[4886]: [+]process-running ok Mar 14 08:31:25 crc kubenswrapper[4886]: healthz check failed Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.644771 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8lqwb" podUID="5fb33ac2-d4aa-49b0-9007-33af11834a96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.654094 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2sj7h" event={"ID":"9270f20e-8365-43e2-9d4b-b067780c0804","Type":"ContainerStarted","Data":"f596ec3736b83ed1d05da5db565fecaab3aec9c7808636ab6212d73f5c662dea"} Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.675145 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhw5h" event={"ID":"dd360541-a4f3-4d2f-8085-e467feebb007","Type":"ContainerStarted","Data":"c14ab7e483d01c23f3d9be55c51bb71c9caab1477da03a4e1d3cd2a6548222b4"} Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.675226 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhw5h" event={"ID":"dd360541-a4f3-4d2f-8085-e467feebb007","Type":"ContainerStarted","Data":"7d59c16932b8f67a0279b915ca78877fd72f6d67856098acc9cd35ea7bc18ff5"} Mar 14 08:31:25 crc 
kubenswrapper[4886]: I0314 08:31:25.675269 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhw5h" Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.676600 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:25 crc kubenswrapper[4886]: E0314 08:31:25.676908 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:26.176897145 +0000 UTC m=+221.425348782 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.705255 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9kl2c" event={"ID":"8476cbd2-082f-4766-94ed-8cf08a01c98a","Type":"ContainerStarted","Data":"1e1868edfbad655587a20b8871920879fafb02710b0537a27930eb856905bb86"} Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.705337 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9kl2c" event={"ID":"8476cbd2-082f-4766-94ed-8cf08a01c98a","Type":"ContainerStarted","Data":"489746a1ff581c2313eb1860dea390e92e0c3c892849d968841096941f9ef584"} Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.709963 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-t996r" event={"ID":"8b226bf0-ae7d-435b-9470-70dfb371f38e","Type":"ContainerStarted","Data":"b938574fad3d8e8ee6f71e10837850301ee4b4ede5c216d5309900ea29b6f0c8"} Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.710925 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.717697 4886 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-t996r container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 
10.217.0.9:6443: connect: connection refused" start-of-body= Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.717762 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-t996r" podUID="8b226bf0-ae7d-435b-9470-70dfb371f38e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.717942 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9brkw" event={"ID":"f276fd1b-0a21-47c7-95a4-7ccc355773ab","Type":"ContainerStarted","Data":"3cd54e5799c2db7e94305fe692e71894e80c1c36e38a5162c8cd97c8fd2bee9c"} Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.739952 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gm27b" event={"ID":"2778f200-cefa-4b41-9bc5-f600415f2387","Type":"ContainerStarted","Data":"41dc304feac60d360006f93f1600226c69ba030e4cfd478766c3cf65d2006547"} Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.752762 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vdd8t" event={"ID":"86449941-3f6f-4c02-a717-ae9d48a7c464","Type":"ContainerStarted","Data":"fc121a6d023c9333c6fb2eb3c52141e91ee82d20a6bda1d09d3ec3ef8e3b5447"} Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.777170 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:25 crc kubenswrapper[4886]: E0314 08:31:25.781785 4886 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:26.277396096 +0000 UTC m=+221.525847723 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.782039 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.782395 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2bkdt" event={"ID":"7fd40608-be63-40ae-9e2f-d0969c399390","Type":"ContainerStarted","Data":"4d8a819fc045e29b04295c38063f8db26b75618aa43e859187705d32083a5aef"} Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.783700 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2bkdt" Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.784677 4886 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-2bkdt container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.44:8443/healthz\": dial tcp 10.217.0.44:8443: connect: connection refused" start-of-body= Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.784753 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2bkdt" podUID="7fd40608-be63-40ae-9e2f-d0969c399390" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.44:8443/healthz\": dial tcp 10.217.0.44:8443: connect: connection refused" Mar 14 08:31:25 crc kubenswrapper[4886]: E0314 08:31:25.789748 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:26.28972732 +0000 UTC m=+221.538178957 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.826003 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rjdjq" event={"ID":"0822fc90-2e55-414e-8381-00d89382a00f","Type":"ContainerStarted","Data":"41ce5d1371dd4ce3db596e80a0c3d256d5ea3ef63ebfe0b6d3639ddb712281cc"} Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.826073 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rjdjq" event={"ID":"0822fc90-2e55-414e-8381-00d89382a00f","Type":"ContainerStarted","Data":"373ee10906f422b7ab5b450a7567ec9cf0ba9e9345d99c1cd45791bbdeb46014"} Mar 
14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.853907 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.869165 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-tfh9h" event={"ID":"2befe321-cfe9-4032-b949-3de718efbf7c","Type":"ContainerStarted","Data":"e8be1e2aa622626e9de712105a82bcffe2488a7e5b90276b4ee7f2ce96d15cac"} Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.869218 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-tfh9h" event={"ID":"2befe321-cfe9-4032-b949-3de718efbf7c","Type":"ContainerStarted","Data":"3894b2c97af692d8f2e4a416133c581e4e812b59f6b3fc4b05429006c35e478a"} Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.872350 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-t996r" podStartSLOduration=163.872337679 podStartE2EDuration="2m43.872337679s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:25.870104505 +0000 UTC m=+221.118556152" watchObservedRunningTime="2026-03-14 08:31:25.872337679 +0000 UTC m=+221.120789326" Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.885631 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.885809 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:31:25 crc kubenswrapper[4886]: E0314 08:31:25.885954 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:26.385935338 +0000 UTC m=+221.634386975 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.886161 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.887057 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bgwrn" event={"ID":"6cf72daf-2f75-41d3-b94b-51479ba7d2cf","Type":"ContainerStarted","Data":"555846c707e00ec41e38f1012b1866cecfde2c44b6211f9ab52b2cfee84ef269"} Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.887101 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bgwrn" 
event={"ID":"6cf72daf-2f75-41d3-b94b-51479ba7d2cf","Type":"ContainerStarted","Data":"e5e3f715bad29f200cbf67422d067a40a8a361c24b5f4a7b0fd43f601cfd469c"} Mar 14 08:31:25 crc kubenswrapper[4886]: E0314 08:31:25.887468 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:26.387457062 +0000 UTC m=+221.635908699 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.896599 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhw5h" podStartSLOduration=163.896571023 podStartE2EDuration="2m43.896571023s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:25.892741124 +0000 UTC m=+221.141192761" watchObservedRunningTime="2026-03-14 08:31:25.896571023 +0000 UTC m=+221.145022660" Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.901389 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8lxpz" event={"ID":"ea7323d6-f41b-4251-ae88-aa34a5714182","Type":"ContainerStarted","Data":"d94ba0cb504c18a1e91df4ef4675249b415e72659ca353fcbb769af633f56b26"} Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.902303 4886 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8lxpz" Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.903877 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ql2q5" event={"ID":"9e008272-f83f-420d-848c-a05f8dfef580","Type":"ContainerStarted","Data":"4ee7062bb0da87a0af92cbc59ea37ef467d847d389fe4faec4230cde06c4728e"} Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.903901 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ql2q5" event={"ID":"9e008272-f83f-420d-848c-a05f8dfef580","Type":"ContainerStarted","Data":"a91a7b29212567f3af434377b25897afa3fe7d507dee821efb79b62e4b5a14fd"} Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.908900 4886 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8lxpz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.908969 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8lxpz" podUID="ea7323d6-f41b-4251-ae88-aa34a5714182" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.924147 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x4qmh" event={"ID":"a8352432-8b6e-4a89-b830-379796727237","Type":"ContainerStarted","Data":"23b189d3199a8d3d70507cd5bab5e788325384011bd9a96da3da2a39e4724f93"} Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.924509 4886 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x4qmh" Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.932779 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lxn5q" event={"ID":"3c3e4726-bb4a-45be-9c3a-a791c4a42380","Type":"ContainerStarted","Data":"7ac44bbbd7266f481b3d75fe9c5d187c97734a96fa1bada4d87eb8c398d5a30c"} Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.932864 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lxn5q" event={"ID":"3c3e4726-bb4a-45be-9c3a-a791c4a42380","Type":"ContainerStarted","Data":"d9a6c0ad9849510f75997cdcadead0f8fed8347810c086be9faa5d82bfca1328"} Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.936645 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wmcc2" event={"ID":"a312fb44-823b-44ec-8312-0d83b990e9cd","Type":"ContainerStarted","Data":"c3c9da905ea8b8278089229c0e3b9c593a699e890d0586374d16f880e5e0bcbf"} Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.939959 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-c9qzb" event={"ID":"5a93d506-f295-45af-9692-27b4da556007","Type":"ContainerStarted","Data":"d28ef1e91c39df78ba511b84d075dcc390e152f61a71422aadc9250189b522ac"} Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.940014 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-c9qzb" event={"ID":"5a93d506-f295-45af-9692-27b4da556007","Type":"ContainerStarted","Data":"897fb38ded24fd9d0274cde310a60011260121812ed7f4c934d9b858b1476f9e"} Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.942033 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rjdjq" podStartSLOduration=163.942017587 
podStartE2EDuration="2m43.942017587s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:25.938880767 +0000 UTC m=+221.187332404" watchObservedRunningTime="2026-03-14 08:31:25.942017587 +0000 UTC m=+221.190469224" Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.986859 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2bkdt" podStartSLOduration=163.986840562 podStartE2EDuration="2m43.986840562s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:25.979983695 +0000 UTC m=+221.228435332" watchObservedRunningTime="2026-03-14 08:31:25.986840562 +0000 UTC m=+221.235292199" Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.986938 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:25 crc kubenswrapper[4886]: E0314 08:31:25.988531 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:26.4885087 +0000 UTC m=+221.736960337 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:25 crc kubenswrapper[4886]: I0314 08:31:25.990503 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-j46q2" event={"ID":"5d14f041-e1b8-4b93-a893-946dbecf44aa","Type":"ContainerStarted","Data":"6a030cdeb46582e0277e869aa7e6aea3d2995df484048dc099a2ab89dc8443b5"} Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.014644 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qj8pq" event={"ID":"6b322a91-a747-474d-9111-883781a4f012","Type":"ContainerStarted","Data":"6cb03f53d57f599c3800e1138d780bd2cff2f116d54a91f5e80cc5a4cabd1c40"} Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.014715 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qj8pq" event={"ID":"6b322a91-a747-474d-9111-883781a4f012","Type":"ContainerStarted","Data":"50f26d64470488a9dbcaca90247343bd6cff95d60daa9bdd4bba8920ba7d090b"} Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.018307 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9brkw" podStartSLOduration=164.018290263 podStartE2EDuration="2m44.018290263s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:26.016261745 +0000 UTC m=+221.264713382" watchObservedRunningTime="2026-03-14 
08:31:26.018290263 +0000 UTC m=+221.266741900" Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.027477 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kwc7b" event={"ID":"a109bed2-b994-4808-acac-56741af98cca","Type":"ContainerStarted","Data":"e3beb3b7193c941feefe78a94ab0f1c209a56dcdd66c8f4be5eb4d3a4f0baf3a"} Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.052446 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jldpx" event={"ID":"3cb8a7d5-4437-460f-aad6-cf3b1df5a6fa","Type":"ContainerStarted","Data":"647da6729e1d99f89e619d303a24de9dad6b398a5f28020160fdba77a8ca96a5"} Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.052501 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jldpx" event={"ID":"3cb8a7d5-4437-460f-aad6-cf3b1df5a6fa","Type":"ContainerStarted","Data":"0a9a5a3100e95179fb64b361420080686df4c0063976fcb8b5a647533fca4ed0"} Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.053832 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9kl2c" podStartSLOduration=164.053802852 podStartE2EDuration="2m44.053802852s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:26.053464432 +0000 UTC m=+221.301916069" watchObservedRunningTime="2026-03-14 08:31:26.053802852 +0000 UTC m=+221.302254489" Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.065858 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.065935 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.067593 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5984c" event={"ID":"df316737-4efd-4f41-a5ff-46740ee48d48","Type":"ContainerStarted","Data":"bcf087c791e9267cd3a5107211f03d22e6eda623d076949f1197968c73259844"} Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.067641 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5984c" event={"ID":"df316737-4efd-4f41-a5ff-46740ee48d48","Type":"ContainerStarted","Data":"8a6fda37bd39de10ee053d0dd1c535e05e8da1e00b8f18b40e11e92a4f7c11d3"} Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.094789 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:26 crc kubenswrapper[4886]: E0314 08:31:26.095748 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-14 08:31:26.595721414 +0000 UTC m=+221.844173041 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.097553 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-2sj7h" podStartSLOduration=164.097525835 podStartE2EDuration="2m44.097525835s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:26.09386607 +0000 UTC m=+221.342317707" watchObservedRunningTime="2026-03-14 08:31:26.097525835 +0000 UTC m=+221.345977472" Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.110152 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knppr" event={"ID":"094b2153-374f-4595-ae06-2655b16d69b9","Type":"ContainerStarted","Data":"7d95c62dfe2dd8627eded8405a458fdff6ed75e06fce6df84f4f8d67dcb6531c"} Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.115196 4886 patch_prober.go:28] interesting pod/console-operator-58897d9998-lf5vf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/readyz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.115254 4886 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console-operator/console-operator-58897d9998-lf5vf" podUID="0228f0fd-9323-456c-9291-6150db291cf4" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/readyz\": dial tcp 10.217.0.28:8443: connect: connection refused" Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.115559 4886 patch_prober.go:28] interesting pod/downloads-7954f5f757-hznhw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.115588 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hznhw" podUID="feeab5ae-f3ec-4590-8625-00e98fb5064b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.126624 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bgwrn" podStartSLOduration=164.126603309 podStartE2EDuration="2m44.126603309s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:26.12455552 +0000 UTC m=+221.373007157" watchObservedRunningTime="2026-03-14 08:31:26.126603309 +0000 UTC m=+221.375054946" Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.131943 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j5zgg" Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.196466 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:26 crc kubenswrapper[4886]: E0314 08:31:26.199292 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:26.699144429 +0000 UTC m=+221.947596066 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.211324 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-wmcc2" podStartSLOduration=164.211307698 podStartE2EDuration="2m44.211307698s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:26.166780561 +0000 UTC m=+221.415232198" watchObservedRunningTime="2026-03-14 08:31:26.211307698 +0000 UTC m=+221.459759325" Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.232725 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n62ck" Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.264745 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns-operator/dns-operator-744455d44c-j46q2" podStartSLOduration=164.264708809 podStartE2EDuration="2m44.264708809s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:26.21487549 +0000 UTC m=+221.463327127" watchObservedRunningTime="2026-03-14 08:31:26.264708809 +0000 UTC m=+221.513160446" Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.294456 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5984c" podStartSLOduration=164.294436751 podStartE2EDuration="2m44.294436751s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:26.266587883 +0000 UTC m=+221.515039530" watchObservedRunningTime="2026-03-14 08:31:26.294436751 +0000 UTC m=+221.542888398" Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.299707 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:26 crc kubenswrapper[4886]: E0314 08:31:26.300230 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:26.800207697 +0000 UTC m=+222.048659334 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.345326 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x4qmh" podStartSLOduration=164.34531012 podStartE2EDuration="2m44.34531012s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:26.309612217 +0000 UTC m=+221.558063854" watchObservedRunningTime="2026-03-14 08:31:26.34531012 +0000 UTC m=+221.593761757" Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.401061 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kwc7b" podStartSLOduration=164.401027598 podStartE2EDuration="2m44.401027598s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:26.397529867 +0000 UTC m=+221.645981504" watchObservedRunningTime="2026-03-14 08:31:26.401027598 +0000 UTC m=+221.649479235" Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.401654 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:26 crc kubenswrapper[4886]: E0314 08:31:26.402359 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:26.902337025 +0000 UTC m=+222.150788662 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.416510 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jldpx" podStartSLOduration=164.41644279 podStartE2EDuration="2m44.41644279s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:26.358956011 +0000 UTC m=+221.607407648" watchObservedRunningTime="2026-03-14 08:31:26.41644279 +0000 UTC m=+221.664894447" Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.435475 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8lxpz" podStartSLOduration=164.435455445 podStartE2EDuration="2m44.435455445s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-14 08:31:26.423527783 +0000 UTC m=+221.671979420" watchObservedRunningTime="2026-03-14 08:31:26.435455445 +0000 UTC m=+221.683907082" Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.448969 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qj8pq" podStartSLOduration=7.448954812 podStartE2EDuration="7.448954812s" podCreationTimestamp="2026-03-14 08:31:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:26.448259572 +0000 UTC m=+221.696711209" watchObservedRunningTime="2026-03-14 08:31:26.448954812 +0000 UTC m=+221.697406449" Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.487253 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lxn5q" podStartSLOduration=164.48723589 podStartE2EDuration="2m44.48723589s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:26.486852469 +0000 UTC m=+221.735304106" watchObservedRunningTime="2026-03-14 08:31:26.48723589 +0000 UTC m=+221.735687527" Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.506865 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:26 crc kubenswrapper[4886]: E0314 08:31:26.507499 4886 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:27.007187602 +0000 UTC m=+222.255639239 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.512627 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ql2q5" podStartSLOduration=164.512613257 podStartE2EDuration="2m44.512613257s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:26.511599788 +0000 UTC m=+221.760051425" watchObservedRunningTime="2026-03-14 08:31:26.512613257 +0000 UTC m=+221.761064884" Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.551219 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-tfh9h" podStartSLOduration=164.551195293 podStartE2EDuration="2m44.551195293s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:26.550167804 +0000 UTC m=+221.798619441" watchObservedRunningTime="2026-03-14 08:31:26.551195293 +0000 UTC m=+221.799646950" Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.564098 4886 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.607677 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:26 crc kubenswrapper[4886]: E0314 08:31:26.608843 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:27.108827316 +0000 UTC m=+222.357278953 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.630335 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knppr" podStartSLOduration=164.630319402 podStartE2EDuration="2m44.630319402s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:26.628790218 +0000 UTC m=+221.877241855" watchObservedRunningTime="2026-03-14 08:31:26.630319402 +0000 UTC m=+221.878771039" Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 
08:31:26.654347 4886 patch_prober.go:28] interesting pod/router-default-5444994796-8lqwb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 08:31:26 crc kubenswrapper[4886]: [-]has-synced failed: reason withheld Mar 14 08:31:26 crc kubenswrapper[4886]: [+]process-running ok Mar 14 08:31:26 crc kubenswrapper[4886]: healthz check failed Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.654421 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8lqwb" podUID="5fb33ac2-d4aa-49b0-9007-33af11834a96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.712865 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:26 crc kubenswrapper[4886]: E0314 08:31:26.713307 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:27.213292831 +0000 UTC m=+222.461744468 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.824732 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:26 crc kubenswrapper[4886]: E0314 08:31:26.824963 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:27.324940143 +0000 UTC m=+222.573391770 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.825267 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:26 crc kubenswrapper[4886]: E0314 08:31:26.825556 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:27.32554514 +0000 UTC m=+222.573996767 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.928611 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:26 crc kubenswrapper[4886]: I0314 08:31:26.928891 4886 ???:1] "http: TLS handshake error from 192.168.126.11:47924: no serving certificate available for the kubelet" Mar 14 08:31:26 crc kubenswrapper[4886]: E0314 08:31:26.928896 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:27.428880173 +0000 UTC m=+222.677331810 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.029645 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:27 crc kubenswrapper[4886]: E0314 08:31:27.029950 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:27.52993834 +0000 UTC m=+222.778389977 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.130903 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:27 crc kubenswrapper[4886]: E0314 08:31:27.131097 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:27.63106926 +0000 UTC m=+222.879520887 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.131597 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:27 crc kubenswrapper[4886]: E0314 08:31:27.131890 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:27.631879834 +0000 UTC m=+222.880331471 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.151511 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-c9qzb" event={"ID":"5a93d506-f295-45af-9692-27b4da556007","Type":"ContainerStarted","Data":"41e65c56ec8b5e977266dd79e1ca7cd9f87bdd407d3360fc40ddc8c35684ccaa"} Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.151761 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-c9qzb" Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.155483 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rjdjq" event={"ID":"0822fc90-2e55-414e-8381-00d89382a00f","Type":"ContainerStarted","Data":"4cb44b82e3f33c492ca3b658d2e5d5e008a4bace0b3e3fc97a872072d6ad1324"} Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.159385 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"5868cb7fce3abbccf2d4eb01a32b84357b6ccc3fec70cdd01f5fabc8dd6e8ca4"} Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.159413 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"084115a8ca088aef91e9376e926a94216c58fe8e0bfc5287d7c91f3da6a7edcd"} Mar 14 08:31:27 crc 
kubenswrapper[4886]: I0314 08:31:27.168101 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vdd8t" event={"ID":"86449941-3f6f-4c02-a717-ae9d48a7c464","Type":"ContainerStarted","Data":"7a24c18ae4d630866d9c3ca7423a05a77de6179d5c42c88103fa7efd774bb626"} Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.170445 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"0e4f4c45dd60f92a311fad1036959d01bf02b38eccd47865a3bcf10534904b65"} Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.171730 4886 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8lxpz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.171743 4886 patch_prober.go:28] interesting pod/downloads-7954f5f757-hznhw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.171773 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8lxpz" podUID="ea7323d6-f41b-4251-ae88-aa34a5714182" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.171780 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hznhw" podUID="feeab5ae-f3ec-4590-8625-00e98fb5064b" containerName="download-server" probeResult="failure" 
output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.177396 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knppr" Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.195834 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-c9qzb" podStartSLOduration=9.195820347 podStartE2EDuration="9.195820347s" podCreationTimestamp="2026-03-14 08:31:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:27.193720587 +0000 UTC m=+222.442172224" watchObservedRunningTime="2026-03-14 08:31:27.195820347 +0000 UTC m=+222.444271984" Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.232506 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:27 crc kubenswrapper[4886]: E0314 08:31:27.234360 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:27.734341211 +0000 UTC m=+222.982792848 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.293354 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.340744 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2bkdt" Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.358647 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:27 crc kubenswrapper[4886]: E0314 08:31:27.359140 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:27.859103769 +0000 UTC m=+223.107555406 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.459945 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:27 crc kubenswrapper[4886]: E0314 08:31:27.460314 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:27.96029718 +0000 UTC m=+223.208748817 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.561686 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:27 crc kubenswrapper[4886]: E0314 08:31:27.562028 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:28.062013486 +0000 UTC m=+223.310465123 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.635833 4886 patch_prober.go:28] interesting pod/router-default-5444994796-8lqwb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 08:31:27 crc kubenswrapper[4886]: [-]has-synced failed: reason withheld Mar 14 08:31:27 crc kubenswrapper[4886]: [+]process-running ok Mar 14 08:31:27 crc kubenswrapper[4886]: healthz check failed Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.635887 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8lqwb" podUID="5fb33ac2-d4aa-49b0-9007-33af11834a96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.663528 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:27 crc kubenswrapper[4886]: E0314 08:31:27.664054 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-14 08:31:28.164030861 +0000 UTC m=+223.412482498 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.679638 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-r4qzm"] Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.679862 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-r4qzm" podUID="14c55450-dcee-4aee-8153-9ea2ff49b659" containerName="controller-manager" containerID="cri-o://765006db73285273cab077f8ba3ecf43432c0a444fde9b6e1a8b11a52d394487" gracePeriod=30 Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.724679 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx"] Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.765237 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:27 crc kubenswrapper[4886]: E0314 08:31:27.765847 4886 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:28.26583557 +0000 UTC m=+223.514287197 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.801145 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.801785 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.813148 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.817521 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.861969 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.866525 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:27 crc kubenswrapper[4886]: E0314 08:31:27.866667 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:28.36664263 +0000 UTC m=+223.615094267 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.866769 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.866854 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3e06c71-44e4-476f-9f6c-96737f410dbf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a3e06c71-44e4-476f-9f6c-96737f410dbf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.866901 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/a3e06c71-44e4-476f-9f6c-96737f410dbf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a3e06c71-44e4-476f-9f6c-96737f410dbf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 08:31:27 crc kubenswrapper[4886]: E0314 08:31:27.867064 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:28.367056972 +0000 UTC m=+223.615508609 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.910165 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dk6zm"] Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.911073 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dk6zm" Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.915677 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.934707 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dk6zm"] Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.967553 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.967769 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a3e06c71-44e4-476f-9f6c-96737f410dbf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a3e06c71-44e4-476f-9f6c-96737f410dbf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.967840 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/333059fe-3e95-4e08-b70e-d7d95e1ed279-utilities\") pod \"community-operators-dk6zm\" (UID: \"333059fe-3e95-4e08-b70e-d7d95e1ed279\") " pod="openshift-marketplace/community-operators-dk6zm" Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.967879 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j99w9\" (UniqueName: \"kubernetes.io/projected/333059fe-3e95-4e08-b70e-d7d95e1ed279-kube-api-access-j99w9\") pod \"community-operators-dk6zm\" (UID: 
\"333059fe-3e95-4e08-b70e-d7d95e1ed279\") " pod="openshift-marketplace/community-operators-dk6zm" Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.967910 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3e06c71-44e4-476f-9f6c-96737f410dbf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a3e06c71-44e4-476f-9f6c-96737f410dbf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.967930 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/333059fe-3e95-4e08-b70e-d7d95e1ed279-catalog-content\") pod \"community-operators-dk6zm\" (UID: \"333059fe-3e95-4e08-b70e-d7d95e1ed279\") " pod="openshift-marketplace/community-operators-dk6zm" Mar 14 08:31:27 crc kubenswrapper[4886]: E0314 08:31:27.968031 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:28.468011377 +0000 UTC m=+223.716463014 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:27 crc kubenswrapper[4886]: I0314 08:31:27.968073 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a3e06c71-44e4-476f-9f6c-96737f410dbf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a3e06c71-44e4-476f-9f6c-96737f410dbf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.012730 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3e06c71-44e4-476f-9f6c-96737f410dbf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a3e06c71-44e4-476f-9f6c-96737f410dbf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.048167 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-brtwn"] Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.049106 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-brtwn" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.050696 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-lf5vf" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.056376 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.073415 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeaec6eb-91cb-4e68-807f-994b4e9df360-utilities\") pod \"certified-operators-brtwn\" (UID: \"aeaec6eb-91cb-4e68-807f-994b4e9df360\") " pod="openshift-marketplace/certified-operators-brtwn" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.073479 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.073502 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/333059fe-3e95-4e08-b70e-d7d95e1ed279-utilities\") pod \"community-operators-dk6zm\" (UID: \"333059fe-3e95-4e08-b70e-d7d95e1ed279\") " pod="openshift-marketplace/community-operators-dk6zm" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.073520 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfrk9\" (UniqueName: 
\"kubernetes.io/projected/aeaec6eb-91cb-4e68-807f-994b4e9df360-kube-api-access-kfrk9\") pod \"certified-operators-brtwn\" (UID: \"aeaec6eb-91cb-4e68-807f-994b4e9df360\") " pod="openshift-marketplace/certified-operators-brtwn" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.073549 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j99w9\" (UniqueName: \"kubernetes.io/projected/333059fe-3e95-4e08-b70e-d7d95e1ed279-kube-api-access-j99w9\") pod \"community-operators-dk6zm\" (UID: \"333059fe-3e95-4e08-b70e-d7d95e1ed279\") " pod="openshift-marketplace/community-operators-dk6zm" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.073581 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/333059fe-3e95-4e08-b70e-d7d95e1ed279-catalog-content\") pod \"community-operators-dk6zm\" (UID: \"333059fe-3e95-4e08-b70e-d7d95e1ed279\") " pod="openshift-marketplace/community-operators-dk6zm" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.073600 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeaec6eb-91cb-4e68-807f-994b4e9df360-catalog-content\") pod \"certified-operators-brtwn\" (UID: \"aeaec6eb-91cb-4e68-807f-994b4e9df360\") " pod="openshift-marketplace/certified-operators-brtwn" Mar 14 08:31:28 crc kubenswrapper[4886]: E0314 08:31:28.073861 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:28.573850412 +0000 UTC m=+223.822302049 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.074230 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/333059fe-3e95-4e08-b70e-d7d95e1ed279-utilities\") pod \"community-operators-dk6zm\" (UID: \"333059fe-3e95-4e08-b70e-d7d95e1ed279\") " pod="openshift-marketplace/community-operators-dk6zm" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.074605 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/333059fe-3e95-4e08-b70e-d7d95e1ed279-catalog-content\") pod \"community-operators-dk6zm\" (UID: \"333059fe-3e95-4e08-b70e-d7d95e1ed279\") " pod="openshift-marketplace/community-operators-dk6zm" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.121924 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-brtwn"] Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.125237 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-knppr" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.125515 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.148543 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j99w9\" (UniqueName: \"kubernetes.io/projected/333059fe-3e95-4e08-b70e-d7d95e1ed279-kube-api-access-j99w9\") pod \"community-operators-dk6zm\" (UID: \"333059fe-3e95-4e08-b70e-d7d95e1ed279\") " pod="openshift-marketplace/community-operators-dk6zm" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.171558 4886 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-x4qmh container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.171633 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x4qmh" podUID="a8352432-8b6e-4a89-b830-379796727237" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.175249 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.175538 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeaec6eb-91cb-4e68-807f-994b4e9df360-catalog-content\") pod \"certified-operators-brtwn\" 
(UID: \"aeaec6eb-91cb-4e68-807f-994b4e9df360\") " pod="openshift-marketplace/certified-operators-brtwn" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.175583 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeaec6eb-91cb-4e68-807f-994b4e9df360-utilities\") pod \"certified-operators-brtwn\" (UID: \"aeaec6eb-91cb-4e68-807f-994b4e9df360\") " pod="openshift-marketplace/certified-operators-brtwn" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.175654 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfrk9\" (UniqueName: \"kubernetes.io/projected/aeaec6eb-91cb-4e68-807f-994b4e9df360-kube-api-access-kfrk9\") pod \"certified-operators-brtwn\" (UID: \"aeaec6eb-91cb-4e68-807f-994b4e9df360\") " pod="openshift-marketplace/certified-operators-brtwn" Mar 14 08:31:28 crc kubenswrapper[4886]: E0314 08:31:28.176156 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:28.676137485 +0000 UTC m=+223.924589112 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.176639 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeaec6eb-91cb-4e68-807f-994b4e9df360-utilities\") pod \"certified-operators-brtwn\" (UID: \"aeaec6eb-91cb-4e68-807f-994b4e9df360\") " pod="openshift-marketplace/certified-operators-brtwn" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.177251 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeaec6eb-91cb-4e68-807f-994b4e9df360-catalog-content\") pod \"certified-operators-brtwn\" (UID: \"aeaec6eb-91cb-4e68-807f-994b4e9df360\") " pod="openshift-marketplace/certified-operators-brtwn" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.219631 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f44769b2add33610cba467bf83b91b25fe871b53cc7a159e9f3ae4a36f91cf15"} Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.236734 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dk6zm" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.244780 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfrk9\" (UniqueName: \"kubernetes.io/projected/aeaec6eb-91cb-4e68-807f-994b4e9df360-kube-api-access-kfrk9\") pod \"certified-operators-brtwn\" (UID: \"aeaec6eb-91cb-4e68-807f-994b4e9df360\") " pod="openshift-marketplace/certified-operators-brtwn" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.273346 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"186cc74d00d4c2564a731209d5dd031dbaa1cecf0ece313058448c58b18bb77c"} Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.273426 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b50522398c5b0887ff8d46a9cb76e886f8a3c763d5332524660bd4fe35c898c6"} Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.274655 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.297871 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:28 crc kubenswrapper[4886]: E0314 08:31:28.299444 4886 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:28.79942936 +0000 UTC m=+224.047880997 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.301256 4886 generic.go:334] "Generic (PLEG): container finished" podID="14c55450-dcee-4aee-8153-9ea2ff49b659" containerID="765006db73285273cab077f8ba3ecf43432c0a444fde9b6e1a8b11a52d394487" exitCode=0 Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.301562 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx" podUID="5a8def3c-80e8-4f81-8518-202af1613e6f" containerName="route-controller-manager" containerID="cri-o://22469caae6babc158c55600720b441215a82899befab2c19780ee5c7b4fc5c98" gracePeriod=30 Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.301896 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-r4qzm" event={"ID":"14c55450-dcee-4aee-8153-9ea2ff49b659","Type":"ContainerDied","Data":"765006db73285273cab077f8ba3ecf43432c0a444fde9b6e1a8b11a52d394487"} Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.303401 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x4qmh" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.326240 4886 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-snhqq"] Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.327411 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-snhqq" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.328175 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8lxpz" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.368683 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-snhqq"] Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.402268 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:28 crc kubenswrapper[4886]: E0314 08:31:28.402442 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:28.902413773 +0000 UTC m=+224.150865410 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.402626 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gspdc\" (UniqueName: \"kubernetes.io/projected/758a60d0-6132-4b23-8062-febd479f7fff-kube-api-access-gspdc\") pod \"community-operators-snhqq\" (UID: \"758a60d0-6132-4b23-8062-febd479f7fff\") " pod="openshift-marketplace/community-operators-snhqq" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.402664 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.402795 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/758a60d0-6132-4b23-8062-febd479f7fff-catalog-content\") pod \"community-operators-snhqq\" (UID: \"758a60d0-6132-4b23-8062-febd479f7fff\") " pod="openshift-marketplace/community-operators-snhqq" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.402843 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/758a60d0-6132-4b23-8062-febd479f7fff-utilities\") pod \"community-operators-snhqq\" (UID: \"758a60d0-6132-4b23-8062-febd479f7fff\") " pod="openshift-marketplace/community-operators-snhqq" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.425274 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-brtwn" Mar 14 08:31:28 crc kubenswrapper[4886]: E0314 08:31:28.425848 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:28.925814694 +0000 UTC m=+224.174266331 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.461284 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t4gbt"] Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.464445 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t4gbt" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.490599 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t4gbt"] Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.507966 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-r4qzm" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.524730 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.525008 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gspdc\" (UniqueName: \"kubernetes.io/projected/758a60d0-6132-4b23-8062-febd479f7fff-kube-api-access-gspdc\") pod \"community-operators-snhqq\" (UID: \"758a60d0-6132-4b23-8062-febd479f7fff\") " pod="openshift-marketplace/community-operators-snhqq" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.525066 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/758a60d0-6132-4b23-8062-febd479f7fff-catalog-content\") pod \"community-operators-snhqq\" (UID: \"758a60d0-6132-4b23-8062-febd479f7fff\") " pod="openshift-marketplace/community-operators-snhqq" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.525088 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/758a60d0-6132-4b23-8062-febd479f7fff-utilities\") pod \"community-operators-snhqq\" (UID: \"758a60d0-6132-4b23-8062-febd479f7fff\") " pod="openshift-marketplace/community-operators-snhqq" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.525524 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/758a60d0-6132-4b23-8062-febd479f7fff-utilities\") pod \"community-operators-snhqq\" (UID: 
\"758a60d0-6132-4b23-8062-febd479f7fff\") " pod="openshift-marketplace/community-operators-snhqq" Mar 14 08:31:28 crc kubenswrapper[4886]: E0314 08:31:28.525595 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:29.025578324 +0000 UTC m=+224.274029961 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.526242 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/758a60d0-6132-4b23-8062-febd479f7fff-catalog-content\") pod \"community-operators-snhqq\" (UID: \"758a60d0-6132-4b23-8062-febd479f7fff\") " pod="openshift-marketplace/community-operators-snhqq" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.601043 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gspdc\" (UniqueName: \"kubernetes.io/projected/758a60d0-6132-4b23-8062-febd479f7fff-kube-api-access-gspdc\") pod \"community-operators-snhqq\" (UID: \"758a60d0-6132-4b23-8062-febd479f7fff\") " pod="openshift-marketplace/community-operators-snhqq" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.626831 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzg52\" (UniqueName: 
\"kubernetes.io/projected/14c55450-dcee-4aee-8153-9ea2ff49b659-kube-api-access-pzg52\") pod \"14c55450-dcee-4aee-8153-9ea2ff49b659\" (UID: \"14c55450-dcee-4aee-8153-9ea2ff49b659\") " Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.626944 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14c55450-dcee-4aee-8153-9ea2ff49b659-serving-cert\") pod \"14c55450-dcee-4aee-8153-9ea2ff49b659\" (UID: \"14c55450-dcee-4aee-8153-9ea2ff49b659\") " Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.627190 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14c55450-dcee-4aee-8153-9ea2ff49b659-config\") pod \"14c55450-dcee-4aee-8153-9ea2ff49b659\" (UID: \"14c55450-dcee-4aee-8153-9ea2ff49b659\") " Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.627251 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14c55450-dcee-4aee-8153-9ea2ff49b659-client-ca\") pod \"14c55450-dcee-4aee-8153-9ea2ff49b659\" (UID: \"14c55450-dcee-4aee-8153-9ea2ff49b659\") " Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.627333 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14c55450-dcee-4aee-8153-9ea2ff49b659-proxy-ca-bundles\") pod \"14c55450-dcee-4aee-8153-9ea2ff49b659\" (UID: \"14c55450-dcee-4aee-8153-9ea2ff49b659\") " Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.627618 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.627648 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/837659fc-08c6-4ea6-8799-aa4297b20689-utilities\") pod \"certified-operators-t4gbt\" (UID: \"837659fc-08c6-4ea6-8799-aa4297b20689\") " pod="openshift-marketplace/certified-operators-t4gbt" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.627692 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7fk8\" (UniqueName: \"kubernetes.io/projected/837659fc-08c6-4ea6-8799-aa4297b20689-kube-api-access-n7fk8\") pod \"certified-operators-t4gbt\" (UID: \"837659fc-08c6-4ea6-8799-aa4297b20689\") " pod="openshift-marketplace/certified-operators-t4gbt" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.627712 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/837659fc-08c6-4ea6-8799-aa4297b20689-catalog-content\") pod \"certified-operators-t4gbt\" (UID: \"837659fc-08c6-4ea6-8799-aa4297b20689\") " pod="openshift-marketplace/certified-operators-t4gbt" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.629137 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14c55450-dcee-4aee-8153-9ea2ff49b659-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "14c55450-dcee-4aee-8153-9ea2ff49b659" (UID: "14c55450-dcee-4aee-8153-9ea2ff49b659"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:31:28 crc kubenswrapper[4886]: E0314 08:31:28.629540 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:29.129523275 +0000 UTC m=+224.377974922 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.629559 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14c55450-dcee-4aee-8153-9ea2ff49b659-config" (OuterVolumeSpecName: "config") pod "14c55450-dcee-4aee-8153-9ea2ff49b659" (UID: "14c55450-dcee-4aee-8153-9ea2ff49b659"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.629632 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14c55450-dcee-4aee-8153-9ea2ff49b659-client-ca" (OuterVolumeSpecName: "client-ca") pod "14c55450-dcee-4aee-8153-9ea2ff49b659" (UID: "14c55450-dcee-4aee-8153-9ea2ff49b659"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.646565 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14c55450-dcee-4aee-8153-9ea2ff49b659-kube-api-access-pzg52" (OuterVolumeSpecName: "kube-api-access-pzg52") pod "14c55450-dcee-4aee-8153-9ea2ff49b659" (UID: "14c55450-dcee-4aee-8153-9ea2ff49b659"). InnerVolumeSpecName "kube-api-access-pzg52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.654617 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14c55450-dcee-4aee-8153-9ea2ff49b659-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "14c55450-dcee-4aee-8153-9ea2ff49b659" (UID: "14c55450-dcee-4aee-8153-9ea2ff49b659"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.657097 4886 patch_prober.go:28] interesting pod/router-default-5444994796-8lqwb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 08:31:28 crc kubenswrapper[4886]: [-]has-synced failed: reason withheld Mar 14 08:31:28 crc kubenswrapper[4886]: [+]process-running ok Mar 14 08:31:28 crc kubenswrapper[4886]: healthz check failed Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.657187 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8lqwb" podUID="5fb33ac2-d4aa-49b0-9007-33af11834a96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.731335 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.731687 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7fk8\" (UniqueName: \"kubernetes.io/projected/837659fc-08c6-4ea6-8799-aa4297b20689-kube-api-access-n7fk8\") pod \"certified-operators-t4gbt\" (UID: \"837659fc-08c6-4ea6-8799-aa4297b20689\") " pod="openshift-marketplace/certified-operators-t4gbt" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.731712 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/837659fc-08c6-4ea6-8799-aa4297b20689-catalog-content\") pod \"certified-operators-t4gbt\" (UID: \"837659fc-08c6-4ea6-8799-aa4297b20689\") " pod="openshift-marketplace/certified-operators-t4gbt" Mar 14 08:31:28 crc kubenswrapper[4886]: E0314 08:31:28.731836 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:29.231800627 +0000 UTC m=+224.480252264 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.732034 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.732071 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/837659fc-08c6-4ea6-8799-aa4297b20689-utilities\") pod \"certified-operators-t4gbt\" (UID: \"837659fc-08c6-4ea6-8799-aa4297b20689\") " pod="openshift-marketplace/certified-operators-t4gbt" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.732186 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/837659fc-08c6-4ea6-8799-aa4297b20689-catalog-content\") pod \"certified-operators-t4gbt\" (UID: \"837659fc-08c6-4ea6-8799-aa4297b20689\") " pod="openshift-marketplace/certified-operators-t4gbt" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.732637 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzg52\" (UniqueName: \"kubernetes.io/projected/14c55450-dcee-4aee-8153-9ea2ff49b659-kube-api-access-pzg52\") on node \"crc\" DevicePath \"\"" Mar 14 08:31:28 crc kubenswrapper[4886]: 
I0314 08:31:28.732680 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14c55450-dcee-4aee-8153-9ea2ff49b659-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:31:28 crc kubenswrapper[4886]: E0314 08:31:28.732915 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:29.232908039 +0000 UTC m=+224.481359676 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.732943 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14c55450-dcee-4aee-8153-9ea2ff49b659-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.732957 4886 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14c55450-dcee-4aee-8153-9ea2ff49b659-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.732967 4886 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14c55450-dcee-4aee-8153-9ea2ff49b659-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.733378 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/837659fc-08c6-4ea6-8799-aa4297b20689-utilities\") pod \"certified-operators-t4gbt\" (UID: \"837659fc-08c6-4ea6-8799-aa4297b20689\") " pod="openshift-marketplace/certified-operators-t4gbt" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.769520 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7fk8\" (UniqueName: \"kubernetes.io/projected/837659fc-08c6-4ea6-8799-aa4297b20689-kube-api-access-n7fk8\") pod \"certified-operators-t4gbt\" (UID: \"837659fc-08c6-4ea6-8799-aa4297b20689\") " pod="openshift-marketplace/certified-operators-t4gbt" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.788312 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-snhqq" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.822291 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t4gbt" Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.838810 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:28 crc kubenswrapper[4886]: E0314 08:31:28.839309 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:29.339258158 +0000 UTC m=+224.587709795 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:28 crc kubenswrapper[4886]: I0314 08:31:28.948995 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:28 crc kubenswrapper[4886]: E0314 08:31:28.949513 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:29.449497329 +0000 UTC m=+224.697948966 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.031929 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.053855 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:29 crc kubenswrapper[4886]: E0314 08:31:29.055704 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:29.555683374 +0000 UTC m=+224.804135011 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.156897 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kg7k\" (UniqueName: \"kubernetes.io/projected/5a8def3c-80e8-4f81-8518-202af1613e6f-kube-api-access-4kg7k\") pod \"5a8def3c-80e8-4f81-8518-202af1613e6f\" (UID: \"5a8def3c-80e8-4f81-8518-202af1613e6f\") " Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.157229 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a8def3c-80e8-4f81-8518-202af1613e6f-config\") pod 
\"5a8def3c-80e8-4f81-8518-202af1613e6f\" (UID: \"5a8def3c-80e8-4f81-8518-202af1613e6f\") " Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.157261 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a8def3c-80e8-4f81-8518-202af1613e6f-serving-cert\") pod \"5a8def3c-80e8-4f81-8518-202af1613e6f\" (UID: \"5a8def3c-80e8-4f81-8518-202af1613e6f\") " Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.157296 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a8def3c-80e8-4f81-8518-202af1613e6f-client-ca\") pod \"5a8def3c-80e8-4f81-8518-202af1613e6f\" (UID: \"5a8def3c-80e8-4f81-8518-202af1613e6f\") " Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.157690 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.158793 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a8def3c-80e8-4f81-8518-202af1613e6f-client-ca" (OuterVolumeSpecName: "client-ca") pod "5a8def3c-80e8-4f81-8518-202af1613e6f" (UID: "5a8def3c-80e8-4f81-8518-202af1613e6f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.159351 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a8def3c-80e8-4f81-8518-202af1613e6f-config" (OuterVolumeSpecName: "config") pod "5a8def3c-80e8-4f81-8518-202af1613e6f" (UID: "5a8def3c-80e8-4f81-8518-202af1613e6f"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:31:29 crc kubenswrapper[4886]: E0314 08:31:29.159995 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:29.659977064 +0000 UTC m=+224.908428881 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.175041 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a8def3c-80e8-4f81-8518-202af1613e6f-kube-api-access-4kg7k" (OuterVolumeSpecName: "kube-api-access-4kg7k") pod "5a8def3c-80e8-4f81-8518-202af1613e6f" (UID: "5a8def3c-80e8-4f81-8518-202af1613e6f"). InnerVolumeSpecName "kube-api-access-4kg7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.176479 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a8def3c-80e8-4f81-8518-202af1613e6f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5a8def3c-80e8-4f81-8518-202af1613e6f" (UID: "5a8def3c-80e8-4f81-8518-202af1613e6f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.259013 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:29 crc kubenswrapper[4886]: E0314 08:31:29.259369 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:29.759345424 +0000 UTC m=+225.007797061 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.259535 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kg7k\" (UniqueName: \"kubernetes.io/projected/5a8def3c-80e8-4f81-8518-202af1613e6f-kube-api-access-4kg7k\") on node \"crc\" DevicePath \"\"" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.259549 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a8def3c-80e8-4f81-8518-202af1613e6f-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.259558 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5a8def3c-80e8-4f81-8518-202af1613e6f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.259566 4886 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a8def3c-80e8-4f81-8518-202af1613e6f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.280813 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dk6zm"] Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.283602 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.288257 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-brtwn"] Mar 14 08:31:29 crc kubenswrapper[4886]: W0314 08:31:29.294825 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaeaec6eb_91cb_4e68_807f_994b4e9df360.slice/crio-de7195b6639958d8f38e9c07208bd9032aee68af4f66d36ee6b2923bd3222336 WatchSource:0}: Error finding container de7195b6639958d8f38e9c07208bd9032aee68af4f66d36ee6b2923bd3222336: Status 404 returned error can't find the container with id de7195b6639958d8f38e9c07208bd9032aee68af4f66d36ee6b2923bd3222336 Mar 14 08:31:29 crc kubenswrapper[4886]: W0314 08:31:29.297771 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod333059fe_3e95_4e08_b70e_d7d95e1ed279.slice/crio-78278b60f8506a5adf01d07fb614619d230bf1489aba2cecfd9c1997eb1dab74 WatchSource:0}: Error finding container 78278b60f8506a5adf01d07fb614619d230bf1489aba2cecfd9c1997eb1dab74: Status 404 returned error can't find the container with id 78278b60f8506a5adf01d07fb614619d230bf1489aba2cecfd9c1997eb1dab74 Mar 14 08:31:29 crc 
kubenswrapper[4886]: I0314 08:31:29.318861 4886 generic.go:334] "Generic (PLEG): container finished" podID="5a8def3c-80e8-4f81-8518-202af1613e6f" containerID="22469caae6babc158c55600720b441215a82899befab2c19780ee5c7b4fc5c98" exitCode=0 Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.318964 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx" event={"ID":"5a8def3c-80e8-4f81-8518-202af1613e6f","Type":"ContainerDied","Data":"22469caae6babc158c55600720b441215a82899befab2c19780ee5c7b4fc5c98"} Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.319006 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx" event={"ID":"5a8def3c-80e8-4f81-8518-202af1613e6f","Type":"ContainerDied","Data":"c0cfe7744c88a113dbadf53a337b50e1e245c439ff304176198462c0dc2ef35b"} Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.319052 4886 scope.go:117] "RemoveContainer" containerID="22469caae6babc158c55600720b441215a82899befab2c19780ee5c7b4fc5c98" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.319518 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.332859 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-r4qzm" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.332982 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-r4qzm" event={"ID":"14c55450-dcee-4aee-8153-9ea2ff49b659","Type":"ContainerDied","Data":"f5f8d949c25ff10ebe26e5d396230a24ae728c8917ef0f16f67b40064b20aabf"} Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.345884 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dk6zm" event={"ID":"333059fe-3e95-4e08-b70e-d7d95e1ed279","Type":"ContainerStarted","Data":"78278b60f8506a5adf01d07fb614619d230bf1489aba2cecfd9c1997eb1dab74"} Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.353644 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx"] Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.354082 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-brtwn" event={"ID":"aeaec6eb-91cb-4e68-807f-994b4e9df360","Type":"ContainerStarted","Data":"de7195b6639958d8f38e9c07208bd9032aee68af4f66d36ee6b2923bd3222336"} Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.360870 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dknmx"] Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.361771 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:29 crc kubenswrapper[4886]: E0314 08:31:29.362519 
4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:29.862494031 +0000 UTC m=+225.110945668 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.370429 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-r4qzm"] Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.377642 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-r4qzm"] Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.382282 4886 scope.go:117] "RemoveContainer" containerID="22469caae6babc158c55600720b441215a82899befab2c19780ee5c7b4fc5c98" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.384262 4886 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 14 08:31:29 crc kubenswrapper[4886]: E0314 08:31:29.390498 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22469caae6babc158c55600720b441215a82899befab2c19780ee5c7b4fc5c98\": container with ID starting with 22469caae6babc158c55600720b441215a82899befab2c19780ee5c7b4fc5c98 not found: ID does not exist" 
containerID="22469caae6babc158c55600720b441215a82899befab2c19780ee5c7b4fc5c98" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.390600 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22469caae6babc158c55600720b441215a82899befab2c19780ee5c7b4fc5c98"} err="failed to get container status \"22469caae6babc158c55600720b441215a82899befab2c19780ee5c7b4fc5c98\": rpc error: code = NotFound desc = could not find container \"22469caae6babc158c55600720b441215a82899befab2c19780ee5c7b4fc5c98\": container with ID starting with 22469caae6babc158c55600720b441215a82899befab2c19780ee5c7b4fc5c98 not found: ID does not exist" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.390663 4886 scope.go:117] "RemoveContainer" containerID="765006db73285273cab077f8ba3ecf43432c0a444fde9b6e1a8b11a52d394487" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.392805 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vdd8t" event={"ID":"86449941-3f6f-4c02-a717-ae9d48a7c464","Type":"ContainerStarted","Data":"23315f23f30399b2641bb387399517277a0629ec42c126431cfeb55402eafded"} Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.392840 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vdd8t" event={"ID":"86449941-3f6f-4c02-a717-ae9d48a7c464","Type":"ContainerStarted","Data":"b78656f59138b2ac8eb6fa6ec0db0f91521524a7f214474032a4e5551c95564a"} Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.404499 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-snhqq"] Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.416496 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t4gbt"] Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.434037 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="14c55450-dcee-4aee-8153-9ea2ff49b659" path="/var/lib/kubelet/pods/14c55450-dcee-4aee-8153-9ea2ff49b659/volumes" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.434863 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a8def3c-80e8-4f81-8518-202af1613e6f" path="/var/lib/kubelet/pods/5a8def3c-80e8-4f81-8518-202af1613e6f/volumes" Mar 14 08:31:29 crc kubenswrapper[4886]: W0314 08:31:29.437629 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod758a60d0_6132_4b23_8062_febd479f7fff.slice/crio-3e3ff19467f1b986bd0bbd85b7d6035a94ec405b102d536647c6bbc6170aa850 WatchSource:0}: Error finding container 3e3ff19467f1b986bd0bbd85b7d6035a94ec405b102d536647c6bbc6170aa850: Status 404 returned error can't find the container with id 3e3ff19467f1b986bd0bbd85b7d6035a94ec405b102d536647c6bbc6170aa850 Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.462795 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:29 crc kubenswrapper[4886]: E0314 08:31:29.463270 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-14 08:31:29.96325196 +0000 UTC m=+225.211703597 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.546100 4886 ???:1] "http: TLS handshake error from 192.168.126.11:47928: no serving certificate available for the kubelet" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.565240 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:29 crc kubenswrapper[4886]: E0314 08:31:29.569901 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-14 08:31:30.069862037 +0000 UTC m=+225.318313674 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wscrd" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.601426 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.610162 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-gdqjc" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.634854 4886 patch_prober.go:28] interesting pod/router-default-5444994796-8lqwb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 08:31:29 crc kubenswrapper[4886]: [-]has-synced failed: reason withheld Mar 14 08:31:29 crc kubenswrapper[4886]: [+]process-running ok Mar 14 08:31:29 crc kubenswrapper[4886]: healthz check failed Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.634907 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8lqwb" podUID="5fb33ac2-d4aa-49b0-9007-33af11834a96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.636490 4886 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-14T08:31:29.384276686Z","Handler":null,"Name":""} Mar 14 08:31:29 crc 
kubenswrapper[4886]: I0314 08:31:29.643035 4886 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.643293 4886 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.668700 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.716036 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.717938 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6fc8bf5689-szgl2"] Mar 14 08:31:29 crc kubenswrapper[4886]: E0314 08:31:29.718265 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14c55450-dcee-4aee-8153-9ea2ff49b659" containerName="controller-manager" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.718284 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="14c55450-dcee-4aee-8153-9ea2ff49b659" containerName="controller-manager" Mar 14 08:31:29 crc kubenswrapper[4886]: E0314 08:31:29.718299 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a8def3c-80e8-4f81-8518-202af1613e6f" containerName="route-controller-manager" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.718308 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a8def3c-80e8-4f81-8518-202af1613e6f" containerName="route-controller-manager" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.718428 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a8def3c-80e8-4f81-8518-202af1613e6f" containerName="route-controller-manager" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.718442 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="14c55450-dcee-4aee-8153-9ea2ff49b659" containerName="controller-manager" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.718888 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6fc8bf5689-szgl2" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.722951 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz"] Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.723605 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.736613 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.742157 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.742346 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.742587 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.742776 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.742894 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.742993 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.745221 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.745333 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.742167 4886 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"client-ca" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.745589 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.745632 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.746016 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.749872 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6fc8bf5689-szgl2"] Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.772021 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.772902 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz"] Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.801314 4886 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.801349 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.863050 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fpr95"] Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.864681 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fpr95" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.868851 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fpr95"] Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.869339 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.873987 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd789bc6-70ed-439a-a3a6-83984147f97b-client-ca\") pod \"controller-manager-6fc8bf5689-szgl2\" (UID: \"bd789bc6-70ed-439a-a3a6-83984147f97b\") " pod="openshift-controller-manager/controller-manager-6fc8bf5689-szgl2" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.874029 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/845227fa-82cd-4957-92a9-164d2f2bba1d-config\") pod \"route-controller-manager-5756dc695d-28vwz\" (UID: \"845227fa-82cd-4957-92a9-164d2f2bba1d\") " pod="openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.874076 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd789bc6-70ed-439a-a3a6-83984147f97b-proxy-ca-bundles\") pod \"controller-manager-6fc8bf5689-szgl2\" (UID: \"bd789bc6-70ed-439a-a3a6-83984147f97b\") " pod="openshift-controller-manager/controller-manager-6fc8bf5689-szgl2" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.874110 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6fvl\" (UniqueName: \"kubernetes.io/projected/bd789bc6-70ed-439a-a3a6-83984147f97b-kube-api-access-b6fvl\") pod \"controller-manager-6fc8bf5689-szgl2\" (UID: \"bd789bc6-70ed-439a-a3a6-83984147f97b\") " pod="openshift-controller-manager/controller-manager-6fc8bf5689-szgl2" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.874151 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/845227fa-82cd-4957-92a9-164d2f2bba1d-client-ca\") pod \"route-controller-manager-5756dc695d-28vwz\" (UID: \"845227fa-82cd-4957-92a9-164d2f2bba1d\") " pod="openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.874461 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjs5f\" (UniqueName: \"kubernetes.io/projected/845227fa-82cd-4957-92a9-164d2f2bba1d-kube-api-access-wjs5f\") pod \"route-controller-manager-5756dc695d-28vwz\" (UID: \"845227fa-82cd-4957-92a9-164d2f2bba1d\") " 
pod="openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.874495 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd789bc6-70ed-439a-a3a6-83984147f97b-serving-cert\") pod \"controller-manager-6fc8bf5689-szgl2\" (UID: \"bd789bc6-70ed-439a-a3a6-83984147f97b\") " pod="openshift-controller-manager/controller-manager-6fc8bf5689-szgl2" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.874598 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd789bc6-70ed-439a-a3a6-83984147f97b-config\") pod \"controller-manager-6fc8bf5689-szgl2\" (UID: \"bd789bc6-70ed-439a-a3a6-83984147f97b\") " pod="openshift-controller-manager/controller-manager-6fc8bf5689-szgl2" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.874628 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/845227fa-82cd-4957-92a9-164d2f2bba1d-serving-cert\") pod \"route-controller-manager-5756dc695d-28vwz\" (UID: \"845227fa-82cd-4957-92a9-164d2f2bba1d\") " pod="openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.892293 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wscrd\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.976452 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-wjs5f\" (UniqueName: \"kubernetes.io/projected/845227fa-82cd-4957-92a9-164d2f2bba1d-kube-api-access-wjs5f\") pod \"route-controller-manager-5756dc695d-28vwz\" (UID: \"845227fa-82cd-4957-92a9-164d2f2bba1d\") " pod="openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.976499 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd789bc6-70ed-439a-a3a6-83984147f97b-serving-cert\") pod \"controller-manager-6fc8bf5689-szgl2\" (UID: \"bd789bc6-70ed-439a-a3a6-83984147f97b\") " pod="openshift-controller-manager/controller-manager-6fc8bf5689-szgl2" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.976535 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd789bc6-70ed-439a-a3a6-83984147f97b-config\") pod \"controller-manager-6fc8bf5689-szgl2\" (UID: \"bd789bc6-70ed-439a-a3a6-83984147f97b\") " pod="openshift-controller-manager/controller-manager-6fc8bf5689-szgl2" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.976560 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/845227fa-82cd-4957-92a9-164d2f2bba1d-serving-cert\") pod \"route-controller-manager-5756dc695d-28vwz\" (UID: \"845227fa-82cd-4957-92a9-164d2f2bba1d\") " pod="openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.976584 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd789bc6-70ed-439a-a3a6-83984147f97b-client-ca\") pod \"controller-manager-6fc8bf5689-szgl2\" (UID: \"bd789bc6-70ed-439a-a3a6-83984147f97b\") " pod="openshift-controller-manager/controller-manager-6fc8bf5689-szgl2" Mar 14 08:31:29 crc 
kubenswrapper[4886]: I0314 08:31:29.976604 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/845227fa-82cd-4957-92a9-164d2f2bba1d-config\") pod \"route-controller-manager-5756dc695d-28vwz\" (UID: \"845227fa-82cd-4957-92a9-164d2f2bba1d\") " pod="openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.976736 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98f142ad-f9c5-41ee-81ec-632938796964-catalog-content\") pod \"redhat-marketplace-fpr95\" (UID: \"98f142ad-f9c5-41ee-81ec-632938796964\") " pod="openshift-marketplace/redhat-marketplace-fpr95" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.976781 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb7d2\" (UniqueName: \"kubernetes.io/projected/98f142ad-f9c5-41ee-81ec-632938796964-kube-api-access-rb7d2\") pod \"redhat-marketplace-fpr95\" (UID: \"98f142ad-f9c5-41ee-81ec-632938796964\") " pod="openshift-marketplace/redhat-marketplace-fpr95" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.976802 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd789bc6-70ed-439a-a3a6-83984147f97b-proxy-ca-bundles\") pod \"controller-manager-6fc8bf5689-szgl2\" (UID: \"bd789bc6-70ed-439a-a3a6-83984147f97b\") " pod="openshift-controller-manager/controller-manager-6fc8bf5689-szgl2" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.976824 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6fvl\" (UniqueName: \"kubernetes.io/projected/bd789bc6-70ed-439a-a3a6-83984147f97b-kube-api-access-b6fvl\") pod \"controller-manager-6fc8bf5689-szgl2\" (UID: 
\"bd789bc6-70ed-439a-a3a6-83984147f97b\") " pod="openshift-controller-manager/controller-manager-6fc8bf5689-szgl2" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.976846 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/845227fa-82cd-4957-92a9-164d2f2bba1d-client-ca\") pod \"route-controller-manager-5756dc695d-28vwz\" (UID: \"845227fa-82cd-4957-92a9-164d2f2bba1d\") " pod="openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.976865 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98f142ad-f9c5-41ee-81ec-632938796964-utilities\") pod \"redhat-marketplace-fpr95\" (UID: \"98f142ad-f9c5-41ee-81ec-632938796964\") " pod="openshift-marketplace/redhat-marketplace-fpr95" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.977710 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd789bc6-70ed-439a-a3a6-83984147f97b-client-ca\") pod \"controller-manager-6fc8bf5689-szgl2\" (UID: \"bd789bc6-70ed-439a-a3a6-83984147f97b\") " pod="openshift-controller-manager/controller-manager-6fc8bf5689-szgl2" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.978161 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/845227fa-82cd-4957-92a9-164d2f2bba1d-config\") pod \"route-controller-manager-5756dc695d-28vwz\" (UID: \"845227fa-82cd-4957-92a9-164d2f2bba1d\") " pod="openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.978404 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd789bc6-70ed-439a-a3a6-83984147f97b-config\") 
pod \"controller-manager-6fc8bf5689-szgl2\" (UID: \"bd789bc6-70ed-439a-a3a6-83984147f97b\") " pod="openshift-controller-manager/controller-manager-6fc8bf5689-szgl2" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.978974 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/845227fa-82cd-4957-92a9-164d2f2bba1d-client-ca\") pod \"route-controller-manager-5756dc695d-28vwz\" (UID: \"845227fa-82cd-4957-92a9-164d2f2bba1d\") " pod="openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.979196 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd789bc6-70ed-439a-a3a6-83984147f97b-proxy-ca-bundles\") pod \"controller-manager-6fc8bf5689-szgl2\" (UID: \"bd789bc6-70ed-439a-a3a6-83984147f97b\") " pod="openshift-controller-manager/controller-manager-6fc8bf5689-szgl2" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.986884 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/845227fa-82cd-4957-92a9-164d2f2bba1d-serving-cert\") pod \"route-controller-manager-5756dc695d-28vwz\" (UID: \"845227fa-82cd-4957-92a9-164d2f2bba1d\") " pod="openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz" Mar 14 08:31:29 crc kubenswrapper[4886]: I0314 08:31:29.994805 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd789bc6-70ed-439a-a3a6-83984147f97b-serving-cert\") pod \"controller-manager-6fc8bf5689-szgl2\" (UID: \"bd789bc6-70ed-439a-a3a6-83984147f97b\") " pod="openshift-controller-manager/controller-manager-6fc8bf5689-szgl2" Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.000861 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjs5f\" 
(UniqueName: \"kubernetes.io/projected/845227fa-82cd-4957-92a9-164d2f2bba1d-kube-api-access-wjs5f\") pod \"route-controller-manager-5756dc695d-28vwz\" (UID: \"845227fa-82cd-4957-92a9-164d2f2bba1d\") " pod="openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz" Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.000948 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6fvl\" (UniqueName: \"kubernetes.io/projected/bd789bc6-70ed-439a-a3a6-83984147f97b-kube-api-access-b6fvl\") pod \"controller-manager-6fc8bf5689-szgl2\" (UID: \"bd789bc6-70ed-439a-a3a6-83984147f97b\") " pod="openshift-controller-manager/controller-manager-6fc8bf5689-szgl2" Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.069437 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz" Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.069503 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6fc8bf5689-szgl2" Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.077648 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98f142ad-f9c5-41ee-81ec-632938796964-utilities\") pod \"redhat-marketplace-fpr95\" (UID: \"98f142ad-f9c5-41ee-81ec-632938796964\") " pod="openshift-marketplace/redhat-marketplace-fpr95" Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.077978 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98f142ad-f9c5-41ee-81ec-632938796964-catalog-content\") pod \"redhat-marketplace-fpr95\" (UID: \"98f142ad-f9c5-41ee-81ec-632938796964\") " pod="openshift-marketplace/redhat-marketplace-fpr95" Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.078017 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb7d2\" (UniqueName: \"kubernetes.io/projected/98f142ad-f9c5-41ee-81ec-632938796964-kube-api-access-rb7d2\") pod \"redhat-marketplace-fpr95\" (UID: \"98f142ad-f9c5-41ee-81ec-632938796964\") " pod="openshift-marketplace/redhat-marketplace-fpr95" Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.078052 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98f142ad-f9c5-41ee-81ec-632938796964-utilities\") pod \"redhat-marketplace-fpr95\" (UID: \"98f142ad-f9c5-41ee-81ec-632938796964\") " pod="openshift-marketplace/redhat-marketplace-fpr95" Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.078279 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98f142ad-f9c5-41ee-81ec-632938796964-catalog-content\") pod \"redhat-marketplace-fpr95\" (UID: \"98f142ad-f9c5-41ee-81ec-632938796964\") " 
pod="openshift-marketplace/redhat-marketplace-fpr95" Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.091572 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.099152 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb7d2\" (UniqueName: \"kubernetes.io/projected/98f142ad-f9c5-41ee-81ec-632938796964-kube-api-access-rb7d2\") pod \"redhat-marketplace-fpr95\" (UID: \"98f142ad-f9c5-41ee-81ec-632938796964\") " pod="openshift-marketplace/redhat-marketplace-fpr95" Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.236960 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vnjw4"] Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.238074 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vnjw4" Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.255034 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vnjw4"] Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.264236 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fpr95" Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.382166 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwbwq\" (UniqueName: \"kubernetes.io/projected/9306248c-2771-4cb6-bdd4-f8628c2b6428-kube-api-access-zwbwq\") pod \"redhat-marketplace-vnjw4\" (UID: \"9306248c-2771-4cb6-bdd4-f8628c2b6428\") " pod="openshift-marketplace/redhat-marketplace-vnjw4" Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.382233 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9306248c-2771-4cb6-bdd4-f8628c2b6428-utilities\") pod \"redhat-marketplace-vnjw4\" (UID: \"9306248c-2771-4cb6-bdd4-f8628c2b6428\") " pod="openshift-marketplace/redhat-marketplace-vnjw4" Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.382260 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9306248c-2771-4cb6-bdd4-f8628c2b6428-catalog-content\") pod \"redhat-marketplace-vnjw4\" (UID: \"9306248c-2771-4cb6-bdd4-f8628c2b6428\") " pod="openshift-marketplace/redhat-marketplace-vnjw4" Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.423315 4886 generic.go:334] "Generic (PLEG): container finished" podID="333059fe-3e95-4e08-b70e-d7d95e1ed279" containerID="01a4e1c02a75c867b930eb8d20a4f916e28c505fda422a7af3c01dde2b8d77fd" exitCode=0 Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.423394 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dk6zm" event={"ID":"333059fe-3e95-4e08-b70e-d7d95e1ed279","Type":"ContainerDied","Data":"01a4e1c02a75c867b930eb8d20a4f916e28c505fda422a7af3c01dde2b8d77fd"} Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.426335 4886 generic.go:334] "Generic (PLEG): 
container finished" podID="aeaec6eb-91cb-4e68-807f-994b4e9df360" containerID="d6d672f06fd68000fc57fdeee67e4ec0b65c7a90f9ce5e842fd8578a517d9fd3" exitCode=0 Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.426380 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-brtwn" event={"ID":"aeaec6eb-91cb-4e68-807f-994b4e9df360","Type":"ContainerDied","Data":"d6d672f06fd68000fc57fdeee67e4ec0b65c7a90f9ce5e842fd8578a517d9fd3"} Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.430791 4886 generic.go:334] "Generic (PLEG): container finished" podID="837659fc-08c6-4ea6-8799-aa4297b20689" containerID="02f6f0e3d0573881cf8e838e35f2e366725f66967bf323d5a339d61a6c817d2b" exitCode=0 Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.430828 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4gbt" event={"ID":"837659fc-08c6-4ea6-8799-aa4297b20689","Type":"ContainerDied","Data":"02f6f0e3d0573881cf8e838e35f2e366725f66967bf323d5a339d61a6c817d2b"} Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.430844 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4gbt" event={"ID":"837659fc-08c6-4ea6-8799-aa4297b20689","Type":"ContainerStarted","Data":"663cbe6cd1053881b64be17e64ba7071aaf6ef0553582f93d804b9844e7b792c"} Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.433985 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vdd8t" event={"ID":"86449941-3f6f-4c02-a717-ae9d48a7c464","Type":"ContainerStarted","Data":"4235d67bac2a79da1e8f408ee205af3f6b1f36fef57952d89997b89bdf40ba26"} Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.437146 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"a3e06c71-44e4-476f-9f6c-96737f410dbf","Type":"ContainerStarted","Data":"67d6ffd8e31b79b383e20a50556d40a5b4b6b0b4a3f9b5ba30e449a37ad50909"} Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.437169 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a3e06c71-44e4-476f-9f6c-96737f410dbf","Type":"ContainerStarted","Data":"2f4e78259e3278f7c47722ca9b5164c86d0c019dae0f662b62adc3a7c1fb36f1"} Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.455503 4886 generic.go:334] "Generic (PLEG): container finished" podID="758a60d0-6132-4b23-8062-febd479f7fff" containerID="941a56894f8416dbe65667526b6a8c49a9e984f91e0be3c18790f4ab1e9de2fe" exitCode=0 Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.455955 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snhqq" event={"ID":"758a60d0-6132-4b23-8062-febd479f7fff","Type":"ContainerDied","Data":"941a56894f8416dbe65667526b6a8c49a9e984f91e0be3c18790f4ab1e9de2fe"} Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.456060 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snhqq" event={"ID":"758a60d0-6132-4b23-8062-febd479f7fff","Type":"ContainerStarted","Data":"3e3ff19467f1b986bd0bbd85b7d6035a94ec405b102d536647c6bbc6170aa850"} Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.469753 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.469734569 podStartE2EDuration="3.469734569s" podCreationTimestamp="2026-03-14 08:31:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:30.466813206 +0000 UTC m=+225.715264843" watchObservedRunningTime="2026-03-14 08:31:30.469734569 +0000 UTC m=+225.718186206" Mar 14 08:31:30 crc 
kubenswrapper[4886]: I0314 08:31:30.484795 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwbwq\" (UniqueName: \"kubernetes.io/projected/9306248c-2771-4cb6-bdd4-f8628c2b6428-kube-api-access-zwbwq\") pod \"redhat-marketplace-vnjw4\" (UID: \"9306248c-2771-4cb6-bdd4-f8628c2b6428\") " pod="openshift-marketplace/redhat-marketplace-vnjw4" Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.484876 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9306248c-2771-4cb6-bdd4-f8628c2b6428-utilities\") pod \"redhat-marketplace-vnjw4\" (UID: \"9306248c-2771-4cb6-bdd4-f8628c2b6428\") " pod="openshift-marketplace/redhat-marketplace-vnjw4" Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.484898 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9306248c-2771-4cb6-bdd4-f8628c2b6428-catalog-content\") pod \"redhat-marketplace-vnjw4\" (UID: \"9306248c-2771-4cb6-bdd4-f8628c2b6428\") " pod="openshift-marketplace/redhat-marketplace-vnjw4" Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.492502 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9306248c-2771-4cb6-bdd4-f8628c2b6428-catalog-content\") pod \"redhat-marketplace-vnjw4\" (UID: \"9306248c-2771-4cb6-bdd4-f8628c2b6428\") " pod="openshift-marketplace/redhat-marketplace-vnjw4" Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.496537 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9306248c-2771-4cb6-bdd4-f8628c2b6428-utilities\") pod \"redhat-marketplace-vnjw4\" (UID: \"9306248c-2771-4cb6-bdd4-f8628c2b6428\") " pod="openshift-marketplace/redhat-marketplace-vnjw4" Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.503545 4886 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-vdd8t" podStartSLOduration=12.503527488 podStartE2EDuration="12.503527488s" podCreationTimestamp="2026-03-14 08:31:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:30.499555044 +0000 UTC m=+225.748006681" watchObservedRunningTime="2026-03-14 08:31:30.503527488 +0000 UTC m=+225.751979125" Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.509920 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwbwq\" (UniqueName: \"kubernetes.io/projected/9306248c-2771-4cb6-bdd4-f8628c2b6428-kube-api-access-zwbwq\") pod \"redhat-marketplace-vnjw4\" (UID: \"9306248c-2771-4cb6-bdd4-f8628c2b6428\") " pod="openshift-marketplace/redhat-marketplace-vnjw4" Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.511938 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6fc8bf5689-szgl2"] Mar 14 08:31:30 crc kubenswrapper[4886]: W0314 08:31:30.515059 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd789bc6_70ed_439a_a3a6_83984147f97b.slice/crio-1f43f1a3c5fc8606ab5d1674447b39ec66c21b3571432d37bd7531ea3dd78b68 WatchSource:0}: Error finding container 1f43f1a3c5fc8606ab5d1674447b39ec66c21b3571432d37bd7531ea3dd78b68: Status 404 returned error can't find the container with id 1f43f1a3c5fc8606ab5d1674447b39ec66c21b3571432d37bd7531ea3dd78b68 Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.557968 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vnjw4" Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.643562 4886 patch_prober.go:28] interesting pod/router-default-5444994796-8lqwb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 08:31:30 crc kubenswrapper[4886]: [-]has-synced failed: reason withheld Mar 14 08:31:30 crc kubenswrapper[4886]: [+]process-running ok Mar 14 08:31:30 crc kubenswrapper[4886]: healthz check failed Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.644024 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8lqwb" podUID="5fb33ac2-d4aa-49b0-9007-33af11834a96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.689512 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz"] Mar 14 08:31:30 crc kubenswrapper[4886]: W0314 08:31:30.712725 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod845227fa_82cd_4957_92a9_164d2f2bba1d.slice/crio-36e02ff849abbf46660ff74eff69d58456cf87d67447007be53b2877e5292e73 WatchSource:0}: Error finding container 36e02ff849abbf46660ff74eff69d58456cf87d67447007be53b2877e5292e73: Status 404 returned error can't find the container with id 36e02ff849abbf46660ff74eff69d58456cf87d67447007be53b2877e5292e73 Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.817455 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fpr95"] Mar 14 08:31:30 crc kubenswrapper[4886]: I0314 08:31:30.820520 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-wscrd"] Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.004487 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vnjw4"] Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.246558 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6gfbq"] Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.248004 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6gfbq" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.272700 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.289807 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-wmcc2" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.289965 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-wmcc2" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.292185 4886 patch_prober.go:28] interesting pod/console-f9d7485db-wmcc2 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.292225 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-wmcc2" podUID="a312fb44-823b-44ec-8312-0d83b990e9cd" containerName="console" probeResult="failure" output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.292900 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-6gfbq"] Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.408355 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c355048-396b-4f00-8ff6-1ffff1d9d62c-utilities\") pod \"redhat-operators-6gfbq\" (UID: \"7c355048-396b-4f00-8ff6-1ffff1d9d62c\") " pod="openshift-marketplace/redhat-operators-6gfbq" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.408466 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlzjm\" (UniqueName: \"kubernetes.io/projected/7c355048-396b-4f00-8ff6-1ffff1d9d62c-kube-api-access-vlzjm\") pod \"redhat-operators-6gfbq\" (UID: \"7c355048-396b-4f00-8ff6-1ffff1d9d62c\") " pod="openshift-marketplace/redhat-operators-6gfbq" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.408504 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c355048-396b-4f00-8ff6-1ffff1d9d62c-catalog-content\") pod \"redhat-operators-6gfbq\" (UID: \"7c355048-396b-4f00-8ff6-1ffff1d9d62c\") " pod="openshift-marketplace/redhat-operators-6gfbq" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.458400 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.482532 4886 generic.go:334] "Generic (PLEG): container finished" podID="98f142ad-f9c5-41ee-81ec-632938796964" containerID="303bb5cda821a79eff730f4d7df823fd10323f9ff64a6c848b3af17dea088d70" exitCode=0 Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.482665 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fpr95" 
event={"ID":"98f142ad-f9c5-41ee-81ec-632938796964","Type":"ContainerDied","Data":"303bb5cda821a79eff730f4d7df823fd10323f9ff64a6c848b3af17dea088d70"} Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.482710 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fpr95" event={"ID":"98f142ad-f9c5-41ee-81ec-632938796964","Type":"ContainerStarted","Data":"7e2a5ebe2ad8e8f12833449807a247ced82971a38b19b39fb9b89952c232c182"} Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.485040 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" event={"ID":"cfbeb8db-2612-468d-8354-32ee6373f57e","Type":"ContainerStarted","Data":"f5ce5db1086d5a69af2c14f50e734cb03f96d132c43af7acf7f2195aead66915"} Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.485063 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" event={"ID":"cfbeb8db-2612-468d-8354-32ee6373f57e","Type":"ContainerStarted","Data":"d3ee87034513ebbef9d3bc79bbd7eca0d02de1190860b50d3a519b11e7b9a62b"} Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.485487 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.488178 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fc8bf5689-szgl2" event={"ID":"bd789bc6-70ed-439a-a3a6-83984147f97b","Type":"ContainerStarted","Data":"d8960ea2c1991ffdcc473f6be2459de8ae762563424a6cbea3338eec69eb202e"} Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.488200 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fc8bf5689-szgl2" 
event={"ID":"bd789bc6-70ed-439a-a3a6-83984147f97b","Type":"ContainerStarted","Data":"1f43f1a3c5fc8606ab5d1674447b39ec66c21b3571432d37bd7531ea3dd78b68"} Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.489403 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6fc8bf5689-szgl2" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.497364 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6fc8bf5689-szgl2" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.499594 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz" event={"ID":"845227fa-82cd-4957-92a9-164d2f2bba1d","Type":"ContainerStarted","Data":"c8349ddd9de63e74f232fc9a4ada283cff013e2f085119ee7c6eb129950a3aac"} Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.499634 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz" event={"ID":"845227fa-82cd-4957-92a9-164d2f2bba1d","Type":"ContainerStarted","Data":"36e02ff849abbf46660ff74eff69d58456cf87d67447007be53b2877e5292e73"} Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.500600 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.509718 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c355048-396b-4f00-8ff6-1ffff1d9d62c-utilities\") pod \"redhat-operators-6gfbq\" (UID: \"7c355048-396b-4f00-8ff6-1ffff1d9d62c\") " pod="openshift-marketplace/redhat-operators-6gfbq" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.509786 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-vlzjm\" (UniqueName: \"kubernetes.io/projected/7c355048-396b-4f00-8ff6-1ffff1d9d62c-kube-api-access-vlzjm\") pod \"redhat-operators-6gfbq\" (UID: \"7c355048-396b-4f00-8ff6-1ffff1d9d62c\") " pod="openshift-marketplace/redhat-operators-6gfbq" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.509828 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c355048-396b-4f00-8ff6-1ffff1d9d62c-catalog-content\") pod \"redhat-operators-6gfbq\" (UID: \"7c355048-396b-4f00-8ff6-1ffff1d9d62c\") " pod="openshift-marketplace/redhat-operators-6gfbq" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.510263 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c355048-396b-4f00-8ff6-1ffff1d9d62c-utilities\") pod \"redhat-operators-6gfbq\" (UID: \"7c355048-396b-4f00-8ff6-1ffff1d9d62c\") " pod="openshift-marketplace/redhat-operators-6gfbq" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.515560 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c355048-396b-4f00-8ff6-1ffff1d9d62c-catalog-content\") pod \"redhat-operators-6gfbq\" (UID: \"7c355048-396b-4f00-8ff6-1ffff1d9d62c\") " pod="openshift-marketplace/redhat-operators-6gfbq" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.516487 4886 generic.go:334] "Generic (PLEG): container finished" podID="9306248c-2771-4cb6-bdd4-f8628c2b6428" containerID="70a9b00fa6771a2ac38df05c1f97346612bae19c4e3eef581dd325d3e8fdabdf" exitCode=0 Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.516682 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnjw4" event={"ID":"9306248c-2771-4cb6-bdd4-f8628c2b6428","Type":"ContainerDied","Data":"70a9b00fa6771a2ac38df05c1f97346612bae19c4e3eef581dd325d3e8fdabdf"} Mar 14 
08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.516741 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnjw4" event={"ID":"9306248c-2771-4cb6-bdd4-f8628c2b6428","Type":"ContainerStarted","Data":"364d714d099fe1f3cf62a5d24d6fee366c8e77efbd1718e4874a322280cfb654"} Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.519664 4886 generic.go:334] "Generic (PLEG): container finished" podID="a3e06c71-44e4-476f-9f6c-96737f410dbf" containerID="67d6ffd8e31b79b383e20a50556d40a5b4b6b0b4a3f9b5ba30e449a37ad50909" exitCode=0 Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.520434 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a3e06c71-44e4-476f-9f6c-96737f410dbf","Type":"ContainerDied","Data":"67d6ffd8e31b79b383e20a50556d40a5b4b6b0b4a3f9b5ba30e449a37ad50909"} Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.537382 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlzjm\" (UniqueName: \"kubernetes.io/projected/7c355048-396b-4f00-8ff6-1ffff1d9d62c-kube-api-access-vlzjm\") pod \"redhat-operators-6gfbq\" (UID: \"7c355048-396b-4f00-8ff6-1ffff1d9d62c\") " pod="openshift-marketplace/redhat-operators-6gfbq" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.560542 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6gfbq" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.582595 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6fc8bf5689-szgl2" podStartSLOduration=3.582573907 podStartE2EDuration="3.582573907s" podCreationTimestamp="2026-03-14 08:31:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:31.533354326 +0000 UTC m=+226.781805963" watchObservedRunningTime="2026-03-14 08:31:31.582573907 +0000 UTC m=+226.831025544" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.583100 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" podStartSLOduration=169.583095912 podStartE2EDuration="2m49.583095912s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:31.580339603 +0000 UTC m=+226.828791240" watchObservedRunningTime="2026-03-14 08:31:31.583095912 +0000 UTC m=+226.831547549" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.610860 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz" podStartSLOduration=3.610818177 podStartE2EDuration="3.610818177s" podCreationTimestamp="2026-03-14 08:31:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:31.599936725 +0000 UTC m=+226.848388362" watchObservedRunningTime="2026-03-14 08:31:31.610818177 +0000 UTC m=+226.859269814" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.642619 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ingress/router-default-5444994796-8lqwb" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.646743 4886 patch_prober.go:28] interesting pod/downloads-7954f5f757-hznhw container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.646811 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-hznhw" podUID="feeab5ae-f3ec-4590-8625-00e98fb5064b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.647161 4886 patch_prober.go:28] interesting pod/downloads-7954f5f757-hznhw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.647185 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hznhw" podUID="feeab5ae-f3ec-4590-8625-00e98fb5064b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.653333 4886 patch_prober.go:28] interesting pod/router-default-5444994796-8lqwb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 14 08:31:31 crc kubenswrapper[4886]: [-]has-synced failed: reason withheld Mar 14 08:31:31 crc kubenswrapper[4886]: [+]process-running ok Mar 14 08:31:31 crc kubenswrapper[4886]: healthz check failed Mar 14 08:31:31 crc 
kubenswrapper[4886]: I0314 08:31:31.653406 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8lqwb" podUID="5fb33ac2-d4aa-49b0-9007-33af11834a96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.663449 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4f8zx"] Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.666147 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4f8zx" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.677129 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4f8zx"] Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.711973 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.716214 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.722160 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.722458 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.725040 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.814268 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzjpw\" (UniqueName: \"kubernetes.io/projected/28585f97-68cd-440f-acb0-0e5bd9117023-kube-api-access-kzjpw\") pod \"redhat-operators-4f8zx\" (UID: \"28585f97-68cd-440f-acb0-0e5bd9117023\") " pod="openshift-marketplace/redhat-operators-4f8zx" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.814353 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28585f97-68cd-440f-acb0-0e5bd9117023-utilities\") pod \"redhat-operators-4f8zx\" (UID: \"28585f97-68cd-440f-acb0-0e5bd9117023\") " pod="openshift-marketplace/redhat-operators-4f8zx" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.814373 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28585f97-68cd-440f-acb0-0e5bd9117023-catalog-content\") pod \"redhat-operators-4f8zx\" (UID: \"28585f97-68cd-440f-acb0-0e5bd9117023\") " pod="openshift-marketplace/redhat-operators-4f8zx" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.814449 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ebf6a823-6c53-4134-ad40-a3289e6b28f3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ebf6a823-6c53-4134-ad40-a3289e6b28f3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.814467 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ebf6a823-6c53-4134-ad40-a3289e6b28f3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ebf6a823-6c53-4134-ad40-a3289e6b28f3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.846565 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.915990 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ebf6a823-6c53-4134-ad40-a3289e6b28f3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ebf6a823-6c53-4134-ad40-a3289e6b28f3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.916040 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ebf6a823-6c53-4134-ad40-a3289e6b28f3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ebf6a823-6c53-4134-ad40-a3289e6b28f3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.916070 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzjpw\" (UniqueName: \"kubernetes.io/projected/28585f97-68cd-440f-acb0-0e5bd9117023-kube-api-access-kzjpw\") pod \"redhat-operators-4f8zx\" (UID: 
\"28585f97-68cd-440f-acb0-0e5bd9117023\") " pod="openshift-marketplace/redhat-operators-4f8zx" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.916110 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28585f97-68cd-440f-acb0-0e5bd9117023-utilities\") pod \"redhat-operators-4f8zx\" (UID: \"28585f97-68cd-440f-acb0-0e5bd9117023\") " pod="openshift-marketplace/redhat-operators-4f8zx" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.916144 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28585f97-68cd-440f-acb0-0e5bd9117023-catalog-content\") pod \"redhat-operators-4f8zx\" (UID: \"28585f97-68cd-440f-acb0-0e5bd9117023\") " pod="openshift-marketplace/redhat-operators-4f8zx" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.916965 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28585f97-68cd-440f-acb0-0e5bd9117023-catalog-content\") pod \"redhat-operators-4f8zx\" (UID: \"28585f97-68cd-440f-acb0-0e5bd9117023\") " pod="openshift-marketplace/redhat-operators-4f8zx" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.917059 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ebf6a823-6c53-4134-ad40-a3289e6b28f3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ebf6a823-6c53-4134-ad40-a3289e6b28f3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.917748 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28585f97-68cd-440f-acb0-0e5bd9117023-utilities\") pod \"redhat-operators-4f8zx\" (UID: \"28585f97-68cd-440f-acb0-0e5bd9117023\") " pod="openshift-marketplace/redhat-operators-4f8zx" Mar 
14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.945809 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ebf6a823-6c53-4134-ad40-a3289e6b28f3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ebf6a823-6c53-4134-ad40-a3289e6b28f3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 08:31:31 crc kubenswrapper[4886]: I0314 08:31:31.956681 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzjpw\" (UniqueName: \"kubernetes.io/projected/28585f97-68cd-440f-acb0-0e5bd9117023-kube-api-access-kzjpw\") pod \"redhat-operators-4f8zx\" (UID: \"28585f97-68cd-440f-acb0-0e5bd9117023\") " pod="openshift-marketplace/redhat-operators-4f8zx" Mar 14 08:31:32 crc kubenswrapper[4886]: I0314 08:31:32.022303 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4f8zx" Mar 14 08:31:32 crc kubenswrapper[4886]: I0314 08:31:32.036453 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 08:31:32 crc kubenswrapper[4886]: I0314 08:31:32.131322 4886 ???:1] "http: TLS handshake error from 192.168.126.11:47934: no serving certificate available for the kubelet" Mar 14 08:31:32 crc kubenswrapper[4886]: I0314 08:31:32.217536 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6gfbq"] Mar 14 08:31:32 crc kubenswrapper[4886]: W0314 08:31:32.296306 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c355048_396b_4f00_8ff6_1ffff1d9d62c.slice/crio-2e1fead50a7ebf7adff0fbed317a8fd55c96e153d6ee602449ce84b0372d4a3e WatchSource:0}: Error finding container 2e1fead50a7ebf7adff0fbed317a8fd55c96e153d6ee602449ce84b0372d4a3e: Status 404 returned error can't find the container with id 2e1fead50a7ebf7adff0fbed317a8fd55c96e153d6ee602449ce84b0372d4a3e Mar 14 08:31:32 crc kubenswrapper[4886]: I0314 08:31:32.562894 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4f8zx"] Mar 14 08:31:32 crc kubenswrapper[4886]: I0314 08:31:32.579585 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6gfbq" event={"ID":"7c355048-396b-4f00-8ff6-1ffff1d9d62c","Type":"ContainerStarted","Data":"2e1fead50a7ebf7adff0fbed317a8fd55c96e153d6ee602449ce84b0372d4a3e"} Mar 14 08:31:32 crc kubenswrapper[4886]: I0314 08:31:32.585050 4886 generic.go:334] "Generic (PLEG): container finished" podID="d7ad8d4f-d958-43b7-b84d-c8672642d21b" containerID="39570c129f1650fa07b216c5e683d1c44ac48e12801ca1f49ef7eb2a7fe8a733" exitCode=0 Mar 14 08:31:32 crc kubenswrapper[4886]: I0314 08:31:32.585357 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-j78sv" 
event={"ID":"d7ad8d4f-d958-43b7-b84d-c8672642d21b","Type":"ContainerDied","Data":"39570c129f1650fa07b216c5e683d1c44ac48e12801ca1f49ef7eb2a7fe8a733"} Mar 14 08:31:32 crc kubenswrapper[4886]: I0314 08:31:32.639166 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-8lqwb" Mar 14 08:31:32 crc kubenswrapper[4886]: I0314 08:31:32.644492 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-8lqwb" Mar 14 08:31:32 crc kubenswrapper[4886]: I0314 08:31:32.848325 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 14 08:31:33 crc kubenswrapper[4886]: I0314 08:31:33.201000 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 08:31:33 crc kubenswrapper[4886]: I0314 08:31:33.367705 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3e06c71-44e4-476f-9f6c-96737f410dbf-kube-api-access\") pod \"a3e06c71-44e4-476f-9f6c-96737f410dbf\" (UID: \"a3e06c71-44e4-476f-9f6c-96737f410dbf\") " Mar 14 08:31:33 crc kubenswrapper[4886]: I0314 08:31:33.367762 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a3e06c71-44e4-476f-9f6c-96737f410dbf-kubelet-dir\") pod \"a3e06c71-44e4-476f-9f6c-96737f410dbf\" (UID: \"a3e06c71-44e4-476f-9f6c-96737f410dbf\") " Mar 14 08:31:33 crc kubenswrapper[4886]: I0314 08:31:33.368220 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3e06c71-44e4-476f-9f6c-96737f410dbf-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a3e06c71-44e4-476f-9f6c-96737f410dbf" (UID: "a3e06c71-44e4-476f-9f6c-96737f410dbf"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:31:33 crc kubenswrapper[4886]: I0314 08:31:33.375504 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3e06c71-44e4-476f-9f6c-96737f410dbf-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a3e06c71-44e4-476f-9f6c-96737f410dbf" (UID: "a3e06c71-44e4-476f-9f6c-96737f410dbf"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:31:33 crc kubenswrapper[4886]: I0314 08:31:33.471008 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3e06c71-44e4-476f-9f6c-96737f410dbf-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 08:31:33 crc kubenswrapper[4886]: I0314 08:31:33.471049 4886 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a3e06c71-44e4-476f-9f6c-96737f410dbf-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 14 08:31:33 crc kubenswrapper[4886]: I0314 08:31:33.666241 4886 generic.go:334] "Generic (PLEG): container finished" podID="28585f97-68cd-440f-acb0-0e5bd9117023" containerID="f09af7e849067e0420a4dd0aebcf4d887430504fc831998ee947d02ccc4964db" exitCode=0 Mar 14 08:31:33 crc kubenswrapper[4886]: I0314 08:31:33.666363 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4f8zx" event={"ID":"28585f97-68cd-440f-acb0-0e5bd9117023","Type":"ContainerDied","Data":"f09af7e849067e0420a4dd0aebcf4d887430504fc831998ee947d02ccc4964db"} Mar 14 08:31:33 crc kubenswrapper[4886]: I0314 08:31:33.671731 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4f8zx" event={"ID":"28585f97-68cd-440f-acb0-0e5bd9117023","Type":"ContainerStarted","Data":"633863570eec337ae40f5bb4bb4c03c3d41e5715d6b0e6fecc96bbb91de1274e"} Mar 14 08:31:33 crc kubenswrapper[4886]: I0314 08:31:33.671807 4886 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ebf6a823-6c53-4134-ad40-a3289e6b28f3","Type":"ContainerStarted","Data":"f147908ec837849aaa6836c925fe78c120e40d8843633b2f3995dcbddfc12916"} Mar 14 08:31:33 crc kubenswrapper[4886]: I0314 08:31:33.676098 4886 generic.go:334] "Generic (PLEG): container finished" podID="7c355048-396b-4f00-8ff6-1ffff1d9d62c" containerID="cdf18872eed6ea8a85d4637daa0b53df7bd9e584c31e7752fb38917a73610bae" exitCode=0 Mar 14 08:31:33 crc kubenswrapper[4886]: I0314 08:31:33.676168 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6gfbq" event={"ID":"7c355048-396b-4f00-8ff6-1ffff1d9d62c","Type":"ContainerDied","Data":"cdf18872eed6ea8a85d4637daa0b53df7bd9e584c31e7752fb38917a73610bae"} Mar 14 08:31:33 crc kubenswrapper[4886]: I0314 08:31:33.684481 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 14 08:31:33 crc kubenswrapper[4886]: I0314 08:31:33.685053 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a3e06c71-44e4-476f-9f6c-96737f410dbf","Type":"ContainerDied","Data":"2f4e78259e3278f7c47722ca9b5164c86d0c019dae0f662b62adc3a7c1fb36f1"} Mar 14 08:31:33 crc kubenswrapper[4886]: I0314 08:31:33.685081 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f4e78259e3278f7c47722ca9b5164c86d0c019dae0f662b62adc3a7c1fb36f1" Mar 14 08:31:34 crc kubenswrapper[4886]: I0314 08:31:34.168469 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-j78sv" Mar 14 08:31:34 crc kubenswrapper[4886]: I0314 08:31:34.288441 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnbgs\" (UniqueName: \"kubernetes.io/projected/d7ad8d4f-d958-43b7-b84d-c8672642d21b-kube-api-access-lnbgs\") pod \"d7ad8d4f-d958-43b7-b84d-c8672642d21b\" (UID: \"d7ad8d4f-d958-43b7-b84d-c8672642d21b\") " Mar 14 08:31:34 crc kubenswrapper[4886]: I0314 08:31:34.288555 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7ad8d4f-d958-43b7-b84d-c8672642d21b-config-volume\") pod \"d7ad8d4f-d958-43b7-b84d-c8672642d21b\" (UID: \"d7ad8d4f-d958-43b7-b84d-c8672642d21b\") " Mar 14 08:31:34 crc kubenswrapper[4886]: I0314 08:31:34.288657 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7ad8d4f-d958-43b7-b84d-c8672642d21b-secret-volume\") pod \"d7ad8d4f-d958-43b7-b84d-c8672642d21b\" (UID: \"d7ad8d4f-d958-43b7-b84d-c8672642d21b\") " Mar 14 08:31:34 crc kubenswrapper[4886]: I0314 08:31:34.290678 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7ad8d4f-d958-43b7-b84d-c8672642d21b-config-volume" (OuterVolumeSpecName: "config-volume") pod "d7ad8d4f-d958-43b7-b84d-c8672642d21b" (UID: "d7ad8d4f-d958-43b7-b84d-c8672642d21b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:31:34 crc kubenswrapper[4886]: I0314 08:31:34.295484 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7ad8d4f-d958-43b7-b84d-c8672642d21b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d7ad8d4f-d958-43b7-b84d-c8672642d21b" (UID: "d7ad8d4f-d958-43b7-b84d-c8672642d21b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:31:34 crc kubenswrapper[4886]: I0314 08:31:34.296710 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7ad8d4f-d958-43b7-b84d-c8672642d21b-kube-api-access-lnbgs" (OuterVolumeSpecName: "kube-api-access-lnbgs") pod "d7ad8d4f-d958-43b7-b84d-c8672642d21b" (UID: "d7ad8d4f-d958-43b7-b84d-c8672642d21b"). InnerVolumeSpecName "kube-api-access-lnbgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:31:34 crc kubenswrapper[4886]: I0314 08:31:34.402255 4886 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7ad8d4f-d958-43b7-b84d-c8672642d21b-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 08:31:34 crc kubenswrapper[4886]: I0314 08:31:34.402327 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnbgs\" (UniqueName: \"kubernetes.io/projected/d7ad8d4f-d958-43b7-b84d-c8672642d21b-kube-api-access-lnbgs\") on node \"crc\" DevicePath \"\"" Mar 14 08:31:34 crc kubenswrapper[4886]: I0314 08:31:34.402341 4886 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7ad8d4f-d958-43b7-b84d-c8672642d21b-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 08:31:34 crc kubenswrapper[4886]: I0314 08:31:34.716134 4886 ???:1] "http: TLS handshake error from 192.168.126.11:34664: no serving certificate available for the kubelet" Mar 14 08:31:34 crc kubenswrapper[4886]: I0314 08:31:34.744499 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-j78sv" event={"ID":"d7ad8d4f-d958-43b7-b84d-c8672642d21b","Type":"ContainerDied","Data":"c2d7e6b22f698da72d700619883b349767df8c34f2fb3dac0b8cc69d71ee2c99"} Mar 14 08:31:34 crc kubenswrapper[4886]: I0314 08:31:34.744560 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557950-j78sv" Mar 14 08:31:34 crc kubenswrapper[4886]: I0314 08:31:34.744564 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2d7e6b22f698da72d700619883b349767df8c34f2fb3dac0b8cc69d71ee2c99" Mar 14 08:31:34 crc kubenswrapper[4886]: I0314 08:31:34.755525 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ebf6a823-6c53-4134-ad40-a3289e6b28f3","Type":"ContainerStarted","Data":"de5f6fce326d2821ca16f99d8542ed1fc5b4b6778e1003fcce0e168e753eceb9"} Mar 14 08:31:34 crc kubenswrapper[4886]: I0314 08:31:34.770599 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.7705745459999997 podStartE2EDuration="3.770574546s" podCreationTimestamp="2026-03-14 08:31:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:31:34.766709615 +0000 UTC m=+230.015161252" watchObservedRunningTime="2026-03-14 08:31:34.770574546 +0000 UTC m=+230.019026183" Mar 14 08:31:35 crc kubenswrapper[4886]: I0314 08:31:35.778582 4886 generic.go:334] "Generic (PLEG): container finished" podID="ebf6a823-6c53-4134-ad40-a3289e6b28f3" containerID="de5f6fce326d2821ca16f99d8542ed1fc5b4b6778e1003fcce0e168e753eceb9" exitCode=0 Mar 14 08:31:35 crc kubenswrapper[4886]: I0314 08:31:35.778698 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ebf6a823-6c53-4134-ad40-a3289e6b28f3","Type":"ContainerDied","Data":"de5f6fce326d2821ca16f99d8542ed1fc5b4b6778e1003fcce0e168e753eceb9"} Mar 14 08:31:37 crc kubenswrapper[4886]: I0314 08:31:37.067695 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 08:31:37 crc kubenswrapper[4886]: I0314 08:31:37.194819 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ebf6a823-6c53-4134-ad40-a3289e6b28f3-kubelet-dir\") pod \"ebf6a823-6c53-4134-ad40-a3289e6b28f3\" (UID: \"ebf6a823-6c53-4134-ad40-a3289e6b28f3\") " Mar 14 08:31:37 crc kubenswrapper[4886]: I0314 08:31:37.194929 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ebf6a823-6c53-4134-ad40-a3289e6b28f3-kube-api-access\") pod \"ebf6a823-6c53-4134-ad40-a3289e6b28f3\" (UID: \"ebf6a823-6c53-4134-ad40-a3289e6b28f3\") " Mar 14 08:31:37 crc kubenswrapper[4886]: I0314 08:31:37.197270 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebf6a823-6c53-4134-ad40-a3289e6b28f3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ebf6a823-6c53-4134-ad40-a3289e6b28f3" (UID: "ebf6a823-6c53-4134-ad40-a3289e6b28f3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:31:37 crc kubenswrapper[4886]: I0314 08:31:37.221749 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebf6a823-6c53-4134-ad40-a3289e6b28f3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ebf6a823-6c53-4134-ad40-a3289e6b28f3" (UID: "ebf6a823-6c53-4134-ad40-a3289e6b28f3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:31:37 crc kubenswrapper[4886]: I0314 08:31:37.297065 4886 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ebf6a823-6c53-4134-ad40-a3289e6b28f3-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 14 08:31:37 crc kubenswrapper[4886]: I0314 08:31:37.297140 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ebf6a823-6c53-4134-ad40-a3289e6b28f3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 08:31:37 crc kubenswrapper[4886]: I0314 08:31:37.406727 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-c9qzb" Mar 14 08:31:37 crc kubenswrapper[4886]: I0314 08:31:37.799342 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ebf6a823-6c53-4134-ad40-a3289e6b28f3","Type":"ContainerDied","Data":"f147908ec837849aaa6836c925fe78c120e40d8843633b2f3995dcbddfc12916"} Mar 14 08:31:37 crc kubenswrapper[4886]: I0314 08:31:37.799421 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f147908ec837849aaa6836c925fe78c120e40d8843633b2f3995dcbddfc12916" Mar 14 08:31:37 crc kubenswrapper[4886]: I0314 08:31:37.799419 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 14 08:31:41 crc kubenswrapper[4886]: I0314 08:31:41.289776 4886 patch_prober.go:28] interesting pod/console-f9d7485db-wmcc2 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Mar 14 08:31:41 crc kubenswrapper[4886]: I0314 08:31:41.290220 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-wmcc2" podUID="a312fb44-823b-44ec-8312-0d83b990e9cd" containerName="console" probeResult="failure" output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" Mar 14 08:31:41 crc kubenswrapper[4886]: I0314 08:31:41.645150 4886 patch_prober.go:28] interesting pod/downloads-7954f5f757-hznhw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Mar 14 08:31:41 crc kubenswrapper[4886]: I0314 08:31:41.645204 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hznhw" podUID="feeab5ae-f3ec-4590-8625-00e98fb5064b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Mar 14 08:31:41 crc kubenswrapper[4886]: I0314 08:31:41.645217 4886 patch_prober.go:28] interesting pod/downloads-7954f5f757-hznhw container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Mar 14 08:31:41 crc kubenswrapper[4886]: I0314 08:31:41.645270 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-hznhw" 
podUID="feeab5ae-f3ec-4590-8625-00e98fb5064b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Mar 14 08:31:46 crc kubenswrapper[4886]: I0314 08:31:46.984182 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-metrics-certs\") pod \"network-metrics-daemon-hq6j4\" (UID: \"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\") " pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:31:46 crc kubenswrapper[4886]: I0314 08:31:46.986017 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 14 08:31:47 crc kubenswrapper[4886]: I0314 08:31:47.001321 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/842ea68a-b5ee-4b60-8e98-26e2ff72ae3b-metrics-certs\") pod \"network-metrics-daemon-hq6j4\" (UID: \"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b\") " pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:31:47 crc kubenswrapper[4886]: I0314 08:31:47.198792 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 14 08:31:47 crc kubenswrapper[4886]: I0314 08:31:47.209281 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hq6j4" Mar 14 08:31:47 crc kubenswrapper[4886]: I0314 08:31:47.215211 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6fc8bf5689-szgl2"] Mar 14 08:31:47 crc kubenswrapper[4886]: I0314 08:31:47.215499 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6fc8bf5689-szgl2" podUID="bd789bc6-70ed-439a-a3a6-83984147f97b" containerName="controller-manager" containerID="cri-o://d8960ea2c1991ffdcc473f6be2459de8ae762563424a6cbea3338eec69eb202e" gracePeriod=30 Mar 14 08:31:47 crc kubenswrapper[4886]: I0314 08:31:47.221266 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz"] Mar 14 08:31:47 crc kubenswrapper[4886]: I0314 08:31:47.221729 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz" podUID="845227fa-82cd-4957-92a9-164d2f2bba1d" containerName="route-controller-manager" containerID="cri-o://c8349ddd9de63e74f232fc9a4ada283cff013e2f085119ee7c6eb129950a3aac" gracePeriod=30 Mar 14 08:31:48 crc kubenswrapper[4886]: I0314 08:31:48.926033 4886 generic.go:334] "Generic (PLEG): container finished" podID="845227fa-82cd-4957-92a9-164d2f2bba1d" containerID="c8349ddd9de63e74f232fc9a4ada283cff013e2f085119ee7c6eb129950a3aac" exitCode=0 Mar 14 08:31:48 crc kubenswrapper[4886]: I0314 08:31:48.926173 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz" event={"ID":"845227fa-82cd-4957-92a9-164d2f2bba1d","Type":"ContainerDied","Data":"c8349ddd9de63e74f232fc9a4ada283cff013e2f085119ee7c6eb129950a3aac"} Mar 14 08:31:48 crc kubenswrapper[4886]: I0314 08:31:48.928734 4886 generic.go:334] "Generic (PLEG): container finished" 
podID="bd789bc6-70ed-439a-a3a6-83984147f97b" containerID="d8960ea2c1991ffdcc473f6be2459de8ae762563424a6cbea3338eec69eb202e" exitCode=0 Mar 14 08:31:48 crc kubenswrapper[4886]: I0314 08:31:48.928780 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fc8bf5689-szgl2" event={"ID":"bd789bc6-70ed-439a-a3a6-83984147f97b","Type":"ContainerDied","Data":"d8960ea2c1991ffdcc473f6be2459de8ae762563424a6cbea3338eec69eb202e"} Mar 14 08:31:50 crc kubenswrapper[4886]: I0314 08:31:50.070858 4886 patch_prober.go:28] interesting pod/controller-manager-6fc8bf5689-szgl2 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" start-of-body= Mar 14 08:31:50 crc kubenswrapper[4886]: I0314 08:31:50.070919 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6fc8bf5689-szgl2" podUID="bd789bc6-70ed-439a-a3a6-83984147f97b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" Mar 14 08:31:50 crc kubenswrapper[4886]: I0314 08:31:50.070959 4886 patch_prober.go:28] interesting pod/route-controller-manager-5756dc695d-28vwz container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Mar 14 08:31:50 crc kubenswrapper[4886]: I0314 08:31:50.071013 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz" podUID="845227fa-82cd-4957-92a9-164d2f2bba1d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 
10.217.0.51:8443: connect: connection refused" Mar 14 08:31:50 crc kubenswrapper[4886]: I0314 08:31:50.103295 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:31:51 crc kubenswrapper[4886]: I0314 08:31:51.295292 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-wmcc2" Mar 14 08:31:51 crc kubenswrapper[4886]: I0314 08:31:51.301639 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-wmcc2" Mar 14 08:31:51 crc kubenswrapper[4886]: I0314 08:31:51.650258 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-hznhw" Mar 14 08:31:55 crc kubenswrapper[4886]: I0314 08:31:55.229992 4886 ???:1] "http: TLS handshake error from 192.168.126.11:51052: no serving certificate available for the kubelet" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.066787 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.067373 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:31:56 crc kubenswrapper[4886]: E0314 08:31:56.251537 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/openshift4/ose-cli:latest" Mar 14 08:31:56 crc kubenswrapper[4886]: E0314 08:31:56.252274 4886 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 14 08:31:56 crc kubenswrapper[4886]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 14 08:31:56 crc kubenswrapper[4886]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wfztb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29557950-fd7np_openshift-infra(266a4a60-8fb5-4685-b4ac-621f93829611): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 14 08:31:56 crc kubenswrapper[4886]: > logger="UnhandledError" Mar 14 08:31:56 crc kubenswrapper[4886]: E0314 08:31:56.253471 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29557950-fd7np" podUID="266a4a60-8fb5-4685-b4ac-621f93829611" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.593608 4886 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.617204 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5"] Mar 14 08:31:56 crc kubenswrapper[4886]: E0314 08:31:56.617405 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3e06c71-44e4-476f-9f6c-96737f410dbf" containerName="pruner" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.617418 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3e06c71-44e4-476f-9f6c-96737f410dbf" containerName="pruner" Mar 14 08:31:56 crc kubenswrapper[4886]: E0314 08:31:56.617427 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf6a823-6c53-4134-ad40-a3289e6b28f3" containerName="pruner" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.617435 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf6a823-6c53-4134-ad40-a3289e6b28f3" containerName="pruner" Mar 14 08:31:56 crc kubenswrapper[4886]: E0314 08:31:56.617512 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="845227fa-82cd-4957-92a9-164d2f2bba1d" containerName="route-controller-manager" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.617519 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="845227fa-82cd-4957-92a9-164d2f2bba1d" containerName="route-controller-manager" Mar 14 08:31:56 crc kubenswrapper[4886]: E0314 08:31:56.617527 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7ad8d4f-d958-43b7-b84d-c8672642d21b" containerName="collect-profiles" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.617534 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7ad8d4f-d958-43b7-b84d-c8672642d21b" containerName="collect-profiles" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.617635 4886 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d7ad8d4f-d958-43b7-b84d-c8672642d21b" containerName="collect-profiles" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.617648 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebf6a823-6c53-4134-ad40-a3289e6b28f3" containerName="pruner" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.617658 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3e06c71-44e4-476f-9f6c-96737f410dbf" containerName="pruner" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.617668 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="845227fa-82cd-4957-92a9-164d2f2bba1d" containerName="route-controller-manager" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.618052 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.642439 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5"] Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.672294 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhg49\" (UniqueName: \"kubernetes.io/projected/9d8fa830-c356-44bc-96a6-dc8bc91beb83-kube-api-access-vhg49\") pod \"route-controller-manager-75f7dfc654-mkzk5\" (UID: \"9d8fa830-c356-44bc-96a6-dc8bc91beb83\") " pod="openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.672718 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d8fa830-c356-44bc-96a6-dc8bc91beb83-client-ca\") pod \"route-controller-manager-75f7dfc654-mkzk5\" (UID: \"9d8fa830-c356-44bc-96a6-dc8bc91beb83\") " 
pod="openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.672765 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d8fa830-c356-44bc-96a6-dc8bc91beb83-serving-cert\") pod \"route-controller-manager-75f7dfc654-mkzk5\" (UID: \"9d8fa830-c356-44bc-96a6-dc8bc91beb83\") " pod="openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.672794 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d8fa830-c356-44bc-96a6-dc8bc91beb83-config\") pod \"route-controller-manager-75f7dfc654-mkzk5\" (UID: \"9d8fa830-c356-44bc-96a6-dc8bc91beb83\") " pod="openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.773653 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/845227fa-82cd-4957-92a9-164d2f2bba1d-client-ca\") pod \"845227fa-82cd-4957-92a9-164d2f2bba1d\" (UID: \"845227fa-82cd-4957-92a9-164d2f2bba1d\") " Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.773753 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjs5f\" (UniqueName: \"kubernetes.io/projected/845227fa-82cd-4957-92a9-164d2f2bba1d-kube-api-access-wjs5f\") pod \"845227fa-82cd-4957-92a9-164d2f2bba1d\" (UID: \"845227fa-82cd-4957-92a9-164d2f2bba1d\") " Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.773793 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/845227fa-82cd-4957-92a9-164d2f2bba1d-config\") pod \"845227fa-82cd-4957-92a9-164d2f2bba1d\" (UID: 
\"845227fa-82cd-4957-92a9-164d2f2bba1d\") " Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.773828 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/845227fa-82cd-4957-92a9-164d2f2bba1d-serving-cert\") pod \"845227fa-82cd-4957-92a9-164d2f2bba1d\" (UID: \"845227fa-82cd-4957-92a9-164d2f2bba1d\") " Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.774002 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d8fa830-c356-44bc-96a6-dc8bc91beb83-config\") pod \"route-controller-manager-75f7dfc654-mkzk5\" (UID: \"9d8fa830-c356-44bc-96a6-dc8bc91beb83\") " pod="openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.774062 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhg49\" (UniqueName: \"kubernetes.io/projected/9d8fa830-c356-44bc-96a6-dc8bc91beb83-kube-api-access-vhg49\") pod \"route-controller-manager-75f7dfc654-mkzk5\" (UID: \"9d8fa830-c356-44bc-96a6-dc8bc91beb83\") " pod="openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.774104 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d8fa830-c356-44bc-96a6-dc8bc91beb83-client-ca\") pod \"route-controller-manager-75f7dfc654-mkzk5\" (UID: \"9d8fa830-c356-44bc-96a6-dc8bc91beb83\") " pod="openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.774154 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d8fa830-c356-44bc-96a6-dc8bc91beb83-serving-cert\") pod \"route-controller-manager-75f7dfc654-mkzk5\" 
(UID: \"9d8fa830-c356-44bc-96a6-dc8bc91beb83\") " pod="openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.775869 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d8fa830-c356-44bc-96a6-dc8bc91beb83-client-ca\") pod \"route-controller-manager-75f7dfc654-mkzk5\" (UID: \"9d8fa830-c356-44bc-96a6-dc8bc91beb83\") " pod="openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.775995 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d8fa830-c356-44bc-96a6-dc8bc91beb83-config\") pod \"route-controller-manager-75f7dfc654-mkzk5\" (UID: \"9d8fa830-c356-44bc-96a6-dc8bc91beb83\") " pod="openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.776349 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/845227fa-82cd-4957-92a9-164d2f2bba1d-client-ca" (OuterVolumeSpecName: "client-ca") pod "845227fa-82cd-4957-92a9-164d2f2bba1d" (UID: "845227fa-82cd-4957-92a9-164d2f2bba1d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.776476 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/845227fa-82cd-4957-92a9-164d2f2bba1d-config" (OuterVolumeSpecName: "config") pod "845227fa-82cd-4957-92a9-164d2f2bba1d" (UID: "845227fa-82cd-4957-92a9-164d2f2bba1d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.781472 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d8fa830-c356-44bc-96a6-dc8bc91beb83-serving-cert\") pod \"route-controller-manager-75f7dfc654-mkzk5\" (UID: \"9d8fa830-c356-44bc-96a6-dc8bc91beb83\") " pod="openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.781595 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/845227fa-82cd-4957-92a9-164d2f2bba1d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "845227fa-82cd-4957-92a9-164d2f2bba1d" (UID: "845227fa-82cd-4957-92a9-164d2f2bba1d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.790666 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/845227fa-82cd-4957-92a9-164d2f2bba1d-kube-api-access-wjs5f" (OuterVolumeSpecName: "kube-api-access-wjs5f") pod "845227fa-82cd-4957-92a9-164d2f2bba1d" (UID: "845227fa-82cd-4957-92a9-164d2f2bba1d"). InnerVolumeSpecName "kube-api-access-wjs5f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.792839 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhg49\" (UniqueName: \"kubernetes.io/projected/9d8fa830-c356-44bc-96a6-dc8bc91beb83-kube-api-access-vhg49\") pod \"route-controller-manager-75f7dfc654-mkzk5\" (UID: \"9d8fa830-c356-44bc-96a6-dc8bc91beb83\") " pod="openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.876213 4886 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/845227fa-82cd-4957-92a9-164d2f2bba1d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.876254 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjs5f\" (UniqueName: \"kubernetes.io/projected/845227fa-82cd-4957-92a9-164d2f2bba1d-kube-api-access-wjs5f\") on node \"crc\" DevicePath \"\"" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.876265 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/845227fa-82cd-4957-92a9-164d2f2bba1d-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.876274 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/845227fa-82cd-4957-92a9-164d2f2bba1d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.945114 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.977772 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz" Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.977774 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz" event={"ID":"845227fa-82cd-4957-92a9-164d2f2bba1d","Type":"ContainerDied","Data":"36e02ff849abbf46660ff74eff69d58456cf87d67447007be53b2877e5292e73"} Mar 14 08:31:56 crc kubenswrapper[4886]: I0314 08:31:56.977874 4886 scope.go:117] "RemoveContainer" containerID="c8349ddd9de63e74f232fc9a4ada283cff013e2f085119ee7c6eb129950a3aac" Mar 14 08:31:56 crc kubenswrapper[4886]: E0314 08:31:56.979192 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29557950-fd7np" podUID="266a4a60-8fb5-4685-b4ac-621f93829611" Mar 14 08:31:57 crc kubenswrapper[4886]: I0314 08:31:57.010909 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz"] Mar 14 08:31:57 crc kubenswrapper[4886]: I0314 08:31:57.013716 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5756dc695d-28vwz"] Mar 14 08:31:57 crc kubenswrapper[4886]: I0314 08:31:57.429324 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="845227fa-82cd-4957-92a9-164d2f2bba1d" path="/var/lib/kubelet/pods/845227fa-82cd-4957-92a9-164d2f2bba1d/volumes" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.552190 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6fc8bf5689-szgl2" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.604331 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-585cf757df-tlzns"] Mar 14 08:31:59 crc kubenswrapper[4886]: E0314 08:31:59.604658 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd789bc6-70ed-439a-a3a6-83984147f97b" containerName="controller-manager" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.604673 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd789bc6-70ed-439a-a3a6-83984147f97b" containerName="controller-manager" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.604794 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd789bc6-70ed-439a-a3a6-83984147f97b" containerName="controller-manager" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.605275 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-585cf757df-tlzns" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.612617 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd789bc6-70ed-439a-a3a6-83984147f97b-config\") pod \"bd789bc6-70ed-439a-a3a6-83984147f97b\" (UID: \"bd789bc6-70ed-439a-a3a6-83984147f97b\") " Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.612680 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd789bc6-70ed-439a-a3a6-83984147f97b-proxy-ca-bundles\") pod \"bd789bc6-70ed-439a-a3a6-83984147f97b\" (UID: \"bd789bc6-70ed-439a-a3a6-83984147f97b\") " Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.612706 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd789bc6-70ed-439a-a3a6-83984147f97b-client-ca\") pod \"bd789bc6-70ed-439a-a3a6-83984147f97b\" (UID: \"bd789bc6-70ed-439a-a3a6-83984147f97b\") " Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.612735 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6fvl\" (UniqueName: \"kubernetes.io/projected/bd789bc6-70ed-439a-a3a6-83984147f97b-kube-api-access-b6fvl\") pod \"bd789bc6-70ed-439a-a3a6-83984147f97b\" (UID: \"bd789bc6-70ed-439a-a3a6-83984147f97b\") " Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.612781 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd789bc6-70ed-439a-a3a6-83984147f97b-serving-cert\") pod \"bd789bc6-70ed-439a-a3a6-83984147f97b\" (UID: \"bd789bc6-70ed-439a-a3a6-83984147f97b\") " Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.612892 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-hn2k7\" (UniqueName: \"kubernetes.io/projected/edf87ed3-6709-4269-bed7-a5f916f46f5c-kube-api-access-hn2k7\") pod \"controller-manager-585cf757df-tlzns\" (UID: \"edf87ed3-6709-4269-bed7-a5f916f46f5c\") " pod="openshift-controller-manager/controller-manager-585cf757df-tlzns" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.612936 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edf87ed3-6709-4269-bed7-a5f916f46f5c-serving-cert\") pod \"controller-manager-585cf757df-tlzns\" (UID: \"edf87ed3-6709-4269-bed7-a5f916f46f5c\") " pod="openshift-controller-manager/controller-manager-585cf757df-tlzns" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.612956 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edf87ed3-6709-4269-bed7-a5f916f46f5c-config\") pod \"controller-manager-585cf757df-tlzns\" (UID: \"edf87ed3-6709-4269-bed7-a5f916f46f5c\") " pod="openshift-controller-manager/controller-manager-585cf757df-tlzns" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.612978 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edf87ed3-6709-4269-bed7-a5f916f46f5c-client-ca\") pod \"controller-manager-585cf757df-tlzns\" (UID: \"edf87ed3-6709-4269-bed7-a5f916f46f5c\") " pod="openshift-controller-manager/controller-manager-585cf757df-tlzns" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.613019 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/edf87ed3-6709-4269-bed7-a5f916f46f5c-proxy-ca-bundles\") pod \"controller-manager-585cf757df-tlzns\" (UID: \"edf87ed3-6709-4269-bed7-a5f916f46f5c\") " 
pod="openshift-controller-manager/controller-manager-585cf757df-tlzns" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.614412 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd789bc6-70ed-439a-a3a6-83984147f97b-config" (OuterVolumeSpecName: "config") pod "bd789bc6-70ed-439a-a3a6-83984147f97b" (UID: "bd789bc6-70ed-439a-a3a6-83984147f97b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.616140 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-585cf757df-tlzns"] Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.616464 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd789bc6-70ed-439a-a3a6-83984147f97b-client-ca" (OuterVolumeSpecName: "client-ca") pod "bd789bc6-70ed-439a-a3a6-83984147f97b" (UID: "bd789bc6-70ed-439a-a3a6-83984147f97b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.616645 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd789bc6-70ed-439a-a3a6-83984147f97b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "bd789bc6-70ed-439a-a3a6-83984147f97b" (UID: "bd789bc6-70ed-439a-a3a6-83984147f97b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.631861 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd789bc6-70ed-439a-a3a6-83984147f97b-kube-api-access-b6fvl" (OuterVolumeSpecName: "kube-api-access-b6fvl") pod "bd789bc6-70ed-439a-a3a6-83984147f97b" (UID: "bd789bc6-70ed-439a-a3a6-83984147f97b"). InnerVolumeSpecName "kube-api-access-b6fvl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.631988 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd789bc6-70ed-439a-a3a6-83984147f97b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bd789bc6-70ed-439a-a3a6-83984147f97b" (UID: "bd789bc6-70ed-439a-a3a6-83984147f97b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.714347 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn2k7\" (UniqueName: \"kubernetes.io/projected/edf87ed3-6709-4269-bed7-a5f916f46f5c-kube-api-access-hn2k7\") pod \"controller-manager-585cf757df-tlzns\" (UID: \"edf87ed3-6709-4269-bed7-a5f916f46f5c\") " pod="openshift-controller-manager/controller-manager-585cf757df-tlzns" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.714507 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edf87ed3-6709-4269-bed7-a5f916f46f5c-serving-cert\") pod \"controller-manager-585cf757df-tlzns\" (UID: \"edf87ed3-6709-4269-bed7-a5f916f46f5c\") " pod="openshift-controller-manager/controller-manager-585cf757df-tlzns" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.714539 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edf87ed3-6709-4269-bed7-a5f916f46f5c-config\") pod \"controller-manager-585cf757df-tlzns\" (UID: \"edf87ed3-6709-4269-bed7-a5f916f46f5c\") " pod="openshift-controller-manager/controller-manager-585cf757df-tlzns" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.714572 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edf87ed3-6709-4269-bed7-a5f916f46f5c-client-ca\") pod 
\"controller-manager-585cf757df-tlzns\" (UID: \"edf87ed3-6709-4269-bed7-a5f916f46f5c\") " pod="openshift-controller-manager/controller-manager-585cf757df-tlzns" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.714628 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/edf87ed3-6709-4269-bed7-a5f916f46f5c-proxy-ca-bundles\") pod \"controller-manager-585cf757df-tlzns\" (UID: \"edf87ed3-6709-4269-bed7-a5f916f46f5c\") " pod="openshift-controller-manager/controller-manager-585cf757df-tlzns" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.714710 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6fvl\" (UniqueName: \"kubernetes.io/projected/bd789bc6-70ed-439a-a3a6-83984147f97b-kube-api-access-b6fvl\") on node \"crc\" DevicePath \"\"" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.714728 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd789bc6-70ed-439a-a3a6-83984147f97b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.714743 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd789bc6-70ed-439a-a3a6-83984147f97b-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.714756 4886 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd789bc6-70ed-439a-a3a6-83984147f97b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.714769 4886 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd789bc6-70ed-439a-a3a6-83984147f97b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.716170 4886 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edf87ed3-6709-4269-bed7-a5f916f46f5c-config\") pod \"controller-manager-585cf757df-tlzns\" (UID: \"edf87ed3-6709-4269-bed7-a5f916f46f5c\") " pod="openshift-controller-manager/controller-manager-585cf757df-tlzns" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.717168 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edf87ed3-6709-4269-bed7-a5f916f46f5c-client-ca\") pod \"controller-manager-585cf757df-tlzns\" (UID: \"edf87ed3-6709-4269-bed7-a5f916f46f5c\") " pod="openshift-controller-manager/controller-manager-585cf757df-tlzns" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.717296 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/edf87ed3-6709-4269-bed7-a5f916f46f5c-proxy-ca-bundles\") pod \"controller-manager-585cf757df-tlzns\" (UID: \"edf87ed3-6709-4269-bed7-a5f916f46f5c\") " pod="openshift-controller-manager/controller-manager-585cf757df-tlzns" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.719272 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edf87ed3-6709-4269-bed7-a5f916f46f5c-serving-cert\") pod \"controller-manager-585cf757df-tlzns\" (UID: \"edf87ed3-6709-4269-bed7-a5f916f46f5c\") " pod="openshift-controller-manager/controller-manager-585cf757df-tlzns" Mar 14 08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.730059 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn2k7\" (UniqueName: \"kubernetes.io/projected/edf87ed3-6709-4269-bed7-a5f916f46f5c-kube-api-access-hn2k7\") pod \"controller-manager-585cf757df-tlzns\" (UID: \"edf87ed3-6709-4269-bed7-a5f916f46f5c\") " pod="openshift-controller-manager/controller-manager-585cf757df-tlzns" Mar 14 
08:31:59 crc kubenswrapper[4886]: I0314 08:31:59.934510 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-585cf757df-tlzns" Mar 14 08:32:00 crc kubenswrapper[4886]: I0314 08:32:00.004461 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fc8bf5689-szgl2" event={"ID":"bd789bc6-70ed-439a-a3a6-83984147f97b","Type":"ContainerDied","Data":"1f43f1a3c5fc8606ab5d1674447b39ec66c21b3571432d37bd7531ea3dd78b68"} Mar 14 08:32:00 crc kubenswrapper[4886]: I0314 08:32:00.004616 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6fc8bf5689-szgl2" Mar 14 08:32:00 crc kubenswrapper[4886]: I0314 08:32:00.042330 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6fc8bf5689-szgl2"] Mar 14 08:32:00 crc kubenswrapper[4886]: I0314 08:32:00.045821 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6fc8bf5689-szgl2"] Mar 14 08:32:00 crc kubenswrapper[4886]: I0314 08:32:00.136610 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557952-fzb8n"] Mar 14 08:32:00 crc kubenswrapper[4886]: I0314 08:32:00.137948 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557952-fzb8n" Mar 14 08:32:00 crc kubenswrapper[4886]: I0314 08:32:00.140122 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 08:32:00 crc kubenswrapper[4886]: I0314 08:32:00.141644 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557952-fzb8n"] Mar 14 08:32:00 crc kubenswrapper[4886]: I0314 08:32:00.324563 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z58jp\" (UniqueName: \"kubernetes.io/projected/e2d7e1cf-cd9a-4bf9-8f6b-1fccc137d1f9-kube-api-access-z58jp\") pod \"auto-csr-approver-29557952-fzb8n\" (UID: \"e2d7e1cf-cd9a-4bf9-8f6b-1fccc137d1f9\") " pod="openshift-infra/auto-csr-approver-29557952-fzb8n" Mar 14 08:32:00 crc kubenswrapper[4886]: I0314 08:32:00.425918 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z58jp\" (UniqueName: \"kubernetes.io/projected/e2d7e1cf-cd9a-4bf9-8f6b-1fccc137d1f9-kube-api-access-z58jp\") pod \"auto-csr-approver-29557952-fzb8n\" (UID: \"e2d7e1cf-cd9a-4bf9-8f6b-1fccc137d1f9\") " pod="openshift-infra/auto-csr-approver-29557952-fzb8n" Mar 14 08:32:00 crc kubenswrapper[4886]: I0314 08:32:00.444357 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z58jp\" (UniqueName: \"kubernetes.io/projected/e2d7e1cf-cd9a-4bf9-8f6b-1fccc137d1f9-kube-api-access-z58jp\") pod \"auto-csr-approver-29557952-fzb8n\" (UID: \"e2d7e1cf-cd9a-4bf9-8f6b-1fccc137d1f9\") " pod="openshift-infra/auto-csr-approver-29557952-fzb8n" Mar 14 08:32:00 crc kubenswrapper[4886]: I0314 08:32:00.460636 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557952-fzb8n" Mar 14 08:32:01 crc kubenswrapper[4886]: I0314 08:32:01.432632 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd789bc6-70ed-439a-a3a6-83984147f97b" path="/var/lib/kubelet/pods/bd789bc6-70ed-439a-a3a6-83984147f97b/volumes" Mar 14 08:32:01 crc kubenswrapper[4886]: I0314 08:32:01.699611 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 14 08:32:01 crc kubenswrapper[4886]: I0314 08:32:01.700478 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 08:32:01 crc kubenswrapper[4886]: I0314 08:32:01.702688 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 14 08:32:01 crc kubenswrapper[4886]: I0314 08:32:01.702841 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 14 08:32:01 crc kubenswrapper[4886]: I0314 08:32:01.714718 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 14 08:32:01 crc kubenswrapper[4886]: I0314 08:32:01.743789 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/551df435-0815-4f88-966b-3e60ef48aaa4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"551df435-0815-4f88-966b-3e60ef48aaa4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 08:32:01 crc kubenswrapper[4886]: I0314 08:32:01.744105 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/551df435-0815-4f88-966b-3e60ef48aaa4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"551df435-0815-4f88-966b-3e60ef48aaa4\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 08:32:01 crc kubenswrapper[4886]: I0314 08:32:01.844855 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/551df435-0815-4f88-966b-3e60ef48aaa4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"551df435-0815-4f88-966b-3e60ef48aaa4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 08:32:01 crc kubenswrapper[4886]: I0314 08:32:01.844899 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/551df435-0815-4f88-966b-3e60ef48aaa4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"551df435-0815-4f88-966b-3e60ef48aaa4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 08:32:01 crc kubenswrapper[4886]: I0314 08:32:01.844972 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/551df435-0815-4f88-966b-3e60ef48aaa4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"551df435-0815-4f88-966b-3e60ef48aaa4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 08:32:01 crc kubenswrapper[4886]: I0314 08:32:01.877589 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/551df435-0815-4f88-966b-3e60ef48aaa4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"551df435-0815-4f88-966b-3e60ef48aaa4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 08:32:02 crc kubenswrapper[4886]: I0314 08:32:02.030053 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 08:32:02 crc kubenswrapper[4886]: I0314 08:32:02.033544 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhw5h" Mar 14 08:32:05 crc kubenswrapper[4886]: I0314 08:32:05.903581 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 14 08:32:06 crc kubenswrapper[4886]: I0314 08:32:06.104098 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 14 08:32:06 crc kubenswrapper[4886]: I0314 08:32:06.104735 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 14 08:32:06 crc kubenswrapper[4886]: I0314 08:32:06.127376 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 14 08:32:06 crc kubenswrapper[4886]: I0314 08:32:06.230185 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c85c6296-9423-4f0a-b5cd-d265dd437a77-var-lock\") pod \"installer-9-crc\" (UID: \"c85c6296-9423-4f0a-b5cd-d265dd437a77\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 08:32:06 crc kubenswrapper[4886]: I0314 08:32:06.230334 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c85c6296-9423-4f0a-b5cd-d265dd437a77-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c85c6296-9423-4f0a-b5cd-d265dd437a77\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 08:32:06 crc kubenswrapper[4886]: I0314 08:32:06.230424 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/c85c6296-9423-4f0a-b5cd-d265dd437a77-kube-api-access\") pod \"installer-9-crc\" (UID: \"c85c6296-9423-4f0a-b5cd-d265dd437a77\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 08:32:06 crc kubenswrapper[4886]: I0314 08:32:06.332060 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c85c6296-9423-4f0a-b5cd-d265dd437a77-var-lock\") pod \"installer-9-crc\" (UID: \"c85c6296-9423-4f0a-b5cd-d265dd437a77\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 08:32:06 crc kubenswrapper[4886]: I0314 08:32:06.332315 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c85c6296-9423-4f0a-b5cd-d265dd437a77-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c85c6296-9423-4f0a-b5cd-d265dd437a77\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 08:32:06 crc kubenswrapper[4886]: I0314 08:32:06.332388 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c85c6296-9423-4f0a-b5cd-d265dd437a77-kube-api-access\") pod \"installer-9-crc\" (UID: \"c85c6296-9423-4f0a-b5cd-d265dd437a77\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 08:32:06 crc kubenswrapper[4886]: I0314 08:32:06.332574 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c85c6296-9423-4f0a-b5cd-d265dd437a77-var-lock\") pod \"installer-9-crc\" (UID: \"c85c6296-9423-4f0a-b5cd-d265dd437a77\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 08:32:06 crc kubenswrapper[4886]: I0314 08:32:06.332652 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c85c6296-9423-4f0a-b5cd-d265dd437a77-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c85c6296-9423-4f0a-b5cd-d265dd437a77\") " 
pod="openshift-kube-apiserver/installer-9-crc" Mar 14 08:32:06 crc kubenswrapper[4886]: I0314 08:32:06.363607 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c85c6296-9423-4f0a-b5cd-d265dd437a77-kube-api-access\") pod \"installer-9-crc\" (UID: \"c85c6296-9423-4f0a-b5cd-d265dd437a77\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 14 08:32:06 crc kubenswrapper[4886]: I0314 08:32:06.442808 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 14 08:32:06 crc kubenswrapper[4886]: E0314 08:32:06.444468 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 14 08:32:06 crc kubenswrapper[4886]: E0314 08:32:06.444644 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j99w9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-dk6zm_openshift-marketplace(333059fe-3e95-4e08-b70e-d7d95e1ed279): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 08:32:06 crc kubenswrapper[4886]: E0314 08:32:06.446088 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-dk6zm" podUID="333059fe-3e95-4e08-b70e-d7d95e1ed279" Mar 14 08:32:07 crc 
kubenswrapper[4886]: I0314 08:32:07.219162 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-585cf757df-tlzns"] Mar 14 08:32:07 crc kubenswrapper[4886]: I0314 08:32:07.321899 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5"] Mar 14 08:32:07 crc kubenswrapper[4886]: E0314 08:32:07.956426 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-dk6zm" podUID="333059fe-3e95-4e08-b70e-d7d95e1ed279" Mar 14 08:32:07 crc kubenswrapper[4886]: I0314 08:32:07.968048 4886 scope.go:117] "RemoveContainer" containerID="d8960ea2c1991ffdcc473f6be2459de8ae762563424a6cbea3338eec69eb202e" Mar 14 08:32:08 crc kubenswrapper[4886]: E0314 08:32:08.035267 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 14 08:32:08 crc kubenswrapper[4886]: E0314 08:32:08.035984 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zwbwq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-vnjw4_openshift-marketplace(9306248c-2771-4cb6-bdd4-f8628c2b6428): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 08:32:08 crc kubenswrapper[4886]: E0314 08:32:08.037816 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-vnjw4" podUID="9306248c-2771-4cb6-bdd4-f8628c2b6428" Mar 14 08:32:08 crc 
kubenswrapper[4886]: E0314 08:32:08.066423 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-vnjw4" podUID="9306248c-2771-4cb6-bdd4-f8628c2b6428" Mar 14 08:32:08 crc kubenswrapper[4886]: E0314 08:32:08.144454 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 14 08:32:08 crc kubenswrapper[4886]: E0314 08:32:08.144671 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rb7d2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-fpr95_openshift-marketplace(98f142ad-f9c5-41ee-81ec-632938796964): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 08:32:08 crc kubenswrapper[4886]: E0314 08:32:08.145910 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-fpr95" podUID="98f142ad-f9c5-41ee-81ec-632938796964" Mar 14 08:32:08 crc 
kubenswrapper[4886]: E0314 08:32:08.149945 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 14 08:32:08 crc kubenswrapper[4886]: E0314 08:32:08.150198 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gspdc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-snhqq_openshift-marketplace(758a60d0-6132-4b23-8062-febd479f7fff): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 08:32:08 crc kubenswrapper[4886]: E0314 08:32:08.151927 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-snhqq" podUID="758a60d0-6132-4b23-8062-febd479f7fff" Mar 14 08:32:08 crc kubenswrapper[4886]: E0314 08:32:08.185777 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 14 08:32:08 crc kubenswrapper[4886]: E0314 08:32:08.185956 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n7fk8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-t4gbt_openshift-marketplace(837659fc-08c6-4ea6-8799-aa4297b20689): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 08:32:08 crc kubenswrapper[4886]: E0314 08:32:08.187546 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-t4gbt" podUID="837659fc-08c6-4ea6-8799-aa4297b20689" Mar 14 08:32:08 crc 
kubenswrapper[4886]: I0314 08:32:08.272901 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5"] Mar 14 08:32:08 crc kubenswrapper[4886]: E0314 08:32:08.292637 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 14 08:32:08 crc kubenswrapper[4886]: E0314 08:32:08.293415 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vlzjm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMes
sagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-6gfbq_openshift-marketplace(7c355048-396b-4f00-8ff6-1ffff1d9d62c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 14 08:32:08 crc kubenswrapper[4886]: E0314 08:32:08.296561 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-6gfbq" podUID="7c355048-396b-4f00-8ff6-1ffff1d9d62c" Mar 14 08:32:08 crc kubenswrapper[4886]: I0314 08:32:08.397560 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hq6j4"] Mar 14 08:32:08 crc kubenswrapper[4886]: I0314 08:32:08.599994 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 14 08:32:08 crc kubenswrapper[4886]: W0314 08:32:08.608100 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod551df435_0815_4f88_966b_3e60ef48aaa4.slice/crio-0e0beae7e304152d1345200a6752f87e89d44f1c223e9935daddd654db7579d5 WatchSource:0}: Error finding container 0e0beae7e304152d1345200a6752f87e89d44f1c223e9935daddd654db7579d5: Status 404 returned error can't find the container with id 0e0beae7e304152d1345200a6752f87e89d44f1c223e9935daddd654db7579d5 Mar 14 08:32:08 crc kubenswrapper[4886]: I0314 08:32:08.681906 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557952-fzb8n"] Mar 14 08:32:08 crc kubenswrapper[4886]: I0314 08:32:08.690622 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-585cf757df-tlzns"] Mar 14 08:32:08 crc kubenswrapper[4886]: W0314 08:32:08.692398 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2d7e1cf_cd9a_4bf9_8f6b_1fccc137d1f9.slice/crio-d8749af0942e4f0228c5731a87af2a6f7411ab02ac090fa88d2074a404249139 WatchSource:0}: Error finding container d8749af0942e4f0228c5731a87af2a6f7411ab02ac090fa88d2074a404249139: Status 404 returned error can't find the container with id d8749af0942e4f0228c5731a87af2a6f7411ab02ac090fa88d2074a404249139 Mar 14 08:32:08 crc kubenswrapper[4886]: I0314 08:32:08.692686 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 14 08:32:08 crc kubenswrapper[4886]: W0314 08:32:08.706650 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedf87ed3_6709_4269_bed7_a5f916f46f5c.slice/crio-a0a8f074847a2287a7d5fe20efab6b459309d6c0341f29c0eb921c65ae5797bc WatchSource:0}: Error finding container a0a8f074847a2287a7d5fe20efab6b459309d6c0341f29c0eb921c65ae5797bc: Status 404 returned error can't find the container with id a0a8f074847a2287a7d5fe20efab6b459309d6c0341f29c0eb921c65ae5797bc Mar 14 08:32:08 crc kubenswrapper[4886]: W0314 08:32:08.722635 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc85c6296_9423_4f0a_b5cd_d265dd437a77.slice/crio-6b5bab7d264a11a721749c8c104958c691deba16c49471bbf7d769457e1deeea WatchSource:0}: Error finding container 6b5bab7d264a11a721749c8c104958c691deba16c49471bbf7d769457e1deeea: Status 404 returned error can't find the container with id 6b5bab7d264a11a721749c8c104958c691deba16c49471bbf7d769457e1deeea Mar 14 08:32:09 crc kubenswrapper[4886]: I0314 08:32:09.063660 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"c85c6296-9423-4f0a-b5cd-d265dd437a77","Type":"ContainerStarted","Data":"6b5bab7d264a11a721749c8c104958c691deba16c49471bbf7d769457e1deeea"} Mar 14 08:32:09 crc kubenswrapper[4886]: I0314 08:32:09.069603 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hq6j4" event={"ID":"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b","Type":"ContainerStarted","Data":"f0dcc50f8549a5a1fafb0decc4cfeeec60253c8f0526854835d2000dd86c4981"} Mar 14 08:32:09 crc kubenswrapper[4886]: I0314 08:32:09.069653 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hq6j4" event={"ID":"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b","Type":"ContainerStarted","Data":"7314dd8900ccfea3102b8a53294a8d9bb95e4ecc9c93986a39fd7e5fef4926a0"} Mar 14 08:32:09 crc kubenswrapper[4886]: I0314 08:32:09.072175 4886 generic.go:334] "Generic (PLEG): container finished" podID="aeaec6eb-91cb-4e68-807f-994b4e9df360" containerID="cdaaaef5fcef86cbcca5d10d2bef1780d58f63a309a679e99c76b0974dcb888f" exitCode=0 Mar 14 08:32:09 crc kubenswrapper[4886]: I0314 08:32:09.072270 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-brtwn" event={"ID":"aeaec6eb-91cb-4e68-807f-994b4e9df360","Type":"ContainerDied","Data":"cdaaaef5fcef86cbcca5d10d2bef1780d58f63a309a679e99c76b0974dcb888f"} Mar 14 08:32:09 crc kubenswrapper[4886]: I0314 08:32:09.078217 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"551df435-0815-4f88-966b-3e60ef48aaa4","Type":"ContainerStarted","Data":"0e0beae7e304152d1345200a6752f87e89d44f1c223e9935daddd654db7579d5"} Mar 14 08:32:09 crc kubenswrapper[4886]: I0314 08:32:09.079521 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557952-fzb8n" 
event={"ID":"e2d7e1cf-cd9a-4bf9-8f6b-1fccc137d1f9","Type":"ContainerStarted","Data":"d8749af0942e4f0228c5731a87af2a6f7411ab02ac090fa88d2074a404249139"} Mar 14 08:32:09 crc kubenswrapper[4886]: I0314 08:32:09.085384 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4f8zx" event={"ID":"28585f97-68cd-440f-acb0-0e5bd9117023","Type":"ContainerStarted","Data":"fdf6d6fc2506360836e137418cd05863b19b9a5d90deb17a8368405766a013fa"} Mar 14 08:32:09 crc kubenswrapper[4886]: I0314 08:32:09.088948 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5" event={"ID":"9d8fa830-c356-44bc-96a6-dc8bc91beb83","Type":"ContainerStarted","Data":"91db4c5db896d5d251e855e38cd92bd06ffbb91e495f2340eb566f133910aeed"} Mar 14 08:32:09 crc kubenswrapper[4886]: I0314 08:32:09.088981 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5" event={"ID":"9d8fa830-c356-44bc-96a6-dc8bc91beb83","Type":"ContainerStarted","Data":"cf00021cb823fa03be557a85e5b2ecaf3b9495aabfd6675860e8bf741a93d380"} Mar 14 08:32:09 crc kubenswrapper[4886]: I0314 08:32:09.089100 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5" podUID="9d8fa830-c356-44bc-96a6-dc8bc91beb83" containerName="route-controller-manager" containerID="cri-o://91db4c5db896d5d251e855e38cd92bd06ffbb91e495f2340eb566f133910aeed" gracePeriod=30 Mar 14 08:32:09 crc kubenswrapper[4886]: I0314 08:32:09.089386 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5" Mar 14 08:32:09 crc kubenswrapper[4886]: I0314 08:32:09.092828 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-585cf757df-tlzns" 
event={"ID":"edf87ed3-6709-4269-bed7-a5f916f46f5c","Type":"ContainerStarted","Data":"ee4da1d0123e2e993efd5e9d103b57de872c1411580c838c2cd361e78267f00b"} Mar 14 08:32:09 crc kubenswrapper[4886]: I0314 08:32:09.093072 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-585cf757df-tlzns" event={"ID":"edf87ed3-6709-4269-bed7-a5f916f46f5c","Type":"ContainerStarted","Data":"a0a8f074847a2287a7d5fe20efab6b459309d6c0341f29c0eb921c65ae5797bc"} Mar 14 08:32:09 crc kubenswrapper[4886]: I0314 08:32:09.092852 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-585cf757df-tlzns" podUID="edf87ed3-6709-4269-bed7-a5f916f46f5c" containerName="controller-manager" containerID="cri-o://ee4da1d0123e2e993efd5e9d103b57de872c1411580c838c2cd361e78267f00b" gracePeriod=30 Mar 14 08:32:09 crc kubenswrapper[4886]: I0314 08:32:09.093338 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-585cf757df-tlzns" Mar 14 08:32:09 crc kubenswrapper[4886]: E0314 08:32:09.094688 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-snhqq" podUID="758a60d0-6132-4b23-8062-febd479f7fff" Mar 14 08:32:09 crc kubenswrapper[4886]: E0314 08:32:09.099726 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-6gfbq" podUID="7c355048-396b-4f00-8ff6-1ffff1d9d62c" Mar 14 08:32:09 crc kubenswrapper[4886]: E0314 08:32:09.099837 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-fpr95" podUID="98f142ad-f9c5-41ee-81ec-632938796964" Mar 14 08:32:09 crc kubenswrapper[4886]: E0314 08:32:09.100645 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-t4gbt" podUID="837659fc-08c6-4ea6-8799-aa4297b20689" Mar 14 08:32:09 crc kubenswrapper[4886]: I0314 08:32:09.104253 4886 patch_prober.go:28] interesting pod/route-controller-manager-75f7dfc654-mkzk5 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": read tcp 10.217.0.2:49056->10.217.0.57:8443: read: connection reset by peer" start-of-body= Mar 14 08:32:09 crc kubenswrapper[4886]: I0314 08:32:09.104353 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5" podUID="9d8fa830-c356-44bc-96a6-dc8bc91beb83" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": read tcp 10.217.0.2:49056->10.217.0.57:8443: read: connection reset by peer" Mar 14 08:32:09 crc kubenswrapper[4886]: I0314 08:32:09.106494 4886 patch_prober.go:28] interesting pod/controller-manager-585cf757df-tlzns container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body= Mar 14 08:32:09 crc kubenswrapper[4886]: I0314 08:32:09.106557 4886 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-585cf757df-tlzns" podUID="edf87ed3-6709-4269-bed7-a5f916f46f5c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" Mar 14 08:32:09 crc kubenswrapper[4886]: I0314 08:32:09.187258 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5" podStartSLOduration=22.18209324 podStartE2EDuration="22.18209324s" podCreationTimestamp="2026-03-14 08:31:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:32:09.178809215 +0000 UTC m=+264.427260852" watchObservedRunningTime="2026-03-14 08:32:09.18209324 +0000 UTC m=+264.430544877" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.104479 4886 generic.go:334] "Generic (PLEG): container finished" podID="28585f97-68cd-440f-acb0-0e5bd9117023" containerID="fdf6d6fc2506360836e137418cd05863b19b9a5d90deb17a8368405766a013fa" exitCode=0 Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.104782 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4f8zx" event={"ID":"28585f97-68cd-440f-acb0-0e5bd9117023","Type":"ContainerDied","Data":"fdf6d6fc2506360836e137418cd05863b19b9a5d90deb17a8368405766a013fa"} Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.108355 4886 generic.go:334] "Generic (PLEG): container finished" podID="9d8fa830-c356-44bc-96a6-dc8bc91beb83" containerID="91db4c5db896d5d251e855e38cd92bd06ffbb91e495f2340eb566f133910aeed" exitCode=0 Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.108437 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5" 
event={"ID":"9d8fa830-c356-44bc-96a6-dc8bc91beb83","Type":"ContainerDied","Data":"91db4c5db896d5d251e855e38cd92bd06ffbb91e495f2340eb566f133910aeed"} Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.108471 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5" event={"ID":"9d8fa830-c356-44bc-96a6-dc8bc91beb83","Type":"ContainerDied","Data":"cf00021cb823fa03be557a85e5b2ecaf3b9495aabfd6675860e8bf741a93d380"} Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.108482 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf00021cb823fa03be557a85e5b2ecaf3b9495aabfd6675860e8bf741a93d380" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.111957 4886 generic.go:334] "Generic (PLEG): container finished" podID="edf87ed3-6709-4269-bed7-a5f916f46f5c" containerID="ee4da1d0123e2e993efd5e9d103b57de872c1411580c838c2cd361e78267f00b" exitCode=0 Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.112027 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-585cf757df-tlzns" event={"ID":"edf87ed3-6709-4269-bed7-a5f916f46f5c","Type":"ContainerDied","Data":"ee4da1d0123e2e993efd5e9d103b57de872c1411580c838c2cd361e78267f00b"} Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.112055 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-585cf757df-tlzns" event={"ID":"edf87ed3-6709-4269-bed7-a5f916f46f5c","Type":"ContainerDied","Data":"a0a8f074847a2287a7d5fe20efab6b459309d6c0341f29c0eb921c65ae5797bc"} Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.112065 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0a8f074847a2287a7d5fe20efab6b459309d6c0341f29c0eb921c65ae5797bc" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.113586 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-585cf757df-tlzns" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.115484 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hq6j4" event={"ID":"842ea68a-b5ee-4b60-8e98-26e2ff72ae3b","Type":"ContainerStarted","Data":"8dae1ffb58847caf2d5b1f38ab55d78ec63bb6a6c3ebedb902cd2694456d50c6"} Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.118003 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.118224 4886 generic.go:334] "Generic (PLEG): container finished" podID="551df435-0815-4f88-966b-3e60ef48aaa4" containerID="313e77ec145e1b84d5530836bfeec2d10c80cfcba38ab8ab00b0a033d665d6f1" exitCode=0 Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.118270 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"551df435-0815-4f88-966b-3e60ef48aaa4","Type":"ContainerDied","Data":"313e77ec145e1b84d5530836bfeec2d10c80cfcba38ab8ab00b0a033d665d6f1"} Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.119568 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c85c6296-9423-4f0a-b5cd-d265dd437a77","Type":"ContainerStarted","Data":"97792cb76d0c416b5b962919c01bf15eeacc427b9746436e279b6b3adc5eb281"} Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.126235 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-585cf757df-tlzns" podStartSLOduration=23.126219989 podStartE2EDuration="23.126219989s" podCreationTimestamp="2026-03-14 08:31:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-14 08:32:09.264719146 +0000 UTC m=+264.513170783" watchObservedRunningTime="2026-03-14 08:32:10.126219989 +0000 UTC m=+265.374671636" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.186671 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv"] Mar 14 08:32:10 crc kubenswrapper[4886]: E0314 08:32:10.186891 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edf87ed3-6709-4269-bed7-a5f916f46f5c" containerName="controller-manager" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.186904 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="edf87ed3-6709-4269-bed7-a5f916f46f5c" containerName="controller-manager" Mar 14 08:32:10 crc kubenswrapper[4886]: E0314 08:32:10.186916 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d8fa830-c356-44bc-96a6-dc8bc91beb83" containerName="route-controller-manager" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.186922 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d8fa830-c356-44bc-96a6-dc8bc91beb83" containerName="route-controller-manager" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.187028 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d8fa830-c356-44bc-96a6-dc8bc91beb83" containerName="route-controller-manager" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.187044 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="edf87ed3-6709-4269-bed7-a5f916f46f5c" containerName="controller-manager" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.187495 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.215704 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv"] Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.228269 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edf87ed3-6709-4269-bed7-a5f916f46f5c-config\") pod \"edf87ed3-6709-4269-bed7-a5f916f46f5c\" (UID: \"edf87ed3-6709-4269-bed7-a5f916f46f5c\") " Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.228338 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn2k7\" (UniqueName: \"kubernetes.io/projected/edf87ed3-6709-4269-bed7-a5f916f46f5c-kube-api-access-hn2k7\") pod \"edf87ed3-6709-4269-bed7-a5f916f46f5c\" (UID: \"edf87ed3-6709-4269-bed7-a5f916f46f5c\") " Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.228366 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d8fa830-c356-44bc-96a6-dc8bc91beb83-serving-cert\") pod \"9d8fa830-c356-44bc-96a6-dc8bc91beb83\" (UID: \"9d8fa830-c356-44bc-96a6-dc8bc91beb83\") " Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.228389 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edf87ed3-6709-4269-bed7-a5f916f46f5c-client-ca\") pod \"edf87ed3-6709-4269-bed7-a5f916f46f5c\" (UID: \"edf87ed3-6709-4269-bed7-a5f916f46f5c\") " Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.228445 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edf87ed3-6709-4269-bed7-a5f916f46f5c-serving-cert\") pod 
\"edf87ed3-6709-4269-bed7-a5f916f46f5c\" (UID: \"edf87ed3-6709-4269-bed7-a5f916f46f5c\") " Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.228480 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d8fa830-c356-44bc-96a6-dc8bc91beb83-client-ca\") pod \"9d8fa830-c356-44bc-96a6-dc8bc91beb83\" (UID: \"9d8fa830-c356-44bc-96a6-dc8bc91beb83\") " Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.228544 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d8fa830-c356-44bc-96a6-dc8bc91beb83-config\") pod \"9d8fa830-c356-44bc-96a6-dc8bc91beb83\" (UID: \"9d8fa830-c356-44bc-96a6-dc8bc91beb83\") " Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.228560 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/edf87ed3-6709-4269-bed7-a5f916f46f5c-proxy-ca-bundles\") pod \"edf87ed3-6709-4269-bed7-a5f916f46f5c\" (UID: \"edf87ed3-6709-4269-bed7-a5f916f46f5c\") " Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.228590 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhg49\" (UniqueName: \"kubernetes.io/projected/9d8fa830-c356-44bc-96a6-dc8bc91beb83-kube-api-access-vhg49\") pod \"9d8fa830-c356-44bc-96a6-dc8bc91beb83\" (UID: \"9d8fa830-c356-44bc-96a6-dc8bc91beb83\") " Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.228807 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a249b150-7d91-4e4b-93eb-7348a46b7dc3-config\") pod \"route-controller-manager-7969dd5697-h7bdv\" (UID: \"a249b150-7d91-4e4b-93eb-7348a46b7dc3\") " pod="openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 
08:32:10.228853 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a249b150-7d91-4e4b-93eb-7348a46b7dc3-client-ca\") pod \"route-controller-manager-7969dd5697-h7bdv\" (UID: \"a249b150-7d91-4e4b-93eb-7348a46b7dc3\") " pod="openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.228878 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a249b150-7d91-4e4b-93eb-7348a46b7dc3-serving-cert\") pod \"route-controller-manager-7969dd5697-h7bdv\" (UID: \"a249b150-7d91-4e4b-93eb-7348a46b7dc3\") " pod="openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.228898 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdfrj\" (UniqueName: \"kubernetes.io/projected/a249b150-7d91-4e4b-93eb-7348a46b7dc3-kube-api-access-pdfrj\") pod \"route-controller-manager-7969dd5697-h7bdv\" (UID: \"a249b150-7d91-4e4b-93eb-7348a46b7dc3\") " pod="openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.230815 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edf87ed3-6709-4269-bed7-a5f916f46f5c-config" (OuterVolumeSpecName: "config") pod "edf87ed3-6709-4269-bed7-a5f916f46f5c" (UID: "edf87ed3-6709-4269-bed7-a5f916f46f5c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.231866 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d8fa830-c356-44bc-96a6-dc8bc91beb83-client-ca" (OuterVolumeSpecName: "client-ca") pod "9d8fa830-c356-44bc-96a6-dc8bc91beb83" (UID: "9d8fa830-c356-44bc-96a6-dc8bc91beb83"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.231988 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d8fa830-c356-44bc-96a6-dc8bc91beb83-config" (OuterVolumeSpecName: "config") pod "9d8fa830-c356-44bc-96a6-dc8bc91beb83" (UID: "9d8fa830-c356-44bc-96a6-dc8bc91beb83"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.232407 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edf87ed3-6709-4269-bed7-a5f916f46f5c-client-ca" (OuterVolumeSpecName: "client-ca") pod "edf87ed3-6709-4269-bed7-a5f916f46f5c" (UID: "edf87ed3-6709-4269-bed7-a5f916f46f5c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.232723 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edf87ed3-6709-4269-bed7-a5f916f46f5c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "edf87ed3-6709-4269-bed7-a5f916f46f5c" (UID: "edf87ed3-6709-4269-bed7-a5f916f46f5c"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.236163 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edf87ed3-6709-4269-bed7-a5f916f46f5c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "edf87ed3-6709-4269-bed7-a5f916f46f5c" (UID: "edf87ed3-6709-4269-bed7-a5f916f46f5c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.236872 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d8fa830-c356-44bc-96a6-dc8bc91beb83-kube-api-access-vhg49" (OuterVolumeSpecName: "kube-api-access-vhg49") pod "9d8fa830-c356-44bc-96a6-dc8bc91beb83" (UID: "9d8fa830-c356-44bc-96a6-dc8bc91beb83"). InnerVolumeSpecName "kube-api-access-vhg49". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.237407 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d8fa830-c356-44bc-96a6-dc8bc91beb83-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d8fa830-c356-44bc-96a6-dc8bc91beb83" (UID: "9d8fa830-c356-44bc-96a6-dc8bc91beb83"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.238716 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edf87ed3-6709-4269-bed7-a5f916f46f5c-kube-api-access-hn2k7" (OuterVolumeSpecName: "kube-api-access-hn2k7") pod "edf87ed3-6709-4269-bed7-a5f916f46f5c" (UID: "edf87ed3-6709-4269-bed7-a5f916f46f5c"). InnerVolumeSpecName "kube-api-access-hn2k7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.250600 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-hq6j4" podStartSLOduration=208.250578749 podStartE2EDuration="3m28.250578749s" podCreationTimestamp="2026-03-14 08:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:32:10.249619541 +0000 UTC m=+265.498071178" watchObservedRunningTime="2026-03-14 08:32:10.250578749 +0000 UTC m=+265.499030386" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.251256 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=4.251250838 podStartE2EDuration="4.251250838s" podCreationTimestamp="2026-03-14 08:32:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:32:10.232385294 +0000 UTC m=+265.480836931" watchObservedRunningTime="2026-03-14 08:32:10.251250838 +0000 UTC m=+265.499702475" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.329847 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a249b150-7d91-4e4b-93eb-7348a46b7dc3-config\") pod \"route-controller-manager-7969dd5697-h7bdv\" (UID: \"a249b150-7d91-4e4b-93eb-7348a46b7dc3\") " pod="openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.329904 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a249b150-7d91-4e4b-93eb-7348a46b7dc3-client-ca\") pod \"route-controller-manager-7969dd5697-h7bdv\" (UID: \"a249b150-7d91-4e4b-93eb-7348a46b7dc3\") " 
pod="openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.329926 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a249b150-7d91-4e4b-93eb-7348a46b7dc3-serving-cert\") pod \"route-controller-manager-7969dd5697-h7bdv\" (UID: \"a249b150-7d91-4e4b-93eb-7348a46b7dc3\") " pod="openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.329943 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdfrj\" (UniqueName: \"kubernetes.io/projected/a249b150-7d91-4e4b-93eb-7348a46b7dc3-kube-api-access-pdfrj\") pod \"route-controller-manager-7969dd5697-h7bdv\" (UID: \"a249b150-7d91-4e4b-93eb-7348a46b7dc3\") " pod="openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.330010 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d8fa830-c356-44bc-96a6-dc8bc91beb83-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.330022 4886 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/edf87ed3-6709-4269-bed7-a5f916f46f5c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.330035 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhg49\" (UniqueName: \"kubernetes.io/projected/9d8fa830-c356-44bc-96a6-dc8bc91beb83-kube-api-access-vhg49\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.330046 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edf87ed3-6709-4269-bed7-a5f916f46f5c-config\") on 
node \"crc\" DevicePath \"\"" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.330058 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn2k7\" (UniqueName: \"kubernetes.io/projected/edf87ed3-6709-4269-bed7-a5f916f46f5c-kube-api-access-hn2k7\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.330066 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d8fa830-c356-44bc-96a6-dc8bc91beb83-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.330075 4886 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edf87ed3-6709-4269-bed7-a5f916f46f5c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.330083 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edf87ed3-6709-4269-bed7-a5f916f46f5c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.330091 4886 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d8fa830-c356-44bc-96a6-dc8bc91beb83-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.331649 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a249b150-7d91-4e4b-93eb-7348a46b7dc3-config\") pod \"route-controller-manager-7969dd5697-h7bdv\" (UID: \"a249b150-7d91-4e4b-93eb-7348a46b7dc3\") " pod="openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.332217 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/a249b150-7d91-4e4b-93eb-7348a46b7dc3-client-ca\") pod \"route-controller-manager-7969dd5697-h7bdv\" (UID: \"a249b150-7d91-4e4b-93eb-7348a46b7dc3\") " pod="openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.336386 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a249b150-7d91-4e4b-93eb-7348a46b7dc3-serving-cert\") pod \"route-controller-manager-7969dd5697-h7bdv\" (UID: \"a249b150-7d91-4e4b-93eb-7348a46b7dc3\") " pod="openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.351287 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdfrj\" (UniqueName: \"kubernetes.io/projected/a249b150-7d91-4e4b-93eb-7348a46b7dc3-kube-api-access-pdfrj\") pod \"route-controller-manager-7969dd5697-h7bdv\" (UID: \"a249b150-7d91-4e4b-93eb-7348a46b7dc3\") " pod="openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.515922 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv" Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.938350 4886 patch_prober.go:28] interesting pod/controller-manager-585cf757df-tlzns container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: i/o timeout" start-of-body= Mar 14 08:32:10 crc kubenswrapper[4886]: I0314 08:32:10.938459 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-585cf757df-tlzns" podUID="edf87ed3-6709-4269-bed7-a5f916f46f5c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: i/o timeout" Mar 14 08:32:11 crc kubenswrapper[4886]: I0314 08:32:11.124928 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-585cf757df-tlzns" Mar 14 08:32:11 crc kubenswrapper[4886]: I0314 08:32:11.124992 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5" Mar 14 08:32:11 crc kubenswrapper[4886]: I0314 08:32:11.168327 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5"] Mar 14 08:32:11 crc kubenswrapper[4886]: I0314 08:32:11.171428 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75f7dfc654-mkzk5"] Mar 14 08:32:11 crc kubenswrapper[4886]: I0314 08:32:11.181168 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-585cf757df-tlzns"] Mar 14 08:32:11 crc kubenswrapper[4886]: I0314 08:32:11.183864 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-585cf757df-tlzns"] Mar 14 08:32:11 crc kubenswrapper[4886]: I0314 08:32:11.429276 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d8fa830-c356-44bc-96a6-dc8bc91beb83" path="/var/lib/kubelet/pods/9d8fa830-c356-44bc-96a6-dc8bc91beb83/volumes" Mar 14 08:32:11 crc kubenswrapper[4886]: I0314 08:32:11.430432 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edf87ed3-6709-4269-bed7-a5f916f46f5c" path="/var/lib/kubelet/pods/edf87ed3-6709-4269-bed7-a5f916f46f5c/volumes" Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.745637 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-544778d49f-75x52"] Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.746651 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-544778d49f-75x52" Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.756195 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.756328 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.757084 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.758155 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.759175 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.759612 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-544778d49f-75x52"] Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.759842 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.796560 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.879629 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.882941 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/551df435-0815-4f88-966b-3e60ef48aaa4-kube-api-access\") pod \"551df435-0815-4f88-966b-3e60ef48aaa4\" (UID: \"551df435-0815-4f88-966b-3e60ef48aaa4\") " Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.883021 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/551df435-0815-4f88-966b-3e60ef48aaa4-kubelet-dir\") pod \"551df435-0815-4f88-966b-3e60ef48aaa4\" (UID: \"551df435-0815-4f88-966b-3e60ef48aaa4\") " Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.883242 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/551df435-0815-4f88-966b-3e60ef48aaa4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "551df435-0815-4f88-966b-3e60ef48aaa4" (UID: "551df435-0815-4f88-966b-3e60ef48aaa4"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.883328 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvmrf\" (UniqueName: \"kubernetes.io/projected/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-kube-api-access-vvmrf\") pod \"controller-manager-544778d49f-75x52\" (UID: \"52cd2979-fdd2-4dea-b6a6-f76883c29e6f\") " pod="openshift-controller-manager/controller-manager-544778d49f-75x52" Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.883373 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-serving-cert\") pod \"controller-manager-544778d49f-75x52\" (UID: \"52cd2979-fdd2-4dea-b6a6-f76883c29e6f\") " pod="openshift-controller-manager/controller-manager-544778d49f-75x52" Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.883403 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-config\") pod \"controller-manager-544778d49f-75x52\" (UID: \"52cd2979-fdd2-4dea-b6a6-f76883c29e6f\") " pod="openshift-controller-manager/controller-manager-544778d49f-75x52" Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.883424 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-proxy-ca-bundles\") pod \"controller-manager-544778d49f-75x52\" (UID: \"52cd2979-fdd2-4dea-b6a6-f76883c29e6f\") " pod="openshift-controller-manager/controller-manager-544778d49f-75x52" Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.883507 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-client-ca\") pod \"controller-manager-544778d49f-75x52\" (UID: \"52cd2979-fdd2-4dea-b6a6-f76883c29e6f\") " pod="openshift-controller-manager/controller-manager-544778d49f-75x52" Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.883568 4886 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/551df435-0815-4f88-966b-3e60ef48aaa4-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.889280 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/551df435-0815-4f88-966b-3e60ef48aaa4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "551df435-0815-4f88-966b-3e60ef48aaa4" (UID: "551df435-0815-4f88-966b-3e60ef48aaa4"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.984274 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvmrf\" (UniqueName: \"kubernetes.io/projected/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-kube-api-access-vvmrf\") pod \"controller-manager-544778d49f-75x52\" (UID: \"52cd2979-fdd2-4dea-b6a6-f76883c29e6f\") " pod="openshift-controller-manager/controller-manager-544778d49f-75x52" Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.984344 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-serving-cert\") pod \"controller-manager-544778d49f-75x52\" (UID: \"52cd2979-fdd2-4dea-b6a6-f76883c29e6f\") " pod="openshift-controller-manager/controller-manager-544778d49f-75x52" Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.984386 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-config\") pod \"controller-manager-544778d49f-75x52\" (UID: \"52cd2979-fdd2-4dea-b6a6-f76883c29e6f\") " pod="openshift-controller-manager/controller-manager-544778d49f-75x52" Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.984407 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-proxy-ca-bundles\") pod \"controller-manager-544778d49f-75x52\" (UID: \"52cd2979-fdd2-4dea-b6a6-f76883c29e6f\") " pod="openshift-controller-manager/controller-manager-544778d49f-75x52" Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.984453 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-client-ca\") pod \"controller-manager-544778d49f-75x52\" (UID: \"52cd2979-fdd2-4dea-b6a6-f76883c29e6f\") " pod="openshift-controller-manager/controller-manager-544778d49f-75x52" Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.984560 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/551df435-0815-4f88-966b-3e60ef48aaa4-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.985778 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-client-ca\") pod \"controller-manager-544778d49f-75x52\" (UID: \"52cd2979-fdd2-4dea-b6a6-f76883c29e6f\") " pod="openshift-controller-manager/controller-manager-544778d49f-75x52" Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.986760 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-proxy-ca-bundles\") pod \"controller-manager-544778d49f-75x52\" (UID: \"52cd2979-fdd2-4dea-b6a6-f76883c29e6f\") " pod="openshift-controller-manager/controller-manager-544778d49f-75x52" Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.987079 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-config\") pod \"controller-manager-544778d49f-75x52\" (UID: \"52cd2979-fdd2-4dea-b6a6-f76883c29e6f\") " pod="openshift-controller-manager/controller-manager-544778d49f-75x52" Mar 14 08:32:12 crc kubenswrapper[4886]: I0314 08:32:12.991452 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-serving-cert\") pod \"controller-manager-544778d49f-75x52\" (UID: \"52cd2979-fdd2-4dea-b6a6-f76883c29e6f\") " pod="openshift-controller-manager/controller-manager-544778d49f-75x52" Mar 14 08:32:13 crc kubenswrapper[4886]: I0314 08:32:13.007025 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvmrf\" (UniqueName: \"kubernetes.io/projected/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-kube-api-access-vvmrf\") pod \"controller-manager-544778d49f-75x52\" (UID: \"52cd2979-fdd2-4dea-b6a6-f76883c29e6f\") " pod="openshift-controller-manager/controller-manager-544778d49f-75x52" Mar 14 08:32:13 crc kubenswrapper[4886]: I0314 08:32:13.098037 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-544778d49f-75x52" Mar 14 08:32:13 crc kubenswrapper[4886]: I0314 08:32:13.139807 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"551df435-0815-4f88-966b-3e60ef48aaa4","Type":"ContainerDied","Data":"0e0beae7e304152d1345200a6752f87e89d44f1c223e9935daddd654db7579d5"} Mar 14 08:32:13 crc kubenswrapper[4886]: I0314 08:32:13.139865 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 14 08:32:13 crc kubenswrapper[4886]: I0314 08:32:13.139893 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e0beae7e304152d1345200a6752f87e89d44f1c223e9935daddd654db7579d5" Mar 14 08:32:13 crc kubenswrapper[4886]: I0314 08:32:13.376304 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv"] Mar 14 08:32:13 crc kubenswrapper[4886]: W0314 08:32:13.396315 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda249b150_7d91_4e4b_93eb_7348a46b7dc3.slice/crio-7ab0f1d8a418e8ff8b03f0e0c5273a722168aeac22b9bfa51196fa52826a4331 WatchSource:0}: Error finding container 7ab0f1d8a418e8ff8b03f0e0c5273a722168aeac22b9bfa51196fa52826a4331: Status 404 returned error can't find the container with id 7ab0f1d8a418e8ff8b03f0e0c5273a722168aeac22b9bfa51196fa52826a4331 Mar 14 08:32:13 crc kubenswrapper[4886]: I0314 08:32:13.499973 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-544778d49f-75x52"] Mar 14 08:32:13 crc kubenswrapper[4886]: W0314 08:32:13.516431 4886 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52cd2979_fdd2_4dea_b6a6_f76883c29e6f.slice/crio-e432344a62b2df737ae68cecf246d8dea34103991930e35fc985d11500b79323 WatchSource:0}: Error finding container e432344a62b2df737ae68cecf246d8dea34103991930e35fc985d11500b79323: Status 404 returned error can't find the container with id e432344a62b2df737ae68cecf246d8dea34103991930e35fc985d11500b79323 Mar 14 08:32:14 crc kubenswrapper[4886]: I0314 08:32:14.148964 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557952-fzb8n" event={"ID":"e2d7e1cf-cd9a-4bf9-8f6b-1fccc137d1f9","Type":"ContainerStarted","Data":"4e58c47bdef3c8efcdad128728f90a3a67cbe7a1ed838239d371cf9fcf1740b6"} Mar 14 08:32:14 crc kubenswrapper[4886]: I0314 08:32:14.151611 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4f8zx" event={"ID":"28585f97-68cd-440f-acb0-0e5bd9117023","Type":"ContainerStarted","Data":"3ec67252f5f5ca0af057f87e4a4f8cf5663a65987e1c8c34c4e685ffb76d2121"} Mar 14 08:32:14 crc kubenswrapper[4886]: I0314 08:32:14.153010 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-544778d49f-75x52" event={"ID":"52cd2979-fdd2-4dea-b6a6-f76883c29e6f","Type":"ContainerStarted","Data":"9d9cb0f104cce51103a42c8712cc52e696897afa4c07de3a451ba32dd34f3e7b"} Mar 14 08:32:14 crc kubenswrapper[4886]: I0314 08:32:14.153054 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-544778d49f-75x52" event={"ID":"52cd2979-fdd2-4dea-b6a6-f76883c29e6f","Type":"ContainerStarted","Data":"e432344a62b2df737ae68cecf246d8dea34103991930e35fc985d11500b79323"} Mar 14 08:32:14 crc kubenswrapper[4886]: I0314 08:32:14.153209 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-544778d49f-75x52" Mar 14 08:32:14 crc kubenswrapper[4886]: I0314 
08:32:14.155202 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557950-fd7np" event={"ID":"266a4a60-8fb5-4685-b4ac-621f93829611","Type":"ContainerStarted","Data":"fc2c9d407431805b749a61dec6c02dc9adbdbaeca0ebcaca31a7ba39a5980b81"} Mar 14 08:32:14 crc kubenswrapper[4886]: I0314 08:32:14.156275 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv" event={"ID":"a249b150-7d91-4e4b-93eb-7348a46b7dc3","Type":"ContainerStarted","Data":"ebe3da7d060dac1850cc14f2b6fbbd81abae53b70075427b35bd3a78ea87e6e2"} Mar 14 08:32:14 crc kubenswrapper[4886]: I0314 08:32:14.156304 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv" event={"ID":"a249b150-7d91-4e4b-93eb-7348a46b7dc3","Type":"ContainerStarted","Data":"7ab0f1d8a418e8ff8b03f0e0c5273a722168aeac22b9bfa51196fa52826a4331"} Mar 14 08:32:14 crc kubenswrapper[4886]: I0314 08:32:14.156507 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv" Mar 14 08:32:14 crc kubenswrapper[4886]: I0314 08:32:14.158452 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-brtwn" event={"ID":"aeaec6eb-91cb-4e68-807f-994b4e9df360","Type":"ContainerStarted","Data":"341885f5e4d6c9bab49cc8788b107f1e04e61ba85e9472c750fb436d18d008df"} Mar 14 08:32:14 crc kubenswrapper[4886]: I0314 08:32:14.159502 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-544778d49f-75x52" Mar 14 08:32:14 crc kubenswrapper[4886]: I0314 08:32:14.226152 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-brtwn" podStartSLOduration=3.308814662 podStartE2EDuration="46.226112876s" 
podCreationTimestamp="2026-03-14 08:31:28 +0000 UTC" firstStartedPulling="2026-03-14 08:31:30.428706363 +0000 UTC m=+225.677158000" lastFinishedPulling="2026-03-14 08:32:13.346004577 +0000 UTC m=+268.594456214" observedRunningTime="2026-03-14 08:32:14.216502879 +0000 UTC m=+269.464954516" watchObservedRunningTime="2026-03-14 08:32:14.226112876 +0000 UTC m=+269.474564513" Mar 14 08:32:14 crc kubenswrapper[4886]: I0314 08:32:14.226538 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557952-fzb8n" podStartSLOduration=9.69509626 podStartE2EDuration="14.226533748s" podCreationTimestamp="2026-03-14 08:32:00 +0000 UTC" firstStartedPulling="2026-03-14 08:32:08.707504548 +0000 UTC m=+263.955956185" lastFinishedPulling="2026-03-14 08:32:13.238942036 +0000 UTC m=+268.487393673" observedRunningTime="2026-03-14 08:32:14.187636175 +0000 UTC m=+269.436087822" watchObservedRunningTime="2026-03-14 08:32:14.226533748 +0000 UTC m=+269.474985405" Mar 14 08:32:14 crc kubenswrapper[4886]: I0314 08:32:14.256940 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557950-fd7np" podStartSLOduration=83.963561862 podStartE2EDuration="2m14.256922436s" podCreationTimestamp="2026-03-14 08:30:00 +0000 UTC" firstStartedPulling="2026-03-14 08:31:22.934581715 +0000 UTC m=+218.183033352" lastFinishedPulling="2026-03-14 08:32:13.227942289 +0000 UTC m=+268.476393926" observedRunningTime="2026-03-14 08:32:14.253065135 +0000 UTC m=+269.501516772" watchObservedRunningTime="2026-03-14 08:32:14.256922436 +0000 UTC m=+269.505374073" Mar 14 08:32:14 crc kubenswrapper[4886]: I0314 08:32:14.290186 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4f8zx" podStartSLOduration=3.733972637 podStartE2EDuration="43.290167396s" podCreationTimestamp="2026-03-14 08:31:31 +0000 UTC" firstStartedPulling="2026-03-14 
08:31:33.672354667 +0000 UTC m=+228.920806304" lastFinishedPulling="2026-03-14 08:32:13.228549426 +0000 UTC m=+268.477001063" observedRunningTime="2026-03-14 08:32:14.28858427 +0000 UTC m=+269.537035907" watchObservedRunningTime="2026-03-14 08:32:14.290167396 +0000 UTC m=+269.538619033" Mar 14 08:32:14 crc kubenswrapper[4886]: I0314 08:32:14.304107 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv" Mar 14 08:32:14 crc kubenswrapper[4886]: I0314 08:32:14.356361 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv" podStartSLOduration=7.356347116 podStartE2EDuration="7.356347116s" podCreationTimestamp="2026-03-14 08:32:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:32:14.312597443 +0000 UTC m=+269.561049080" watchObservedRunningTime="2026-03-14 08:32:14.356347116 +0000 UTC m=+269.604798753" Mar 14 08:32:14 crc kubenswrapper[4886]: I0314 08:32:14.357920 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-544778d49f-75x52" podStartSLOduration=7.357913622 podStartE2EDuration="7.357913622s" podCreationTimestamp="2026-03-14 08:32:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:32:14.354965217 +0000 UTC m=+269.603416854" watchObservedRunningTime="2026-03-14 08:32:14.357913622 +0000 UTC m=+269.606365259" Mar 14 08:32:14 crc kubenswrapper[4886]: I0314 08:32:14.359911 4886 csr.go:261] certificate signing request csr-q2btv is approved, waiting to be issued Mar 14 08:32:14 crc kubenswrapper[4886]: I0314 08:32:14.371625 4886 csr.go:257] certificate signing request csr-q2btv is issued Mar 14 
08:32:15 crc kubenswrapper[4886]: I0314 08:32:15.166920 4886 generic.go:334] "Generic (PLEG): container finished" podID="266a4a60-8fb5-4685-b4ac-621f93829611" containerID="fc2c9d407431805b749a61dec6c02dc9adbdbaeca0ebcaca31a7ba39a5980b81" exitCode=0 Mar 14 08:32:15 crc kubenswrapper[4886]: I0314 08:32:15.167008 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557950-fd7np" event={"ID":"266a4a60-8fb5-4685-b4ac-621f93829611","Type":"ContainerDied","Data":"fc2c9d407431805b749a61dec6c02dc9adbdbaeca0ebcaca31a7ba39a5980b81"} Mar 14 08:32:15 crc kubenswrapper[4886]: I0314 08:32:15.169051 4886 generic.go:334] "Generic (PLEG): container finished" podID="e2d7e1cf-cd9a-4bf9-8f6b-1fccc137d1f9" containerID="4e58c47bdef3c8efcdad128728f90a3a67cbe7a1ed838239d371cf9fcf1740b6" exitCode=0 Mar 14 08:32:15 crc kubenswrapper[4886]: I0314 08:32:15.169129 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557952-fzb8n" event={"ID":"e2d7e1cf-cd9a-4bf9-8f6b-1fccc137d1f9","Type":"ContainerDied","Data":"4e58c47bdef3c8efcdad128728f90a3a67cbe7a1ed838239d371cf9fcf1740b6"} Mar 14 08:32:15 crc kubenswrapper[4886]: I0314 08:32:15.373242 4886 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-05 05:47:10.035192171 +0000 UTC Mar 14 08:32:15 crc kubenswrapper[4886]: I0314 08:32:15.373300 4886 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7125h14m54.661894624s for next certificate rotation Mar 14 08:32:16 crc kubenswrapper[4886]: I0314 08:32:16.373515 4886 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-04 03:15:52.972172077 +0000 UTC Mar 14 08:32:16 crc kubenswrapper[4886]: I0314 08:32:16.373566 4886 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6354h43m36.598610151s for next 
certificate rotation Mar 14 08:32:16 crc kubenswrapper[4886]: I0314 08:32:16.500218 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557950-fd7np" Mar 14 08:32:16 crc kubenswrapper[4886]: I0314 08:32:16.533500 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557952-fzb8n" Mar 14 08:32:16 crc kubenswrapper[4886]: I0314 08:32:16.651046 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfztb\" (UniqueName: \"kubernetes.io/projected/266a4a60-8fb5-4685-b4ac-621f93829611-kube-api-access-wfztb\") pod \"266a4a60-8fb5-4685-b4ac-621f93829611\" (UID: \"266a4a60-8fb5-4685-b4ac-621f93829611\") " Mar 14 08:32:16 crc kubenswrapper[4886]: I0314 08:32:16.651110 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z58jp\" (UniqueName: \"kubernetes.io/projected/e2d7e1cf-cd9a-4bf9-8f6b-1fccc137d1f9-kube-api-access-z58jp\") pod \"e2d7e1cf-cd9a-4bf9-8f6b-1fccc137d1f9\" (UID: \"e2d7e1cf-cd9a-4bf9-8f6b-1fccc137d1f9\") " Mar 14 08:32:16 crc kubenswrapper[4886]: I0314 08:32:16.657907 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/266a4a60-8fb5-4685-b4ac-621f93829611-kube-api-access-wfztb" (OuterVolumeSpecName: "kube-api-access-wfztb") pod "266a4a60-8fb5-4685-b4ac-621f93829611" (UID: "266a4a60-8fb5-4685-b4ac-621f93829611"). InnerVolumeSpecName "kube-api-access-wfztb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:32:16 crc kubenswrapper[4886]: I0314 08:32:16.658423 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2d7e1cf-cd9a-4bf9-8f6b-1fccc137d1f9-kube-api-access-z58jp" (OuterVolumeSpecName: "kube-api-access-z58jp") pod "e2d7e1cf-cd9a-4bf9-8f6b-1fccc137d1f9" (UID: "e2d7e1cf-cd9a-4bf9-8f6b-1fccc137d1f9"). 
InnerVolumeSpecName "kube-api-access-z58jp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:32:16 crc kubenswrapper[4886]: I0314 08:32:16.752982 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfztb\" (UniqueName: \"kubernetes.io/projected/266a4a60-8fb5-4685-b4ac-621f93829611-kube-api-access-wfztb\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:16 crc kubenswrapper[4886]: I0314 08:32:16.753397 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z58jp\" (UniqueName: \"kubernetes.io/projected/e2d7e1cf-cd9a-4bf9-8f6b-1fccc137d1f9-kube-api-access-z58jp\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:17 crc kubenswrapper[4886]: I0314 08:32:17.182093 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557952-fzb8n" event={"ID":"e2d7e1cf-cd9a-4bf9-8f6b-1fccc137d1f9","Type":"ContainerDied","Data":"d8749af0942e4f0228c5731a87af2a6f7411ab02ac090fa88d2074a404249139"} Mar 14 08:32:17 crc kubenswrapper[4886]: I0314 08:32:17.182167 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8749af0942e4f0228c5731a87af2a6f7411ab02ac090fa88d2074a404249139" Mar 14 08:32:17 crc kubenswrapper[4886]: I0314 08:32:17.182141 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557952-fzb8n" Mar 14 08:32:17 crc kubenswrapper[4886]: I0314 08:32:17.183799 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557950-fd7np" event={"ID":"266a4a60-8fb5-4685-b4ac-621f93829611","Type":"ContainerDied","Data":"e67e89d72e48304ef57b68d4a8664c1c86a1ca0731de077d4c22abe7ca353497"} Mar 14 08:32:17 crc kubenswrapper[4886]: I0314 08:32:17.183835 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e67e89d72e48304ef57b68d4a8664c1c86a1ca0731de077d4c22abe7ca353497" Mar 14 08:32:17 crc kubenswrapper[4886]: I0314 08:32:17.183981 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557950-fd7np" Mar 14 08:32:18 crc kubenswrapper[4886]: I0314 08:32:18.426355 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-brtwn" Mar 14 08:32:18 crc kubenswrapper[4886]: I0314 08:32:18.426810 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-brtwn" Mar 14 08:32:18 crc kubenswrapper[4886]: I0314 08:32:18.647893 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-brtwn" Mar 14 08:32:19 crc kubenswrapper[4886]: I0314 08:32:19.243648 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-brtwn" Mar 14 08:32:19 crc kubenswrapper[4886]: I0314 08:32:19.922444 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-t996r"] Mar 14 08:32:22 crc kubenswrapper[4886]: I0314 08:32:22.022866 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4f8zx" Mar 14 08:32:22 crc kubenswrapper[4886]: I0314 
08:32:22.023447 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4f8zx" Mar 14 08:32:22 crc kubenswrapper[4886]: I0314 08:32:22.092255 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4f8zx" Mar 14 08:32:22 crc kubenswrapper[4886]: I0314 08:32:22.258500 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4f8zx" Mar 14 08:32:22 crc kubenswrapper[4886]: I0314 08:32:22.315960 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4f8zx"] Mar 14 08:32:24 crc kubenswrapper[4886]: I0314 08:32:24.228784 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnjw4" event={"ID":"9306248c-2771-4cb6-bdd4-f8628c2b6428","Type":"ContainerStarted","Data":"f390f72bff34cce7ef3818e558cd67c3ab3f36fb5d989959902c29c7252d3e8a"} Mar 14 08:32:24 crc kubenswrapper[4886]: I0314 08:32:24.235557 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4f8zx" podUID="28585f97-68cd-440f-acb0-0e5bd9117023" containerName="registry-server" containerID="cri-o://3ec67252f5f5ca0af057f87e4a4f8cf5663a65987e1c8c34c4e685ffb76d2121" gracePeriod=2 Mar 14 08:32:24 crc kubenswrapper[4886]: I0314 08:32:24.235808 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dk6zm" event={"ID":"333059fe-3e95-4e08-b70e-d7d95e1ed279","Type":"ContainerStarted","Data":"e6e3e6058ed138078d9874adbc444746ad88235267c989589fa8a8a2f541ef1e"} Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.241691 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4f8zx" Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.243570 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4gbt" event={"ID":"837659fc-08c6-4ea6-8799-aa4297b20689","Type":"ContainerStarted","Data":"9eb5275ee25c2a3c1835d62af26298205657463dc379f1a7427e0abe99ab6bbd"} Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.249001 4886 generic.go:334] "Generic (PLEG): container finished" podID="9306248c-2771-4cb6-bdd4-f8628c2b6428" containerID="f390f72bff34cce7ef3818e558cd67c3ab3f36fb5d989959902c29c7252d3e8a" exitCode=0 Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.249077 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnjw4" event={"ID":"9306248c-2771-4cb6-bdd4-f8628c2b6428","Type":"ContainerDied","Data":"f390f72bff34cce7ef3818e558cd67c3ab3f36fb5d989959902c29c7252d3e8a"} Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.253940 4886 generic.go:334] "Generic (PLEG): container finished" podID="28585f97-68cd-440f-acb0-0e5bd9117023" containerID="3ec67252f5f5ca0af057f87e4a4f8cf5663a65987e1c8c34c4e685ffb76d2121" exitCode=0 Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.254002 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4f8zx" event={"ID":"28585f97-68cd-440f-acb0-0e5bd9117023","Type":"ContainerDied","Data":"3ec67252f5f5ca0af057f87e4a4f8cf5663a65987e1c8c34c4e685ffb76d2121"} Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.254026 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4f8zx" event={"ID":"28585f97-68cd-440f-acb0-0e5bd9117023","Type":"ContainerDied","Data":"633863570eec337ae40f5bb4bb4c03c3d41e5715d6b0e6fecc96bbb91de1274e"} Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.254049 4886 scope.go:117] "RemoveContainer" 
containerID="3ec67252f5f5ca0af057f87e4a4f8cf5663a65987e1c8c34c4e685ffb76d2121" Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.254178 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4f8zx" Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.257193 4886 generic.go:334] "Generic (PLEG): container finished" podID="758a60d0-6132-4b23-8062-febd479f7fff" containerID="2ac63fef9404de6a6649105e4f6caacbcc8ce1c2baa97da604b0226dd7112470" exitCode=0 Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.257282 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snhqq" event={"ID":"758a60d0-6132-4b23-8062-febd479f7fff","Type":"ContainerDied","Data":"2ac63fef9404de6a6649105e4f6caacbcc8ce1c2baa97da604b0226dd7112470"} Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.267319 4886 generic.go:334] "Generic (PLEG): container finished" podID="333059fe-3e95-4e08-b70e-d7d95e1ed279" containerID="e6e3e6058ed138078d9874adbc444746ad88235267c989589fa8a8a2f541ef1e" exitCode=0 Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.267368 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dk6zm" event={"ID":"333059fe-3e95-4e08-b70e-d7d95e1ed279","Type":"ContainerDied","Data":"e6e3e6058ed138078d9874adbc444746ad88235267c989589fa8a8a2f541ef1e"} Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.284216 4886 scope.go:117] "RemoveContainer" containerID="fdf6d6fc2506360836e137418cd05863b19b9a5d90deb17a8368405766a013fa" Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.284464 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28585f97-68cd-440f-acb0-0e5bd9117023-catalog-content\") pod \"28585f97-68cd-440f-acb0-0e5bd9117023\" (UID: \"28585f97-68cd-440f-acb0-0e5bd9117023\") " Mar 14 08:32:25 crc 
kubenswrapper[4886]: I0314 08:32:25.284495 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28585f97-68cd-440f-acb0-0e5bd9117023-utilities\") pod \"28585f97-68cd-440f-acb0-0e5bd9117023\" (UID: \"28585f97-68cd-440f-acb0-0e5bd9117023\") " Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.284535 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzjpw\" (UniqueName: \"kubernetes.io/projected/28585f97-68cd-440f-acb0-0e5bd9117023-kube-api-access-kzjpw\") pod \"28585f97-68cd-440f-acb0-0e5bd9117023\" (UID: \"28585f97-68cd-440f-acb0-0e5bd9117023\") " Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.285382 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28585f97-68cd-440f-acb0-0e5bd9117023-utilities" (OuterVolumeSpecName: "utilities") pod "28585f97-68cd-440f-acb0-0e5bd9117023" (UID: "28585f97-68cd-440f-acb0-0e5bd9117023"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.297658 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28585f97-68cd-440f-acb0-0e5bd9117023-kube-api-access-kzjpw" (OuterVolumeSpecName: "kube-api-access-kzjpw") pod "28585f97-68cd-440f-acb0-0e5bd9117023" (UID: "28585f97-68cd-440f-acb0-0e5bd9117023"). InnerVolumeSpecName "kube-api-access-kzjpw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.338721 4886 scope.go:117] "RemoveContainer" containerID="f09af7e849067e0420a4dd0aebcf4d887430504fc831998ee947d02ccc4964db" Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.368247 4886 scope.go:117] "RemoveContainer" containerID="3ec67252f5f5ca0af057f87e4a4f8cf5663a65987e1c8c34c4e685ffb76d2121" Mar 14 08:32:25 crc kubenswrapper[4886]: E0314 08:32:25.368712 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ec67252f5f5ca0af057f87e4a4f8cf5663a65987e1c8c34c4e685ffb76d2121\": container with ID starting with 3ec67252f5f5ca0af057f87e4a4f8cf5663a65987e1c8c34c4e685ffb76d2121 not found: ID does not exist" containerID="3ec67252f5f5ca0af057f87e4a4f8cf5663a65987e1c8c34c4e685ffb76d2121" Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.368752 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ec67252f5f5ca0af057f87e4a4f8cf5663a65987e1c8c34c4e685ffb76d2121"} err="failed to get container status \"3ec67252f5f5ca0af057f87e4a4f8cf5663a65987e1c8c34c4e685ffb76d2121\": rpc error: code = NotFound desc = could not find container \"3ec67252f5f5ca0af057f87e4a4f8cf5663a65987e1c8c34c4e685ffb76d2121\": container with ID starting with 3ec67252f5f5ca0af057f87e4a4f8cf5663a65987e1c8c34c4e685ffb76d2121 not found: ID does not exist" Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.368780 4886 scope.go:117] "RemoveContainer" containerID="fdf6d6fc2506360836e137418cd05863b19b9a5d90deb17a8368405766a013fa" Mar 14 08:32:25 crc kubenswrapper[4886]: E0314 08:32:25.369158 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdf6d6fc2506360836e137418cd05863b19b9a5d90deb17a8368405766a013fa\": container with ID starting with 
fdf6d6fc2506360836e137418cd05863b19b9a5d90deb17a8368405766a013fa not found: ID does not exist" containerID="fdf6d6fc2506360836e137418cd05863b19b9a5d90deb17a8368405766a013fa" Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.369187 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdf6d6fc2506360836e137418cd05863b19b9a5d90deb17a8368405766a013fa"} err="failed to get container status \"fdf6d6fc2506360836e137418cd05863b19b9a5d90deb17a8368405766a013fa\": rpc error: code = NotFound desc = could not find container \"fdf6d6fc2506360836e137418cd05863b19b9a5d90deb17a8368405766a013fa\": container with ID starting with fdf6d6fc2506360836e137418cd05863b19b9a5d90deb17a8368405766a013fa not found: ID does not exist" Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.369205 4886 scope.go:117] "RemoveContainer" containerID="f09af7e849067e0420a4dd0aebcf4d887430504fc831998ee947d02ccc4964db" Mar 14 08:32:25 crc kubenswrapper[4886]: E0314 08:32:25.369540 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f09af7e849067e0420a4dd0aebcf4d887430504fc831998ee947d02ccc4964db\": container with ID starting with f09af7e849067e0420a4dd0aebcf4d887430504fc831998ee947d02ccc4964db not found: ID does not exist" containerID="f09af7e849067e0420a4dd0aebcf4d887430504fc831998ee947d02ccc4964db" Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.369567 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f09af7e849067e0420a4dd0aebcf4d887430504fc831998ee947d02ccc4964db"} err="failed to get container status \"f09af7e849067e0420a4dd0aebcf4d887430504fc831998ee947d02ccc4964db\": rpc error: code = NotFound desc = could not find container \"f09af7e849067e0420a4dd0aebcf4d887430504fc831998ee947d02ccc4964db\": container with ID starting with f09af7e849067e0420a4dd0aebcf4d887430504fc831998ee947d02ccc4964db not found: ID does not 
exist" Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.385299 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28585f97-68cd-440f-acb0-0e5bd9117023-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.385338 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzjpw\" (UniqueName: \"kubernetes.io/projected/28585f97-68cd-440f-acb0-0e5bd9117023-kube-api-access-kzjpw\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.469848 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28585f97-68cd-440f-acb0-0e5bd9117023-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28585f97-68cd-440f-acb0-0e5bd9117023" (UID: "28585f97-68cd-440f-acb0-0e5bd9117023"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.486782 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28585f97-68cd-440f-acb0-0e5bd9117023-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.585968 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4f8zx"] Mar 14 08:32:25 crc kubenswrapper[4886]: I0314 08:32:25.588549 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4f8zx"] Mar 14 08:32:26 crc kubenswrapper[4886]: I0314 08:32:26.066329 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:32:26 crc kubenswrapper[4886]: 
I0314 08:32:26.066443 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:32:26 crc kubenswrapper[4886]: I0314 08:32:26.066537 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 08:32:26 crc kubenswrapper[4886]: I0314 08:32:26.067898 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a"} pod="openshift-machine-config-operator/machine-config-daemon-ddctv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 08:32:26 crc kubenswrapper[4886]: I0314 08:32:26.068040 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" containerID="cri-o://701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a" gracePeriod=600 Mar 14 08:32:26 crc kubenswrapper[4886]: I0314 08:32:26.275675 4886 generic.go:334] "Generic (PLEG): container finished" podID="98f142ad-f9c5-41ee-81ec-632938796964" containerID="2c860203d6f40c73708c979c7452ac98020bf424da2e738ee19b4753e8e74c76" exitCode=0 Mar 14 08:32:26 crc kubenswrapper[4886]: I0314 08:32:26.275775 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fpr95" event={"ID":"98f142ad-f9c5-41ee-81ec-632938796964","Type":"ContainerDied","Data":"2c860203d6f40c73708c979c7452ac98020bf424da2e738ee19b4753e8e74c76"} Mar 14 08:32:26 crc 
kubenswrapper[4886]: I0314 08:32:26.279047 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snhqq" event={"ID":"758a60d0-6132-4b23-8062-febd479f7fff","Type":"ContainerStarted","Data":"ccafddb36e06870b9bcd541c2f790a2cb44ed4c8eff1813feb9c2944f1d388d2"} Mar 14 08:32:26 crc kubenswrapper[4886]: I0314 08:32:26.282913 4886 generic.go:334] "Generic (PLEG): container finished" podID="7c355048-396b-4f00-8ff6-1ffff1d9d62c" containerID="90f215602c3b76fcbd4f3a2960d65747d178e02efa98b3b2b17a6383e525a6cc" exitCode=0 Mar 14 08:32:26 crc kubenswrapper[4886]: I0314 08:32:26.282973 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6gfbq" event={"ID":"7c355048-396b-4f00-8ff6-1ffff1d9d62c","Type":"ContainerDied","Data":"90f215602c3b76fcbd4f3a2960d65747d178e02efa98b3b2b17a6383e525a6cc"} Mar 14 08:32:26 crc kubenswrapper[4886]: I0314 08:32:26.288796 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dk6zm" event={"ID":"333059fe-3e95-4e08-b70e-d7d95e1ed279","Type":"ContainerStarted","Data":"f7bb36dbc69b0696038358ed5757d50ba889fb73cea54fe439d610b9a75facd5"} Mar 14 08:32:26 crc kubenswrapper[4886]: I0314 08:32:26.293083 4886 generic.go:334] "Generic (PLEG): container finished" podID="837659fc-08c6-4ea6-8799-aa4297b20689" containerID="9eb5275ee25c2a3c1835d62af26298205657463dc379f1a7427e0abe99ab6bbd" exitCode=0 Mar 14 08:32:26 crc kubenswrapper[4886]: I0314 08:32:26.293244 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4gbt" event={"ID":"837659fc-08c6-4ea6-8799-aa4297b20689","Type":"ContainerDied","Data":"9eb5275ee25c2a3c1835d62af26298205657463dc379f1a7427e0abe99ab6bbd"} Mar 14 08:32:26 crc kubenswrapper[4886]: I0314 08:32:26.296178 4886 generic.go:334] "Generic (PLEG): container finished" podID="64517238-bfef-43e1-b543-1eea5b7f9c79" 
containerID="701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a" exitCode=0 Mar 14 08:32:26 crc kubenswrapper[4886]: I0314 08:32:26.296232 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerDied","Data":"701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a"} Mar 14 08:32:26 crc kubenswrapper[4886]: I0314 08:32:26.306550 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnjw4" event={"ID":"9306248c-2771-4cb6-bdd4-f8628c2b6428","Type":"ContainerStarted","Data":"de3c5d2f1258f70a67259e08fa6e67a97f78a199392821ef34e266caaa980260"} Mar 14 08:32:26 crc kubenswrapper[4886]: I0314 08:32:26.363249 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dk6zm" podStartSLOduration=4.013684343 podStartE2EDuration="59.363230299s" podCreationTimestamp="2026-03-14 08:31:27 +0000 UTC" firstStartedPulling="2026-03-14 08:31:30.425282465 +0000 UTC m=+225.673734102" lastFinishedPulling="2026-03-14 08:32:25.774828421 +0000 UTC m=+281.023280058" observedRunningTime="2026-03-14 08:32:26.358378489 +0000 UTC m=+281.606830126" watchObservedRunningTime="2026-03-14 08:32:26.363230299 +0000 UTC m=+281.611681936" Mar 14 08:32:26 crc kubenswrapper[4886]: I0314 08:32:26.379005 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-snhqq" podStartSLOduration=3.168568036 podStartE2EDuration="58.378987044s" podCreationTimestamp="2026-03-14 08:31:28 +0000 UTC" firstStartedPulling="2026-03-14 08:31:30.466665361 +0000 UTC m=+225.715116998" lastFinishedPulling="2026-03-14 08:32:25.677084369 +0000 UTC m=+280.925536006" observedRunningTime="2026-03-14 08:32:26.378930073 +0000 UTC m=+281.627381700" watchObservedRunningTime="2026-03-14 08:32:26.378987044 +0000 UTC 
m=+281.627438681" Mar 14 08:32:26 crc kubenswrapper[4886]: I0314 08:32:26.418111 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vnjw4" podStartSLOduration=2.298383715 podStartE2EDuration="56.418094223s" podCreationTimestamp="2026-03-14 08:31:30 +0000 UTC" firstStartedPulling="2026-03-14 08:31:31.51887638 +0000 UTC m=+226.767328017" lastFinishedPulling="2026-03-14 08:32:25.638586888 +0000 UTC m=+280.887038525" observedRunningTime="2026-03-14 08:32:26.414394877 +0000 UTC m=+281.662846514" watchObservedRunningTime="2026-03-14 08:32:26.418094223 +0000 UTC m=+281.666545850" Mar 14 08:32:27 crc kubenswrapper[4886]: I0314 08:32:27.252007 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-544778d49f-75x52"] Mar 14 08:32:27 crc kubenswrapper[4886]: I0314 08:32:27.252479 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-544778d49f-75x52" podUID="52cd2979-fdd2-4dea-b6a6-f76883c29e6f" containerName="controller-manager" containerID="cri-o://9d9cb0f104cce51103a42c8712cc52e696897afa4c07de3a451ba32dd34f3e7b" gracePeriod=30 Mar 14 08:32:27 crc kubenswrapper[4886]: I0314 08:32:27.277435 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv"] Mar 14 08:32:27 crc kubenswrapper[4886]: I0314 08:32:27.277931 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv" podUID="a249b150-7d91-4e4b-93eb-7348a46b7dc3" containerName="route-controller-manager" containerID="cri-o://ebe3da7d060dac1850cc14f2b6fbbd81abae53b70075427b35bd3a78ea87e6e2" gracePeriod=30 Mar 14 08:32:27 crc kubenswrapper[4886]: I0314 08:32:27.316882 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-t4gbt" event={"ID":"837659fc-08c6-4ea6-8799-aa4297b20689","Type":"ContainerStarted","Data":"1044645622c1f0641c5deb70e34f4f9ff19fc2e160e515fd3053462312f3b9e4"} Mar 14 08:32:27 crc kubenswrapper[4886]: I0314 08:32:27.320752 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerStarted","Data":"424638e056b0f6bf732f3fa83ff43260b56297a10248482736a5bafa61119a1d"} Mar 14 08:32:27 crc kubenswrapper[4886]: I0314 08:32:27.322971 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fpr95" event={"ID":"98f142ad-f9c5-41ee-81ec-632938796964","Type":"ContainerStarted","Data":"093f7029785a049be2197520ad1eefec886f54abe36d623dbac5486932c2c74b"} Mar 14 08:32:27 crc kubenswrapper[4886]: I0314 08:32:27.325950 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6gfbq" event={"ID":"7c355048-396b-4f00-8ff6-1ffff1d9d62c","Type":"ContainerStarted","Data":"0c7b8ed5b1eae85061215c343c2815e0779f4d51b5129ff92afe3aaa3552bc5c"} Mar 14 08:32:27 crc kubenswrapper[4886]: I0314 08:32:27.340915 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t4gbt" podStartSLOduration=3.097250624 podStartE2EDuration="59.340891966s" podCreationTimestamp="2026-03-14 08:31:28 +0000 UTC" firstStartedPulling="2026-03-14 08:31:30.432677047 +0000 UTC m=+225.681128684" lastFinishedPulling="2026-03-14 08:32:26.676318389 +0000 UTC m=+281.924770026" observedRunningTime="2026-03-14 08:32:27.33929814 +0000 UTC m=+282.587749777" watchObservedRunningTime="2026-03-14 08:32:27.340891966 +0000 UTC m=+282.589343603" Mar 14 08:32:27 crc kubenswrapper[4886]: I0314 08:32:27.380995 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-fpr95" podStartSLOduration=3.120781235 podStartE2EDuration="58.380978493s" podCreationTimestamp="2026-03-14 08:31:29 +0000 UTC" firstStartedPulling="2026-03-14 08:31:31.485552795 +0000 UTC m=+226.734004432" lastFinishedPulling="2026-03-14 08:32:26.745750053 +0000 UTC m=+281.994201690" observedRunningTime="2026-03-14 08:32:27.379458069 +0000 UTC m=+282.627909706" watchObservedRunningTime="2026-03-14 08:32:27.380978493 +0000 UTC m=+282.629430130" Mar 14 08:32:27 crc kubenswrapper[4886]: I0314 08:32:27.398553 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6gfbq" podStartSLOduration=3.332092194 podStartE2EDuration="56.39853165s" podCreationTimestamp="2026-03-14 08:31:31 +0000 UTC" firstStartedPulling="2026-03-14 08:31:33.677845995 +0000 UTC m=+228.926297632" lastFinishedPulling="2026-03-14 08:32:26.744285451 +0000 UTC m=+281.992737088" observedRunningTime="2026-03-14 08:32:27.395902924 +0000 UTC m=+282.644354571" watchObservedRunningTime="2026-03-14 08:32:27.39853165 +0000 UTC m=+282.646983287" Mar 14 08:32:27 crc kubenswrapper[4886]: I0314 08:32:27.435630 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28585f97-68cd-440f-acb0-0e5bd9117023" path="/var/lib/kubelet/pods/28585f97-68cd-440f-acb0-0e5bd9117023/volumes" Mar 14 08:32:27 crc kubenswrapper[4886]: I0314 08:32:27.755759 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv" Mar 14 08:32:27 crc kubenswrapper[4886]: I0314 08:32:27.919261 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a249b150-7d91-4e4b-93eb-7348a46b7dc3-config\") pod \"a249b150-7d91-4e4b-93eb-7348a46b7dc3\" (UID: \"a249b150-7d91-4e4b-93eb-7348a46b7dc3\") " Mar 14 08:32:27 crc kubenswrapper[4886]: I0314 08:32:27.919337 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a249b150-7d91-4e4b-93eb-7348a46b7dc3-client-ca\") pod \"a249b150-7d91-4e4b-93eb-7348a46b7dc3\" (UID: \"a249b150-7d91-4e4b-93eb-7348a46b7dc3\") " Mar 14 08:32:27 crc kubenswrapper[4886]: I0314 08:32:27.919390 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdfrj\" (UniqueName: \"kubernetes.io/projected/a249b150-7d91-4e4b-93eb-7348a46b7dc3-kube-api-access-pdfrj\") pod \"a249b150-7d91-4e4b-93eb-7348a46b7dc3\" (UID: \"a249b150-7d91-4e4b-93eb-7348a46b7dc3\") " Mar 14 08:32:27 crc kubenswrapper[4886]: I0314 08:32:27.919415 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a249b150-7d91-4e4b-93eb-7348a46b7dc3-serving-cert\") pod \"a249b150-7d91-4e4b-93eb-7348a46b7dc3\" (UID: \"a249b150-7d91-4e4b-93eb-7348a46b7dc3\") " Mar 14 08:32:27 crc kubenswrapper[4886]: I0314 08:32:27.920084 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a249b150-7d91-4e4b-93eb-7348a46b7dc3-config" (OuterVolumeSpecName: "config") pod "a249b150-7d91-4e4b-93eb-7348a46b7dc3" (UID: "a249b150-7d91-4e4b-93eb-7348a46b7dc3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:32:27 crc kubenswrapper[4886]: I0314 08:32:27.920099 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a249b150-7d91-4e4b-93eb-7348a46b7dc3-client-ca" (OuterVolumeSpecName: "client-ca") pod "a249b150-7d91-4e4b-93eb-7348a46b7dc3" (UID: "a249b150-7d91-4e4b-93eb-7348a46b7dc3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:32:27 crc kubenswrapper[4886]: I0314 08:32:27.929253 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a249b150-7d91-4e4b-93eb-7348a46b7dc3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a249b150-7d91-4e4b-93eb-7348a46b7dc3" (UID: "a249b150-7d91-4e4b-93eb-7348a46b7dc3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:32:27 crc kubenswrapper[4886]: I0314 08:32:27.929384 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a249b150-7d91-4e4b-93eb-7348a46b7dc3-kube-api-access-pdfrj" (OuterVolumeSpecName: "kube-api-access-pdfrj") pod "a249b150-7d91-4e4b-93eb-7348a46b7dc3" (UID: "a249b150-7d91-4e4b-93eb-7348a46b7dc3"). InnerVolumeSpecName "kube-api-access-pdfrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:32:27 crc kubenswrapper[4886]: I0314 08:32:27.950206 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-544778d49f-75x52" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.028640 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvmrf\" (UniqueName: \"kubernetes.io/projected/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-kube-api-access-vvmrf\") pod \"52cd2979-fdd2-4dea-b6a6-f76883c29e6f\" (UID: \"52cd2979-fdd2-4dea-b6a6-f76883c29e6f\") " Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.028689 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-config\") pod \"52cd2979-fdd2-4dea-b6a6-f76883c29e6f\" (UID: \"52cd2979-fdd2-4dea-b6a6-f76883c29e6f\") " Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.028744 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-client-ca\") pod \"52cd2979-fdd2-4dea-b6a6-f76883c29e6f\" (UID: \"52cd2979-fdd2-4dea-b6a6-f76883c29e6f\") " Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.029641 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-client-ca" (OuterVolumeSpecName: "client-ca") pod "52cd2979-fdd2-4dea-b6a6-f76883c29e6f" (UID: "52cd2979-fdd2-4dea-b6a6-f76883c29e6f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.029696 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-config" (OuterVolumeSpecName: "config") pod "52cd2979-fdd2-4dea-b6a6-f76883c29e6f" (UID: "52cd2979-fdd2-4dea-b6a6-f76883c29e6f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.029765 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-serving-cert\") pod \"52cd2979-fdd2-4dea-b6a6-f76883c29e6f\" (UID: \"52cd2979-fdd2-4dea-b6a6-f76883c29e6f\") " Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.030090 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-proxy-ca-bundles\") pod \"52cd2979-fdd2-4dea-b6a6-f76883c29e6f\" (UID: \"52cd2979-fdd2-4dea-b6a6-f76883c29e6f\") " Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.030474 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "52cd2979-fdd2-4dea-b6a6-f76883c29e6f" (UID: "52cd2979-fdd2-4dea-b6a6-f76883c29e6f"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.030727 4886 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a249b150-7d91-4e4b-93eb-7348a46b7dc3-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.030750 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdfrj\" (UniqueName: \"kubernetes.io/projected/a249b150-7d91-4e4b-93eb-7348a46b7dc3-kube-api-access-pdfrj\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.030763 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a249b150-7d91-4e4b-93eb-7348a46b7dc3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.030775 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.030785 4886 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.030793 4886 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.030804 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a249b150-7d91-4e4b-93eb-7348a46b7dc3-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.033317 4886 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-kube-api-access-vvmrf" (OuterVolumeSpecName: "kube-api-access-vvmrf") pod "52cd2979-fdd2-4dea-b6a6-f76883c29e6f" (UID: "52cd2979-fdd2-4dea-b6a6-f76883c29e6f"). InnerVolumeSpecName "kube-api-access-vvmrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.033333 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "52cd2979-fdd2-4dea-b6a6-f76883c29e6f" (UID: "52cd2979-fdd2-4dea-b6a6-f76883c29e6f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.132306 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvmrf\" (UniqueName: \"kubernetes.io/projected/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-kube-api-access-vvmrf\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.132351 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52cd2979-fdd2-4dea-b6a6-f76883c29e6f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.237500 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dk6zm" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.241303 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dk6zm" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.355883 4886 generic.go:334] "Generic (PLEG): container finished" podID="a249b150-7d91-4e4b-93eb-7348a46b7dc3" containerID="ebe3da7d060dac1850cc14f2b6fbbd81abae53b70075427b35bd3a78ea87e6e2" exitCode=0 Mar 14 08:32:28 crc 
kubenswrapper[4886]: I0314 08:32:28.355949 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv" event={"ID":"a249b150-7d91-4e4b-93eb-7348a46b7dc3","Type":"ContainerDied","Data":"ebe3da7d060dac1850cc14f2b6fbbd81abae53b70075427b35bd3a78ea87e6e2"} Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.355978 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv" event={"ID":"a249b150-7d91-4e4b-93eb-7348a46b7dc3","Type":"ContainerDied","Data":"7ab0f1d8a418e8ff8b03f0e0c5273a722168aeac22b9bfa51196fa52826a4331"} Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.355997 4886 scope.go:117] "RemoveContainer" containerID="ebe3da7d060dac1850cc14f2b6fbbd81abae53b70075427b35bd3a78ea87e6e2" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.356104 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.359317 4886 generic.go:334] "Generic (PLEG): container finished" podID="52cd2979-fdd2-4dea-b6a6-f76883c29e6f" containerID="9d9cb0f104cce51103a42c8712cc52e696897afa4c07de3a451ba32dd34f3e7b" exitCode=0 Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.359914 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-544778d49f-75x52" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.359978 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-544778d49f-75x52" event={"ID":"52cd2979-fdd2-4dea-b6a6-f76883c29e6f","Type":"ContainerDied","Data":"9d9cb0f104cce51103a42c8712cc52e696897afa4c07de3a451ba32dd34f3e7b"} Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.360014 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-544778d49f-75x52" event={"ID":"52cd2979-fdd2-4dea-b6a6-f76883c29e6f","Type":"ContainerDied","Data":"e432344a62b2df737ae68cecf246d8dea34103991930e35fc985d11500b79323"} Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.377059 4886 scope.go:117] "RemoveContainer" containerID="ebe3da7d060dac1850cc14f2b6fbbd81abae53b70075427b35bd3a78ea87e6e2" Mar 14 08:32:28 crc kubenswrapper[4886]: E0314 08:32:28.377701 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebe3da7d060dac1850cc14f2b6fbbd81abae53b70075427b35bd3a78ea87e6e2\": container with ID starting with ebe3da7d060dac1850cc14f2b6fbbd81abae53b70075427b35bd3a78ea87e6e2 not found: ID does not exist" containerID="ebe3da7d060dac1850cc14f2b6fbbd81abae53b70075427b35bd3a78ea87e6e2" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.377749 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebe3da7d060dac1850cc14f2b6fbbd81abae53b70075427b35bd3a78ea87e6e2"} err="failed to get container status \"ebe3da7d060dac1850cc14f2b6fbbd81abae53b70075427b35bd3a78ea87e6e2\": rpc error: code = NotFound desc = could not find container \"ebe3da7d060dac1850cc14f2b6fbbd81abae53b70075427b35bd3a78ea87e6e2\": container with ID starting with ebe3da7d060dac1850cc14f2b6fbbd81abae53b70075427b35bd3a78ea87e6e2 not found: ID does 
not exist" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.377777 4886 scope.go:117] "RemoveContainer" containerID="9d9cb0f104cce51103a42c8712cc52e696897afa4c07de3a451ba32dd34f3e7b" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.390401 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv"] Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.391561 4886 scope.go:117] "RemoveContainer" containerID="9d9cb0f104cce51103a42c8712cc52e696897afa4c07de3a451ba32dd34f3e7b" Mar 14 08:32:28 crc kubenswrapper[4886]: E0314 08:32:28.392065 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d9cb0f104cce51103a42c8712cc52e696897afa4c07de3a451ba32dd34f3e7b\": container with ID starting with 9d9cb0f104cce51103a42c8712cc52e696897afa4c07de3a451ba32dd34f3e7b not found: ID does not exist" containerID="9d9cb0f104cce51103a42c8712cc52e696897afa4c07de3a451ba32dd34f3e7b" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.392101 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d9cb0f104cce51103a42c8712cc52e696897afa4c07de3a451ba32dd34f3e7b"} err="failed to get container status \"9d9cb0f104cce51103a42c8712cc52e696897afa4c07de3a451ba32dd34f3e7b\": rpc error: code = NotFound desc = could not find container \"9d9cb0f104cce51103a42c8712cc52e696897afa4c07de3a451ba32dd34f3e7b\": container with ID starting with 9d9cb0f104cce51103a42c8712cc52e696897afa4c07de3a451ba32dd34f3e7b not found: ID does not exist" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.393238 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7969dd5697-h7bdv"] Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.403496 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-544778d49f-75x52"] Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.405661 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-544778d49f-75x52"] Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.762590 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6dcd48867-cthq8"] Mar 14 08:32:28 crc kubenswrapper[4886]: E0314 08:32:28.762918 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a249b150-7d91-4e4b-93eb-7348a46b7dc3" containerName="route-controller-manager" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.762930 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a249b150-7d91-4e4b-93eb-7348a46b7dc3" containerName="route-controller-manager" Mar 14 08:32:28 crc kubenswrapper[4886]: E0314 08:32:28.762948 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28585f97-68cd-440f-acb0-0e5bd9117023" containerName="registry-server" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.762955 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="28585f97-68cd-440f-acb0-0e5bd9117023" containerName="registry-server" Mar 14 08:32:28 crc kubenswrapper[4886]: E0314 08:32:28.762970 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28585f97-68cd-440f-acb0-0e5bd9117023" containerName="extract-content" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.762977 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="28585f97-68cd-440f-acb0-0e5bd9117023" containerName="extract-content" Mar 14 08:32:28 crc kubenswrapper[4886]: E0314 08:32:28.762988 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="266a4a60-8fb5-4685-b4ac-621f93829611" containerName="oc" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.762994 4886 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="266a4a60-8fb5-4685-b4ac-621f93829611" containerName="oc" Mar 14 08:32:28 crc kubenswrapper[4886]: E0314 08:32:28.763007 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="551df435-0815-4f88-966b-3e60ef48aaa4" containerName="pruner" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.763014 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="551df435-0815-4f88-966b-3e60ef48aaa4" containerName="pruner" Mar 14 08:32:28 crc kubenswrapper[4886]: E0314 08:32:28.763027 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28585f97-68cd-440f-acb0-0e5bd9117023" containerName="extract-utilities" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.763034 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="28585f97-68cd-440f-acb0-0e5bd9117023" containerName="extract-utilities" Mar 14 08:32:28 crc kubenswrapper[4886]: E0314 08:32:28.763044 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52cd2979-fdd2-4dea-b6a6-f76883c29e6f" containerName="controller-manager" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.763051 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="52cd2979-fdd2-4dea-b6a6-f76883c29e6f" containerName="controller-manager" Mar 14 08:32:28 crc kubenswrapper[4886]: E0314 08:32:28.763061 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2d7e1cf-cd9a-4bf9-8f6b-1fccc137d1f9" containerName="oc" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.763069 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2d7e1cf-cd9a-4bf9-8f6b-1fccc137d1f9" containerName="oc" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.763195 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="28585f97-68cd-440f-acb0-0e5bd9117023" containerName="registry-server" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.763213 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="52cd2979-fdd2-4dea-b6a6-f76883c29e6f" 
containerName="controller-manager" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.763221 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2d7e1cf-cd9a-4bf9-8f6b-1fccc137d1f9" containerName="oc" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.763234 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="551df435-0815-4f88-966b-3e60ef48aaa4" containerName="pruner" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.763245 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="a249b150-7d91-4e4b-93eb-7348a46b7dc3" containerName="route-controller-manager" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.763255 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="266a4a60-8fb5-4685-b4ac-621f93829611" containerName="oc" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.763793 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dcd48867-cthq8" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.766948 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.768285 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.768334 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.768333 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.771315 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 08:32:28 crc 
kubenswrapper[4886]: I0314 08:32:28.771621 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.772745 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48"] Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.773949 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.776567 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.776905 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.777100 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.777190 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.777316 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.779044 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6dcd48867-cthq8"] Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.780356 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.782743 4886 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48"] Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.783213 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.790187 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-snhqq" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.791377 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-snhqq" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.823471 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t4gbt" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.823511 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t4gbt" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.839982 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07c02e5d-5af8-43a0-8485-d2c5f2344525-serving-cert\") pod \"route-controller-manager-74d86c6ff7-kbd48\" (UID: \"07c02e5d-5af8-43a0-8485-d2c5f2344525\") " pod="openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48" Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.840038 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhflf\" (UniqueName: \"kubernetes.io/projected/07c02e5d-5af8-43a0-8485-d2c5f2344525-kube-api-access-nhflf\") pod \"route-controller-manager-74d86c6ff7-kbd48\" (UID: \"07c02e5d-5af8-43a0-8485-d2c5f2344525\") " pod="openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48" Mar 14 
08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.840070 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d13d372b-e10b-4bf8-bf54-99836e06cc85-client-ca\") pod \"controller-manager-6dcd48867-cthq8\" (UID: \"d13d372b-e10b-4bf8-bf54-99836e06cc85\") " pod="openshift-controller-manager/controller-manager-6dcd48867-cthq8"
Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.840091 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07c02e5d-5af8-43a0-8485-d2c5f2344525-config\") pod \"route-controller-manager-74d86c6ff7-kbd48\" (UID: \"07c02e5d-5af8-43a0-8485-d2c5f2344525\") " pod="openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48"
Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.840160 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d13d372b-e10b-4bf8-bf54-99836e06cc85-config\") pod \"controller-manager-6dcd48867-cthq8\" (UID: \"d13d372b-e10b-4bf8-bf54-99836e06cc85\") " pod="openshift-controller-manager/controller-manager-6dcd48867-cthq8"
Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.840193 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmg2r\" (UniqueName: \"kubernetes.io/projected/d13d372b-e10b-4bf8-bf54-99836e06cc85-kube-api-access-bmg2r\") pod \"controller-manager-6dcd48867-cthq8\" (UID: \"d13d372b-e10b-4bf8-bf54-99836e06cc85\") " pod="openshift-controller-manager/controller-manager-6dcd48867-cthq8"
Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.840209 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d13d372b-e10b-4bf8-bf54-99836e06cc85-proxy-ca-bundles\") pod \"controller-manager-6dcd48867-cthq8\" (UID: \"d13d372b-e10b-4bf8-bf54-99836e06cc85\") " pod="openshift-controller-manager/controller-manager-6dcd48867-cthq8"
Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.840231 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07c02e5d-5af8-43a0-8485-d2c5f2344525-client-ca\") pod \"route-controller-manager-74d86c6ff7-kbd48\" (UID: \"07c02e5d-5af8-43a0-8485-d2c5f2344525\") " pod="openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48"
Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.840262 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d13d372b-e10b-4bf8-bf54-99836e06cc85-serving-cert\") pod \"controller-manager-6dcd48867-cthq8\" (UID: \"d13d372b-e10b-4bf8-bf54-99836e06cc85\") " pod="openshift-controller-manager/controller-manager-6dcd48867-cthq8"
Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.842239 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-snhqq"
Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.941597 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d13d372b-e10b-4bf8-bf54-99836e06cc85-serving-cert\") pod \"controller-manager-6dcd48867-cthq8\" (UID: \"d13d372b-e10b-4bf8-bf54-99836e06cc85\") " pod="openshift-controller-manager/controller-manager-6dcd48867-cthq8"
Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.941657 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07c02e5d-5af8-43a0-8485-d2c5f2344525-serving-cert\") pod \"route-controller-manager-74d86c6ff7-kbd48\" (UID: \"07c02e5d-5af8-43a0-8485-d2c5f2344525\") " pod="openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48"
Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.941693 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhflf\" (UniqueName: \"kubernetes.io/projected/07c02e5d-5af8-43a0-8485-d2c5f2344525-kube-api-access-nhflf\") pod \"route-controller-manager-74d86c6ff7-kbd48\" (UID: \"07c02e5d-5af8-43a0-8485-d2c5f2344525\") " pod="openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48"
Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.941723 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d13d372b-e10b-4bf8-bf54-99836e06cc85-client-ca\") pod \"controller-manager-6dcd48867-cthq8\" (UID: \"d13d372b-e10b-4bf8-bf54-99836e06cc85\") " pod="openshift-controller-manager/controller-manager-6dcd48867-cthq8"
Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.941745 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07c02e5d-5af8-43a0-8485-d2c5f2344525-config\") pod \"route-controller-manager-74d86c6ff7-kbd48\" (UID: \"07c02e5d-5af8-43a0-8485-d2c5f2344525\") " pod="openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48"
Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.941786 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d13d372b-e10b-4bf8-bf54-99836e06cc85-config\") pod \"controller-manager-6dcd48867-cthq8\" (UID: \"d13d372b-e10b-4bf8-bf54-99836e06cc85\") " pod="openshift-controller-manager/controller-manager-6dcd48867-cthq8"
Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.941813 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmg2r\" (UniqueName: \"kubernetes.io/projected/d13d372b-e10b-4bf8-bf54-99836e06cc85-kube-api-access-bmg2r\") pod \"controller-manager-6dcd48867-cthq8\" (UID: \"d13d372b-e10b-4bf8-bf54-99836e06cc85\") " pod="openshift-controller-manager/controller-manager-6dcd48867-cthq8"
Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.941827 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d13d372b-e10b-4bf8-bf54-99836e06cc85-proxy-ca-bundles\") pod \"controller-manager-6dcd48867-cthq8\" (UID: \"d13d372b-e10b-4bf8-bf54-99836e06cc85\") " pod="openshift-controller-manager/controller-manager-6dcd48867-cthq8"
Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.941850 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07c02e5d-5af8-43a0-8485-d2c5f2344525-client-ca\") pod \"route-controller-manager-74d86c6ff7-kbd48\" (UID: \"07c02e5d-5af8-43a0-8485-d2c5f2344525\") " pod="openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48"
Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.943242 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07c02e5d-5af8-43a0-8485-d2c5f2344525-config\") pod \"route-controller-manager-74d86c6ff7-kbd48\" (UID: \"07c02e5d-5af8-43a0-8485-d2c5f2344525\") " pod="openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48"
Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.943251 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d13d372b-e10b-4bf8-bf54-99836e06cc85-client-ca\") pod \"controller-manager-6dcd48867-cthq8\" (UID: \"d13d372b-e10b-4bf8-bf54-99836e06cc85\") " pod="openshift-controller-manager/controller-manager-6dcd48867-cthq8"
Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.943516 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d13d372b-e10b-4bf8-bf54-99836e06cc85-config\") pod \"controller-manager-6dcd48867-cthq8\" (UID: \"d13d372b-e10b-4bf8-bf54-99836e06cc85\") " pod="openshift-controller-manager/controller-manager-6dcd48867-cthq8"
Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.943811 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d13d372b-e10b-4bf8-bf54-99836e06cc85-proxy-ca-bundles\") pod \"controller-manager-6dcd48867-cthq8\" (UID: \"d13d372b-e10b-4bf8-bf54-99836e06cc85\") " pod="openshift-controller-manager/controller-manager-6dcd48867-cthq8"
Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.944065 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07c02e5d-5af8-43a0-8485-d2c5f2344525-client-ca\") pod \"route-controller-manager-74d86c6ff7-kbd48\" (UID: \"07c02e5d-5af8-43a0-8485-d2c5f2344525\") " pod="openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48"
Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.947003 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07c02e5d-5af8-43a0-8485-d2c5f2344525-serving-cert\") pod \"route-controller-manager-74d86c6ff7-kbd48\" (UID: \"07c02e5d-5af8-43a0-8485-d2c5f2344525\") " pod="openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48"
Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.956901 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d13d372b-e10b-4bf8-bf54-99836e06cc85-serving-cert\") pod \"controller-manager-6dcd48867-cthq8\" (UID: \"d13d372b-e10b-4bf8-bf54-99836e06cc85\") " pod="openshift-controller-manager/controller-manager-6dcd48867-cthq8"
Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.961488 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmg2r\" (UniqueName: \"kubernetes.io/projected/d13d372b-e10b-4bf8-bf54-99836e06cc85-kube-api-access-bmg2r\") pod \"controller-manager-6dcd48867-cthq8\" (UID: \"d13d372b-e10b-4bf8-bf54-99836e06cc85\") " pod="openshift-controller-manager/controller-manager-6dcd48867-cthq8"
Mar 14 08:32:28 crc kubenswrapper[4886]: I0314 08:32:28.964812 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhflf\" (UniqueName: \"kubernetes.io/projected/07c02e5d-5af8-43a0-8485-d2c5f2344525-kube-api-access-nhflf\") pod \"route-controller-manager-74d86c6ff7-kbd48\" (UID: \"07c02e5d-5af8-43a0-8485-d2c5f2344525\") " pod="openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48"
Mar 14 08:32:29 crc kubenswrapper[4886]: I0314 08:32:29.081426 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dcd48867-cthq8"
Mar 14 08:32:29 crc kubenswrapper[4886]: I0314 08:32:29.095975 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48"
Mar 14 08:32:29 crc kubenswrapper[4886]: I0314 08:32:29.293770 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-dk6zm" podUID="333059fe-3e95-4e08-b70e-d7d95e1ed279" containerName="registry-server" probeResult="failure" output=<
Mar 14 08:32:29 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s
Mar 14 08:32:29 crc kubenswrapper[4886]: >
Mar 14 08:32:29 crc kubenswrapper[4886]: I0314 08:32:29.431307 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52cd2979-fdd2-4dea-b6a6-f76883c29e6f" path="/var/lib/kubelet/pods/52cd2979-fdd2-4dea-b6a6-f76883c29e6f/volumes"
Mar 14 08:32:29 crc kubenswrapper[4886]: I0314 08:32:29.432011 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a249b150-7d91-4e4b-93eb-7348a46b7dc3" path="/var/lib/kubelet/pods/a249b150-7d91-4e4b-93eb-7348a46b7dc3/volumes"
Mar 14 08:32:29 crc kubenswrapper[4886]: I0314 08:32:29.546513 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6dcd48867-cthq8"]
Mar 14 08:32:29 crc kubenswrapper[4886]: W0314 08:32:29.552838 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd13d372b_e10b_4bf8_bf54_99836e06cc85.slice/crio-e65f423afb73803c548464bf241167224d538e2943a90d8800230842bc7e3f4a WatchSource:0}: Error finding container e65f423afb73803c548464bf241167224d538e2943a90d8800230842bc7e3f4a: Status 404 returned error can't find the container with id e65f423afb73803c548464bf241167224d538e2943a90d8800230842bc7e3f4a
Mar 14 08:32:29 crc kubenswrapper[4886]: I0314 08:32:29.591649 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48"]
Mar 14 08:32:29 crc kubenswrapper[4886]: I0314 08:32:29.864628 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-t4gbt" podUID="837659fc-08c6-4ea6-8799-aa4297b20689" containerName="registry-server" probeResult="failure" output=<
Mar 14 08:32:29 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s
Mar 14 08:32:29 crc kubenswrapper[4886]: >
Mar 14 08:32:30 crc kubenswrapper[4886]: I0314 08:32:30.266245 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fpr95"
Mar 14 08:32:30 crc kubenswrapper[4886]: I0314 08:32:30.266959 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fpr95"
Mar 14 08:32:30 crc kubenswrapper[4886]: I0314 08:32:30.319337 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fpr95"
Mar 14 08:32:30 crc kubenswrapper[4886]: I0314 08:32:30.377646 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dcd48867-cthq8" event={"ID":"d13d372b-e10b-4bf8-bf54-99836e06cc85","Type":"ContainerStarted","Data":"5a7c246c7637779b5b7898a6da26f302a6fe3f8d8b1f8db28dda011178fe9a4e"}
Mar 14 08:32:30 crc kubenswrapper[4886]: I0314 08:32:30.377725 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dcd48867-cthq8" event={"ID":"d13d372b-e10b-4bf8-bf54-99836e06cc85","Type":"ContainerStarted","Data":"e65f423afb73803c548464bf241167224d538e2943a90d8800230842bc7e3f4a"}
Mar 14 08:32:30 crc kubenswrapper[4886]: I0314 08:32:30.377754 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6dcd48867-cthq8"
Mar 14 08:32:30 crc kubenswrapper[4886]: I0314 08:32:30.381773 4886 patch_prober.go:28] interesting pod/controller-manager-6dcd48867-cthq8 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" start-of-body=
Mar 14 08:32:30 crc kubenswrapper[4886]: I0314 08:32:30.381844 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6dcd48867-cthq8" podUID="d13d372b-e10b-4bf8-bf54-99836e06cc85" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused"
Mar 14 08:32:30 crc kubenswrapper[4886]: I0314 08:32:30.382752 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48" event={"ID":"07c02e5d-5af8-43a0-8485-d2c5f2344525","Type":"ContainerStarted","Data":"86bb3a8ef83443128029b16b69816c6f0070793d097f44e956ef92b1230e88f4"}
Mar 14 08:32:30 crc kubenswrapper[4886]: I0314 08:32:30.382846 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48" event={"ID":"07c02e5d-5af8-43a0-8485-d2c5f2344525","Type":"ContainerStarted","Data":"c523b126af487cfb6d6edd32f887b2f54282710d37930b111a88cfe257158bcb"}
Mar 14 08:32:30 crc kubenswrapper[4886]: I0314 08:32:30.383906 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48"
Mar 14 08:32:30 crc kubenswrapper[4886]: I0314 08:32:30.400660 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6dcd48867-cthq8" podStartSLOduration=3.400638454 podStartE2EDuration="3.400638454s" podCreationTimestamp="2026-03-14 08:32:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:32:30.399847391 +0000 UTC m=+285.648299028" watchObservedRunningTime="2026-03-14 08:32:30.400638454 +0000 UTC m=+285.649090091"
Mar 14 08:32:30 crc kubenswrapper[4886]: I0314 08:32:30.426366 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48" podStartSLOduration=3.426344716 podStartE2EDuration="3.426344716s" podCreationTimestamp="2026-03-14 08:32:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:32:30.424821443 +0000 UTC m=+285.673273080" watchObservedRunningTime="2026-03-14 08:32:30.426344716 +0000 UTC m=+285.674796353"
Mar 14 08:32:30 crc kubenswrapper[4886]: I0314 08:32:30.439015 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-snhqq"
Mar 14 08:32:30 crc kubenswrapper[4886]: I0314 08:32:30.559194 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vnjw4"
Mar 14 08:32:30 crc kubenswrapper[4886]: I0314 08:32:30.559285 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vnjw4"
Mar 14 08:32:30 crc kubenswrapper[4886]: I0314 08:32:30.600436 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vnjw4"
Mar 14 08:32:30 crc kubenswrapper[4886]: I0314 08:32:30.818320 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48"
Mar 14 08:32:30 crc kubenswrapper[4886]: I0314 08:32:30.917843 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-snhqq"]
Mar 14 08:32:31 crc kubenswrapper[4886]: I0314 08:32:31.399896 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6dcd48867-cthq8"
Mar 14 08:32:31 crc kubenswrapper[4886]: I0314 08:32:31.465603 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fpr95"
Mar 14 08:32:31 crc kubenswrapper[4886]: I0314 08:32:31.469039 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vnjw4"
Mar 14 08:32:31 crc kubenswrapper[4886]: I0314 08:32:31.561803 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6gfbq"
Mar 14 08:32:31 crc kubenswrapper[4886]: I0314 08:32:31.561876 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6gfbq"
Mar 14 08:32:32 crc kubenswrapper[4886]: I0314 08:32:32.396448 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-snhqq" podUID="758a60d0-6132-4b23-8062-febd479f7fff" containerName="registry-server" containerID="cri-o://ccafddb36e06870b9bcd541c2f790a2cb44ed4c8eff1813feb9c2944f1d388d2" gracePeriod=2
Mar 14 08:32:32 crc kubenswrapper[4886]: I0314 08:32:32.600266 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6gfbq" podUID="7c355048-396b-4f00-8ff6-1ffff1d9d62c" containerName="registry-server" probeResult="failure" output=<
Mar 14 08:32:32 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s
Mar 14 08:32:32 crc kubenswrapper[4886]: >
Mar 14 08:32:32 crc kubenswrapper[4886]: I0314 08:32:32.877371 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-snhqq"
Mar 14 08:32:33 crc kubenswrapper[4886]: I0314 08:32:33.010479 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/758a60d0-6132-4b23-8062-febd479f7fff-utilities\") pod \"758a60d0-6132-4b23-8062-febd479f7fff\" (UID: \"758a60d0-6132-4b23-8062-febd479f7fff\") "
Mar 14 08:32:33 crc kubenswrapper[4886]: I0314 08:32:33.010641 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/758a60d0-6132-4b23-8062-febd479f7fff-catalog-content\") pod \"758a60d0-6132-4b23-8062-febd479f7fff\" (UID: \"758a60d0-6132-4b23-8062-febd479f7fff\") "
Mar 14 08:32:33 crc kubenswrapper[4886]: I0314 08:32:33.010690 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gspdc\" (UniqueName: \"kubernetes.io/projected/758a60d0-6132-4b23-8062-febd479f7fff-kube-api-access-gspdc\") pod \"758a60d0-6132-4b23-8062-febd479f7fff\" (UID: \"758a60d0-6132-4b23-8062-febd479f7fff\") "
Mar 14 08:32:33 crc kubenswrapper[4886]: I0314 08:32:33.011370 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/758a60d0-6132-4b23-8062-febd479f7fff-utilities" (OuterVolumeSpecName: "utilities") pod "758a60d0-6132-4b23-8062-febd479f7fff" (UID: "758a60d0-6132-4b23-8062-febd479f7fff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 08:32:33 crc kubenswrapper[4886]: I0314 08:32:33.025342 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/758a60d0-6132-4b23-8062-febd479f7fff-kube-api-access-gspdc" (OuterVolumeSpecName: "kube-api-access-gspdc") pod "758a60d0-6132-4b23-8062-febd479f7fff" (UID: "758a60d0-6132-4b23-8062-febd479f7fff"). InnerVolumeSpecName "kube-api-access-gspdc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:32:33 crc kubenswrapper[4886]: I0314 08:32:33.069176 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/758a60d0-6132-4b23-8062-febd479f7fff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "758a60d0-6132-4b23-8062-febd479f7fff" (UID: "758a60d0-6132-4b23-8062-febd479f7fff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 08:32:33 crc kubenswrapper[4886]: I0314 08:32:33.112172 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/758a60d0-6132-4b23-8062-febd479f7fff-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 08:32:33 crc kubenswrapper[4886]: I0314 08:32:33.112214 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gspdc\" (UniqueName: \"kubernetes.io/projected/758a60d0-6132-4b23-8062-febd479f7fff-kube-api-access-gspdc\") on node \"crc\" DevicePath \"\""
Mar 14 08:32:33 crc kubenswrapper[4886]: I0314 08:32:33.112227 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/758a60d0-6132-4b23-8062-febd479f7fff-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 08:32:33 crc kubenswrapper[4886]: I0314 08:32:33.402830 4886 generic.go:334] "Generic (PLEG): container finished" podID="758a60d0-6132-4b23-8062-febd479f7fff" containerID="ccafddb36e06870b9bcd541c2f790a2cb44ed4c8eff1813feb9c2944f1d388d2" exitCode=0
Mar 14 08:32:33 crc kubenswrapper[4886]: I0314 08:32:33.402875 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snhqq" event={"ID":"758a60d0-6132-4b23-8062-febd479f7fff","Type":"ContainerDied","Data":"ccafddb36e06870b9bcd541c2f790a2cb44ed4c8eff1813feb9c2944f1d388d2"}
Mar 14 08:32:33 crc kubenswrapper[4886]: I0314 08:32:33.402894 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-snhqq"
Mar 14 08:32:33 crc kubenswrapper[4886]: I0314 08:32:33.402911 4886 scope.go:117] "RemoveContainer" containerID="ccafddb36e06870b9bcd541c2f790a2cb44ed4c8eff1813feb9c2944f1d388d2"
Mar 14 08:32:33 crc kubenswrapper[4886]: I0314 08:32:33.402901 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snhqq" event={"ID":"758a60d0-6132-4b23-8062-febd479f7fff","Type":"ContainerDied","Data":"3e3ff19467f1b986bd0bbd85b7d6035a94ec405b102d536647c6bbc6170aa850"}
Mar 14 08:32:33 crc kubenswrapper[4886]: I0314 08:32:33.424637 4886 scope.go:117] "RemoveContainer" containerID="2ac63fef9404de6a6649105e4f6caacbcc8ce1c2baa97da604b0226dd7112470"
Mar 14 08:32:33 crc kubenswrapper[4886]: I0314 08:32:33.441247 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-snhqq"]
Mar 14 08:32:33 crc kubenswrapper[4886]: I0314 08:32:33.441761 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-snhqq"]
Mar 14 08:32:33 crc kubenswrapper[4886]: I0314 08:32:33.466496 4886 scope.go:117] "RemoveContainer" containerID="941a56894f8416dbe65667526b6a8c49a9e984f91e0be3c18790f4ab1e9de2fe"
Mar 14 08:32:33 crc kubenswrapper[4886]: I0314 08:32:33.484908 4886 scope.go:117] "RemoveContainer" containerID="ccafddb36e06870b9bcd541c2f790a2cb44ed4c8eff1813feb9c2944f1d388d2"
Mar 14 08:32:33 crc kubenswrapper[4886]: E0314 08:32:33.485288 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccafddb36e06870b9bcd541c2f790a2cb44ed4c8eff1813feb9c2944f1d388d2\": container with ID starting with ccafddb36e06870b9bcd541c2f790a2cb44ed4c8eff1813feb9c2944f1d388d2 not found: ID does not exist" containerID="ccafddb36e06870b9bcd541c2f790a2cb44ed4c8eff1813feb9c2944f1d388d2"
Mar 14 08:32:33 crc kubenswrapper[4886]: I0314 08:32:33.485326 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccafddb36e06870b9bcd541c2f790a2cb44ed4c8eff1813feb9c2944f1d388d2"} err="failed to get container status \"ccafddb36e06870b9bcd541c2f790a2cb44ed4c8eff1813feb9c2944f1d388d2\": rpc error: code = NotFound desc = could not find container \"ccafddb36e06870b9bcd541c2f790a2cb44ed4c8eff1813feb9c2944f1d388d2\": container with ID starting with ccafddb36e06870b9bcd541c2f790a2cb44ed4c8eff1813feb9c2944f1d388d2 not found: ID does not exist"
Mar 14 08:32:33 crc kubenswrapper[4886]: I0314 08:32:33.485349 4886 scope.go:117] "RemoveContainer" containerID="2ac63fef9404de6a6649105e4f6caacbcc8ce1c2baa97da604b0226dd7112470"
Mar 14 08:32:33 crc kubenswrapper[4886]: E0314 08:32:33.485708 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ac63fef9404de6a6649105e4f6caacbcc8ce1c2baa97da604b0226dd7112470\": container with ID starting with 2ac63fef9404de6a6649105e4f6caacbcc8ce1c2baa97da604b0226dd7112470 not found: ID does not exist" containerID="2ac63fef9404de6a6649105e4f6caacbcc8ce1c2baa97da604b0226dd7112470"
Mar 14 08:32:33 crc kubenswrapper[4886]: I0314 08:32:33.485739 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ac63fef9404de6a6649105e4f6caacbcc8ce1c2baa97da604b0226dd7112470"} err="failed to get container status \"2ac63fef9404de6a6649105e4f6caacbcc8ce1c2baa97da604b0226dd7112470\": rpc error: code = NotFound desc = could not find container \"2ac63fef9404de6a6649105e4f6caacbcc8ce1c2baa97da604b0226dd7112470\": container with ID starting with 2ac63fef9404de6a6649105e4f6caacbcc8ce1c2baa97da604b0226dd7112470 not found: ID does not exist"
Mar 14 08:32:33 crc kubenswrapper[4886]: I0314 08:32:33.485756 4886 scope.go:117] "RemoveContainer" containerID="941a56894f8416dbe65667526b6a8c49a9e984f91e0be3c18790f4ab1e9de2fe"
Mar 14 08:32:33 crc kubenswrapper[4886]: E0314 08:32:33.485982 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"941a56894f8416dbe65667526b6a8c49a9e984f91e0be3c18790f4ab1e9de2fe\": container with ID starting with 941a56894f8416dbe65667526b6a8c49a9e984f91e0be3c18790f4ab1e9de2fe not found: ID does not exist" containerID="941a56894f8416dbe65667526b6a8c49a9e984f91e0be3c18790f4ab1e9de2fe"
Mar 14 08:32:33 crc kubenswrapper[4886]: I0314 08:32:33.486010 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"941a56894f8416dbe65667526b6a8c49a9e984f91e0be3c18790f4ab1e9de2fe"} err="failed to get container status \"941a56894f8416dbe65667526b6a8c49a9e984f91e0be3c18790f4ab1e9de2fe\": rpc error: code = NotFound desc = could not find container \"941a56894f8416dbe65667526b6a8c49a9e984f91e0be3c18790f4ab1e9de2fe\": container with ID starting with 941a56894f8416dbe65667526b6a8c49a9e984f91e0be3c18790f4ab1e9de2fe not found: ID does not exist"
Mar 14 08:32:34 crc kubenswrapper[4886]: I0314 08:32:34.723889 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vnjw4"]
Mar 14 08:32:34 crc kubenswrapper[4886]: I0314 08:32:34.724849 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vnjw4" podUID="9306248c-2771-4cb6-bdd4-f8628c2b6428" containerName="registry-server" containerID="cri-o://de3c5d2f1258f70a67259e08fa6e67a97f78a199392821ef34e266caaa980260" gracePeriod=2
Mar 14 08:32:35 crc kubenswrapper[4886]: I0314 08:32:35.260895 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vnjw4"
Mar 14 08:32:35 crc kubenswrapper[4886]: I0314 08:32:35.425086 4886 generic.go:334] "Generic (PLEG): container finished" podID="9306248c-2771-4cb6-bdd4-f8628c2b6428" containerID="de3c5d2f1258f70a67259e08fa6e67a97f78a199392821ef34e266caaa980260" exitCode=0
Mar 14 08:32:35 crc kubenswrapper[4886]: I0314 08:32:35.426854 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vnjw4"
Mar 14 08:32:35 crc kubenswrapper[4886]: I0314 08:32:35.435183 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="758a60d0-6132-4b23-8062-febd479f7fff" path="/var/lib/kubelet/pods/758a60d0-6132-4b23-8062-febd479f7fff/volumes"
Mar 14 08:32:35 crc kubenswrapper[4886]: I0314 08:32:35.441049 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnjw4" event={"ID":"9306248c-2771-4cb6-bdd4-f8628c2b6428","Type":"ContainerDied","Data":"de3c5d2f1258f70a67259e08fa6e67a97f78a199392821ef34e266caaa980260"}
Mar 14 08:32:35 crc kubenswrapper[4886]: I0314 08:32:35.441146 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnjw4" event={"ID":"9306248c-2771-4cb6-bdd4-f8628c2b6428","Type":"ContainerDied","Data":"364d714d099fe1f3cf62a5d24d6fee366c8e77efbd1718e4874a322280cfb654"}
Mar 14 08:32:35 crc kubenswrapper[4886]: I0314 08:32:35.441214 4886 scope.go:117] "RemoveContainer" containerID="de3c5d2f1258f70a67259e08fa6e67a97f78a199392821ef34e266caaa980260"
Mar 14 08:32:35 crc kubenswrapper[4886]: I0314 08:32:35.445480 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9306248c-2771-4cb6-bdd4-f8628c2b6428-utilities\") pod \"9306248c-2771-4cb6-bdd4-f8628c2b6428\" (UID: \"9306248c-2771-4cb6-bdd4-f8628c2b6428\") "
Mar 14 08:32:35 crc kubenswrapper[4886]: I0314 08:32:35.445590 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwbwq\" (UniqueName: \"kubernetes.io/projected/9306248c-2771-4cb6-bdd4-f8628c2b6428-kube-api-access-zwbwq\") pod \"9306248c-2771-4cb6-bdd4-f8628c2b6428\" (UID: \"9306248c-2771-4cb6-bdd4-f8628c2b6428\") "
Mar 14 08:32:35 crc kubenswrapper[4886]: I0314 08:32:35.445697 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9306248c-2771-4cb6-bdd4-f8628c2b6428-catalog-content\") pod \"9306248c-2771-4cb6-bdd4-f8628c2b6428\" (UID: \"9306248c-2771-4cb6-bdd4-f8628c2b6428\") "
Mar 14 08:32:35 crc kubenswrapper[4886]: I0314 08:32:35.447615 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9306248c-2771-4cb6-bdd4-f8628c2b6428-utilities" (OuterVolumeSpecName: "utilities") pod "9306248c-2771-4cb6-bdd4-f8628c2b6428" (UID: "9306248c-2771-4cb6-bdd4-f8628c2b6428"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 08:32:35 crc kubenswrapper[4886]: I0314 08:32:35.455327 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9306248c-2771-4cb6-bdd4-f8628c2b6428-kube-api-access-zwbwq" (OuterVolumeSpecName: "kube-api-access-zwbwq") pod "9306248c-2771-4cb6-bdd4-f8628c2b6428" (UID: "9306248c-2771-4cb6-bdd4-f8628c2b6428"). InnerVolumeSpecName "kube-api-access-zwbwq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:32:35 crc kubenswrapper[4886]: I0314 08:32:35.464084 4886 scope.go:117] "RemoveContainer" containerID="f390f72bff34cce7ef3818e558cd67c3ab3f36fb5d989959902c29c7252d3e8a"
Mar 14 08:32:35 crc kubenswrapper[4886]: I0314 08:32:35.489469 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9306248c-2771-4cb6-bdd4-f8628c2b6428-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9306248c-2771-4cb6-bdd4-f8628c2b6428" (UID: "9306248c-2771-4cb6-bdd4-f8628c2b6428"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 08:32:35 crc kubenswrapper[4886]: I0314 08:32:35.496265 4886 scope.go:117] "RemoveContainer" containerID="70a9b00fa6771a2ac38df05c1f97346612bae19c4e3eef581dd325d3e8fdabdf"
Mar 14 08:32:35 crc kubenswrapper[4886]: I0314 08:32:35.519860 4886 scope.go:117] "RemoveContainer" containerID="de3c5d2f1258f70a67259e08fa6e67a97f78a199392821ef34e266caaa980260"
Mar 14 08:32:35 crc kubenswrapper[4886]: E0314 08:32:35.520758 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de3c5d2f1258f70a67259e08fa6e67a97f78a199392821ef34e266caaa980260\": container with ID starting with de3c5d2f1258f70a67259e08fa6e67a97f78a199392821ef34e266caaa980260 not found: ID does not exist" containerID="de3c5d2f1258f70a67259e08fa6e67a97f78a199392821ef34e266caaa980260"
Mar 14 08:32:35 crc kubenswrapper[4886]: I0314 08:32:35.520943 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de3c5d2f1258f70a67259e08fa6e67a97f78a199392821ef34e266caaa980260"} err="failed to get container status \"de3c5d2f1258f70a67259e08fa6e67a97f78a199392821ef34e266caaa980260\": rpc error: code = NotFound desc = could not find container \"de3c5d2f1258f70a67259e08fa6e67a97f78a199392821ef34e266caaa980260\": container with ID starting with de3c5d2f1258f70a67259e08fa6e67a97f78a199392821ef34e266caaa980260 not found: ID does not exist"
Mar 14 08:32:35 crc kubenswrapper[4886]: I0314 08:32:35.521045 4886 scope.go:117] "RemoveContainer" containerID="f390f72bff34cce7ef3818e558cd67c3ab3f36fb5d989959902c29c7252d3e8a"
Mar 14 08:32:35 crc kubenswrapper[4886]: E0314 08:32:35.521757 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f390f72bff34cce7ef3818e558cd67c3ab3f36fb5d989959902c29c7252d3e8a\": container with ID starting with f390f72bff34cce7ef3818e558cd67c3ab3f36fb5d989959902c29c7252d3e8a not found: ID does not exist" containerID="f390f72bff34cce7ef3818e558cd67c3ab3f36fb5d989959902c29c7252d3e8a"
Mar 14 08:32:35 crc kubenswrapper[4886]: I0314 08:32:35.521815 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f390f72bff34cce7ef3818e558cd67c3ab3f36fb5d989959902c29c7252d3e8a"} err="failed to get container status \"f390f72bff34cce7ef3818e558cd67c3ab3f36fb5d989959902c29c7252d3e8a\": rpc error: code = NotFound desc = could not find container \"f390f72bff34cce7ef3818e558cd67c3ab3f36fb5d989959902c29c7252d3e8a\": container with ID starting with f390f72bff34cce7ef3818e558cd67c3ab3f36fb5d989959902c29c7252d3e8a not found: ID does not exist"
Mar 14 08:32:35 crc kubenswrapper[4886]: I0314 08:32:35.521851 4886 scope.go:117] "RemoveContainer" containerID="70a9b00fa6771a2ac38df05c1f97346612bae19c4e3eef581dd325d3e8fdabdf"
Mar 14 08:32:35 crc kubenswrapper[4886]: E0314 08:32:35.523012 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70a9b00fa6771a2ac38df05c1f97346612bae19c4e3eef581dd325d3e8fdabdf\": container with ID starting with 70a9b00fa6771a2ac38df05c1f97346612bae19c4e3eef581dd325d3e8fdabdf not found: ID does not exist" containerID="70a9b00fa6771a2ac38df05c1f97346612bae19c4e3eef581dd325d3e8fdabdf"
Mar 14 08:32:35 crc kubenswrapper[4886]: I0314 08:32:35.523114 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70a9b00fa6771a2ac38df05c1f97346612bae19c4e3eef581dd325d3e8fdabdf"} err="failed to get container status \"70a9b00fa6771a2ac38df05c1f97346612bae19c4e3eef581dd325d3e8fdabdf\": rpc error: code = NotFound desc = could not find container \"70a9b00fa6771a2ac38df05c1f97346612bae19c4e3eef581dd325d3e8fdabdf\": container with ID starting with 70a9b00fa6771a2ac38df05c1f97346612bae19c4e3eef581dd325d3e8fdabdf not found: ID does not exist"
Mar 14 08:32:35 crc kubenswrapper[4886]: I0314 08:32:35.548603 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9306248c-2771-4cb6-bdd4-f8628c2b6428-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 08:32:35 crc kubenswrapper[4886]: I0314 08:32:35.548655 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwbwq\" (UniqueName: \"kubernetes.io/projected/9306248c-2771-4cb6-bdd4-f8628c2b6428-kube-api-access-zwbwq\") on node \"crc\" DevicePath \"\""
Mar 14 08:32:35 crc kubenswrapper[4886]: I0314 08:32:35.548673 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9306248c-2771-4cb6-bdd4-f8628c2b6428-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 08:32:35 crc kubenswrapper[4886]: I0314 08:32:35.760992 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vnjw4"]
Mar 14 08:32:35 crc kubenswrapper[4886]: I0314 08:32:35.769598 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vnjw4"]
Mar 14 08:32:37 crc kubenswrapper[4886]: I0314 08:32:37.443611 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9306248c-2771-4cb6-bdd4-f8628c2b6428" path="/var/lib/kubelet/pods/9306248c-2771-4cb6-bdd4-f8628c2b6428/volumes"
Mar 14 08:32:38
crc kubenswrapper[4886]: I0314 08:32:38.297305 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dk6zm" Mar 14 08:32:38 crc kubenswrapper[4886]: I0314 08:32:38.340376 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dk6zm" Mar 14 08:32:38 crc kubenswrapper[4886]: I0314 08:32:38.866054 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t4gbt" Mar 14 08:32:38 crc kubenswrapper[4886]: I0314 08:32:38.927498 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t4gbt" Mar 14 08:32:41 crc kubenswrapper[4886]: I0314 08:32:41.118513 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t4gbt"] Mar 14 08:32:41 crc kubenswrapper[4886]: I0314 08:32:41.118742 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t4gbt" podUID="837659fc-08c6-4ea6-8799-aa4297b20689" containerName="registry-server" containerID="cri-o://1044645622c1f0641c5deb70e34f4f9ff19fc2e160e515fd3053462312f3b9e4" gracePeriod=2 Mar 14 08:32:41 crc kubenswrapper[4886]: I0314 08:32:41.473811 4886 generic.go:334] "Generic (PLEG): container finished" podID="837659fc-08c6-4ea6-8799-aa4297b20689" containerID="1044645622c1f0641c5deb70e34f4f9ff19fc2e160e515fd3053462312f3b9e4" exitCode=0 Mar 14 08:32:41 crc kubenswrapper[4886]: I0314 08:32:41.473915 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4gbt" event={"ID":"837659fc-08c6-4ea6-8799-aa4297b20689","Type":"ContainerDied","Data":"1044645622c1f0641c5deb70e34f4f9ff19fc2e160e515fd3053462312f3b9e4"} Mar 14 08:32:41 crc kubenswrapper[4886]: I0314 08:32:41.540914 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t4gbt" Mar 14 08:32:41 crc kubenswrapper[4886]: I0314 08:32:41.615968 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6gfbq" Mar 14 08:32:41 crc kubenswrapper[4886]: I0314 08:32:41.648013 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/837659fc-08c6-4ea6-8799-aa4297b20689-catalog-content\") pod \"837659fc-08c6-4ea6-8799-aa4297b20689\" (UID: \"837659fc-08c6-4ea6-8799-aa4297b20689\") " Mar 14 08:32:41 crc kubenswrapper[4886]: I0314 08:32:41.648167 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7fk8\" (UniqueName: \"kubernetes.io/projected/837659fc-08c6-4ea6-8799-aa4297b20689-kube-api-access-n7fk8\") pod \"837659fc-08c6-4ea6-8799-aa4297b20689\" (UID: \"837659fc-08c6-4ea6-8799-aa4297b20689\") " Mar 14 08:32:41 crc kubenswrapper[4886]: I0314 08:32:41.648267 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/837659fc-08c6-4ea6-8799-aa4297b20689-utilities\") pod \"837659fc-08c6-4ea6-8799-aa4297b20689\" (UID: \"837659fc-08c6-4ea6-8799-aa4297b20689\") " Mar 14 08:32:41 crc kubenswrapper[4886]: I0314 08:32:41.650047 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/837659fc-08c6-4ea6-8799-aa4297b20689-utilities" (OuterVolumeSpecName: "utilities") pod "837659fc-08c6-4ea6-8799-aa4297b20689" (UID: "837659fc-08c6-4ea6-8799-aa4297b20689"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:32:41 crc kubenswrapper[4886]: I0314 08:32:41.655668 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/837659fc-08c6-4ea6-8799-aa4297b20689-kube-api-access-n7fk8" (OuterVolumeSpecName: "kube-api-access-n7fk8") pod "837659fc-08c6-4ea6-8799-aa4297b20689" (UID: "837659fc-08c6-4ea6-8799-aa4297b20689"). InnerVolumeSpecName "kube-api-access-n7fk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:32:41 crc kubenswrapper[4886]: I0314 08:32:41.676536 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6gfbq" Mar 14 08:32:41 crc kubenswrapper[4886]: I0314 08:32:41.740782 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/837659fc-08c6-4ea6-8799-aa4297b20689-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "837659fc-08c6-4ea6-8799-aa4297b20689" (UID: "837659fc-08c6-4ea6-8799-aa4297b20689"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:32:41 crc kubenswrapper[4886]: I0314 08:32:41.749955 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/837659fc-08c6-4ea6-8799-aa4297b20689-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:41 crc kubenswrapper[4886]: I0314 08:32:41.749996 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/837659fc-08c6-4ea6-8799-aa4297b20689-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:41 crc kubenswrapper[4886]: I0314 08:32:41.750014 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7fk8\" (UniqueName: \"kubernetes.io/projected/837659fc-08c6-4ea6-8799-aa4297b20689-kube-api-access-n7fk8\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:42 crc kubenswrapper[4886]: I0314 08:32:42.493313 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t4gbt" Mar 14 08:32:42 crc kubenswrapper[4886]: I0314 08:32:42.493946 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4gbt" event={"ID":"837659fc-08c6-4ea6-8799-aa4297b20689","Type":"ContainerDied","Data":"663cbe6cd1053881b64be17e64ba7071aaf6ef0553582f93d804b9844e7b792c"} Mar 14 08:32:42 crc kubenswrapper[4886]: I0314 08:32:42.494012 4886 scope.go:117] "RemoveContainer" containerID="1044645622c1f0641c5deb70e34f4f9ff19fc2e160e515fd3053462312f3b9e4" Mar 14 08:32:42 crc kubenswrapper[4886]: I0314 08:32:42.528544 4886 scope.go:117] "RemoveContainer" containerID="9eb5275ee25c2a3c1835d62af26298205657463dc379f1a7427e0abe99ab6bbd" Mar 14 08:32:42 crc kubenswrapper[4886]: I0314 08:32:42.545891 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t4gbt"] Mar 14 08:32:42 crc kubenswrapper[4886]: I0314 08:32:42.549812 4886 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t4gbt"] Mar 14 08:32:42 crc kubenswrapper[4886]: I0314 08:32:42.570931 4886 scope.go:117] "RemoveContainer" containerID="02f6f0e3d0573881cf8e838e35f2e366725f66967bf323d5a339d61a6c817d2b" Mar 14 08:32:43 crc kubenswrapper[4886]: I0314 08:32:43.439783 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="837659fc-08c6-4ea6-8799-aa4297b20689" path="/var/lib/kubelet/pods/837659fc-08c6-4ea6-8799-aa4297b20689/volumes" Mar 14 08:32:44 crc kubenswrapper[4886]: I0314 08:32:44.957714 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-t996r" podUID="8b226bf0-ae7d-435b-9470-70dfb371f38e" containerName="oauth-openshift" containerID="cri-o://b938574fad3d8e8ee6f71e10837850301ee4b4ede5c216d5309900ea29b6f0c8" gracePeriod=15 Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.483371 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.515254 4886 generic.go:334] "Generic (PLEG): container finished" podID="8b226bf0-ae7d-435b-9470-70dfb371f38e" containerID="b938574fad3d8e8ee6f71e10837850301ee4b4ede5c216d5309900ea29b6f0c8" exitCode=0 Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.515300 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-t996r" event={"ID":"8b226bf0-ae7d-435b-9470-70dfb371f38e","Type":"ContainerDied","Data":"b938574fad3d8e8ee6f71e10837850301ee4b4ede5c216d5309900ea29b6f0c8"} Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.515342 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-t996r" event={"ID":"8b226bf0-ae7d-435b-9470-70dfb371f38e","Type":"ContainerDied","Data":"e44852c656ca2743103ecf45b0d067d5dd9cdffa3433bc552951cdd4d80b4990"} Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.515364 4886 scope.go:117] "RemoveContainer" containerID="b938574fad3d8e8ee6f71e10837850301ee4b4ede5c216d5309900ea29b6f0c8" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.515376 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-t996r" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.531938 4886 scope.go:117] "RemoveContainer" containerID="b938574fad3d8e8ee6f71e10837850301ee4b4ede5c216d5309900ea29b6f0c8" Mar 14 08:32:45 crc kubenswrapper[4886]: E0314 08:32:45.532698 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b938574fad3d8e8ee6f71e10837850301ee4b4ede5c216d5309900ea29b6f0c8\": container with ID starting with b938574fad3d8e8ee6f71e10837850301ee4b4ede5c216d5309900ea29b6f0c8 not found: ID does not exist" containerID="b938574fad3d8e8ee6f71e10837850301ee4b4ede5c216d5309900ea29b6f0c8" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.532757 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b938574fad3d8e8ee6f71e10837850301ee4b4ede5c216d5309900ea29b6f0c8"} err="failed to get container status \"b938574fad3d8e8ee6f71e10837850301ee4b4ede5c216d5309900ea29b6f0c8\": rpc error: code = NotFound desc = could not find container \"b938574fad3d8e8ee6f71e10837850301ee4b4ede5c216d5309900ea29b6f0c8\": container with ID starting with b938574fad3d8e8ee6f71e10837850301ee4b4ede5c216d5309900ea29b6f0c8 not found: ID does not exist" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.610231 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-user-idp-0-file-data\") pod \"8b226bf0-ae7d-435b-9470-70dfb371f38e\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.610596 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-cliconfig\") 
pod \"8b226bf0-ae7d-435b-9470-70dfb371f38e\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.610676 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-trusted-ca-bundle\") pod \"8b226bf0-ae7d-435b-9470-70dfb371f38e\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.610697 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-session\") pod \"8b226bf0-ae7d-435b-9470-70dfb371f38e\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.610727 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8b226bf0-ae7d-435b-9470-70dfb371f38e-audit-dir\") pod \"8b226bf0-ae7d-435b-9470-70dfb371f38e\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.610745 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8b226bf0-ae7d-435b-9470-70dfb371f38e-audit-policies\") pod \"8b226bf0-ae7d-435b-9470-70dfb371f38e\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.610779 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-user-template-error\") pod \"8b226bf0-ae7d-435b-9470-70dfb371f38e\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 
08:32:45.610775 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b226bf0-ae7d-435b-9470-70dfb371f38e-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "8b226bf0-ae7d-435b-9470-70dfb371f38e" (UID: "8b226bf0-ae7d-435b-9470-70dfb371f38e"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.610804 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-serving-cert\") pod \"8b226bf0-ae7d-435b-9470-70dfb371f38e\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.610854 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km96w\" (UniqueName: \"kubernetes.io/projected/8b226bf0-ae7d-435b-9470-70dfb371f38e-kube-api-access-km96w\") pod \"8b226bf0-ae7d-435b-9470-70dfb371f38e\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.610893 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-router-certs\") pod \"8b226bf0-ae7d-435b-9470-70dfb371f38e\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.610912 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-user-template-provider-selection\") pod \"8b226bf0-ae7d-435b-9470-70dfb371f38e\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.610934 4886 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-ocp-branding-template\") pod \"8b226bf0-ae7d-435b-9470-70dfb371f38e\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.610968 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-service-ca\") pod \"8b226bf0-ae7d-435b-9470-70dfb371f38e\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.610997 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-user-template-login\") pod \"8b226bf0-ae7d-435b-9470-70dfb371f38e\" (UID: \"8b226bf0-ae7d-435b-9470-70dfb371f38e\") " Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.611264 4886 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8b226bf0-ae7d-435b-9470-70dfb371f38e-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.611503 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b226bf0-ae7d-435b-9470-70dfb371f38e-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "8b226bf0-ae7d-435b-9470-70dfb371f38e" (UID: "8b226bf0-ae7d-435b-9470-70dfb371f38e"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.611525 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "8b226bf0-ae7d-435b-9470-70dfb371f38e" (UID: "8b226bf0-ae7d-435b-9470-70dfb371f38e"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.613409 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "8b226bf0-ae7d-435b-9470-70dfb371f38e" (UID: "8b226bf0-ae7d-435b-9470-70dfb371f38e"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.613827 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "8b226bf0-ae7d-435b-9470-70dfb371f38e" (UID: "8b226bf0-ae7d-435b-9470-70dfb371f38e"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.622368 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "8b226bf0-ae7d-435b-9470-70dfb371f38e" (UID: "8b226bf0-ae7d-435b-9470-70dfb371f38e"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.622601 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "8b226bf0-ae7d-435b-9470-70dfb371f38e" (UID: "8b226bf0-ae7d-435b-9470-70dfb371f38e"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.622778 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b226bf0-ae7d-435b-9470-70dfb371f38e-kube-api-access-km96w" (OuterVolumeSpecName: "kube-api-access-km96w") pod "8b226bf0-ae7d-435b-9470-70dfb371f38e" (UID: "8b226bf0-ae7d-435b-9470-70dfb371f38e"). InnerVolumeSpecName "kube-api-access-km96w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.623927 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "8b226bf0-ae7d-435b-9470-70dfb371f38e" (UID: "8b226bf0-ae7d-435b-9470-70dfb371f38e"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.624824 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "8b226bf0-ae7d-435b-9470-70dfb371f38e" (UID: "8b226bf0-ae7d-435b-9470-70dfb371f38e"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.625409 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "8b226bf0-ae7d-435b-9470-70dfb371f38e" (UID: "8b226bf0-ae7d-435b-9470-70dfb371f38e"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.638286 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "8b226bf0-ae7d-435b-9470-70dfb371f38e" (UID: "8b226bf0-ae7d-435b-9470-70dfb371f38e"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.638664 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "8b226bf0-ae7d-435b-9470-70dfb371f38e" (UID: "8b226bf0-ae7d-435b-9470-70dfb371f38e"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.638925 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "8b226bf0-ae7d-435b-9470-70dfb371f38e" (UID: "8b226bf0-ae7d-435b-9470-70dfb371f38e"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.712838 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.712877 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.712891 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.712901 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.712912 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.712923 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.712932 4886 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.712941 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.712949 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.712960 4886 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8b226bf0-ae7d-435b-9470-70dfb371f38e-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.712968 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.712992 4886 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b226bf0-ae7d-435b-9470-70dfb371f38e-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.713001 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km96w\" (UniqueName: \"kubernetes.io/projected/8b226bf0-ae7d-435b-9470-70dfb371f38e-kube-api-access-km96w\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 
08:32:45.838765 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-t996r"] Mar 14 08:32:45 crc kubenswrapper[4886]: I0314 08:32:45.843227 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-t996r"] Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.083478 4886 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 14 08:32:47 crc kubenswrapper[4886]: E0314 08:32:47.083783 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9306248c-2771-4cb6-bdd4-f8628c2b6428" containerName="extract-content" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.083802 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="9306248c-2771-4cb6-bdd4-f8628c2b6428" containerName="extract-content" Mar 14 08:32:47 crc kubenswrapper[4886]: E0314 08:32:47.083815 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="758a60d0-6132-4b23-8062-febd479f7fff" containerName="registry-server" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.083824 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="758a60d0-6132-4b23-8062-febd479f7fff" containerName="registry-server" Mar 14 08:32:47 crc kubenswrapper[4886]: E0314 08:32:47.083839 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="837659fc-08c6-4ea6-8799-aa4297b20689" containerName="extract-content" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.083845 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="837659fc-08c6-4ea6-8799-aa4297b20689" containerName="extract-content" Mar 14 08:32:47 crc kubenswrapper[4886]: E0314 08:32:47.083855 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="837659fc-08c6-4ea6-8799-aa4297b20689" containerName="extract-utilities" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.083863 4886 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="837659fc-08c6-4ea6-8799-aa4297b20689" containerName="extract-utilities" Mar 14 08:32:47 crc kubenswrapper[4886]: E0314 08:32:47.083874 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9306248c-2771-4cb6-bdd4-f8628c2b6428" containerName="extract-utilities" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.083882 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="9306248c-2771-4cb6-bdd4-f8628c2b6428" containerName="extract-utilities" Mar 14 08:32:47 crc kubenswrapper[4886]: E0314 08:32:47.083893 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="837659fc-08c6-4ea6-8799-aa4297b20689" containerName="registry-server" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.083900 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="837659fc-08c6-4ea6-8799-aa4297b20689" containerName="registry-server" Mar 14 08:32:47 crc kubenswrapper[4886]: E0314 08:32:47.083911 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9306248c-2771-4cb6-bdd4-f8628c2b6428" containerName="registry-server" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.083919 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="9306248c-2771-4cb6-bdd4-f8628c2b6428" containerName="registry-server" Mar 14 08:32:47 crc kubenswrapper[4886]: E0314 08:32:47.083929 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="758a60d0-6132-4b23-8062-febd479f7fff" containerName="extract-content" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.083937 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="758a60d0-6132-4b23-8062-febd479f7fff" containerName="extract-content" Mar 14 08:32:47 crc kubenswrapper[4886]: E0314 08:32:47.083945 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="758a60d0-6132-4b23-8062-febd479f7fff" containerName="extract-utilities" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.083951 4886 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="758a60d0-6132-4b23-8062-febd479f7fff" containerName="extract-utilities" Mar 14 08:32:47 crc kubenswrapper[4886]: E0314 08:32:47.083965 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b226bf0-ae7d-435b-9470-70dfb371f38e" containerName="oauth-openshift" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.083972 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b226bf0-ae7d-435b-9470-70dfb371f38e" containerName="oauth-openshift" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.084070 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b226bf0-ae7d-435b-9470-70dfb371f38e" containerName="oauth-openshift" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.084082 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="837659fc-08c6-4ea6-8799-aa4297b20689" containerName="registry-server" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.084093 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="9306248c-2771-4cb6-bdd4-f8628c2b6428" containerName="registry-server" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.084100 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="758a60d0-6132-4b23-8062-febd479f7fff" containerName="registry-server" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.085046 4886 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.085352 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.085605 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8" gracePeriod=15 Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.085587 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://bf7a1c85d73654d4f738bf3a153bb99dd30e6a3fc651f22246d22f9d67054cd3" gracePeriod=15 Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.085596 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1" gracePeriod=15 Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.085656 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4" gracePeriod=15 Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.085539 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c" gracePeriod=15 Mar 14 08:32:47 crc 
kubenswrapper[4886]: I0314 08:32:47.086582 4886 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 14 08:32:47 crc kubenswrapper[4886]: E0314 08:32:47.086805 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.086820 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 14 08:32:47 crc kubenswrapper[4886]: E0314 08:32:47.086829 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.086836 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 08:32:47 crc kubenswrapper[4886]: E0314 08:32:47.086846 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.086856 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 14 08:32:47 crc kubenswrapper[4886]: E0314 08:32:47.086868 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.086875 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 14 08:32:47 crc kubenswrapper[4886]: E0314 08:32:47.086884 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 08:32:47 
crc kubenswrapper[4886]: I0314 08:32:47.086892 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 08:32:47 crc kubenswrapper[4886]: E0314 08:32:47.086902 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.086910 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 14 08:32:47 crc kubenswrapper[4886]: E0314 08:32:47.086919 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.086927 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 08:32:47 crc kubenswrapper[4886]: E0314 08:32:47.086935 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.086943 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.087071 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.087084 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.087134 4886 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.087145 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.087154 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.087167 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.087185 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.087194 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 08:32:47 crc kubenswrapper[4886]: E0314 08:32:47.087306 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.087317 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 08:32:47 crc kubenswrapper[4886]: E0314 08:32:47.087331 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.087338 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 
08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.087452 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.229820 4886 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.229893 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.234274 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.234325 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.234389 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.234411 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.234430 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.234492 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.234510 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.234532 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.336944 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.337030 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.337196 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.337218 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.337238 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.337181 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.337241 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.337317 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.337359 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.337429 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.337455 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.337458 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.337506 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.337554 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.337505 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.337632 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.426921 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b226bf0-ae7d-435b-9470-70dfb371f38e" path="/var/lib/kubelet/pods/8b226bf0-ae7d-435b-9470-70dfb371f38e/volumes" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.528533 4886 generic.go:334] "Generic (PLEG): container finished" podID="c85c6296-9423-4f0a-b5cd-d265dd437a77" containerID="97792cb76d0c416b5b962919c01bf15eeacc427b9746436e279b6b3adc5eb281" exitCode=0 Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.528622 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c85c6296-9423-4f0a-b5cd-d265dd437a77","Type":"ContainerDied","Data":"97792cb76d0c416b5b962919c01bf15eeacc427b9746436e279b6b3adc5eb281"} Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.529770 4886 status_manager.go:851] "Failed to get status for pod" podUID="c85c6296-9423-4f0a-b5cd-d265dd437a77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.531289 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.532585 4886 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.533283 4886 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bf7a1c85d73654d4f738bf3a153bb99dd30e6a3fc651f22246d22f9d67054cd3" exitCode=0 Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.533314 4886 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4" exitCode=0 Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.533325 4886 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1" exitCode=0 Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.533336 4886 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8" exitCode=2 Mar 14 08:32:47 crc kubenswrapper[4886]: I0314 08:32:47.533409 4886 scope.go:117] "RemoveContainer" containerID="a2e52264d020fcc8912c418ce710273ec7a0765db9c7e6b95cb1e2f5112d412e" Mar 14 08:32:48 crc kubenswrapper[4886]: I0314 08:32:48.545385 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 14 08:32:48 crc kubenswrapper[4886]: I0314 08:32:48.934350 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 14 08:32:48 crc kubenswrapper[4886]: I0314 08:32:48.935575 4886 status_manager.go:851] "Failed to get status for pod" podUID="c85c6296-9423-4f0a-b5cd-d265dd437a77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.059979 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c85c6296-9423-4f0a-b5cd-d265dd437a77-kubelet-dir\") pod \"c85c6296-9423-4f0a-b5cd-d265dd437a77\" (UID: \"c85c6296-9423-4f0a-b5cd-d265dd437a77\") " Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.060109 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c85c6296-9423-4f0a-b5cd-d265dd437a77-var-lock\") pod \"c85c6296-9423-4f0a-b5cd-d265dd437a77\" (UID: \"c85c6296-9423-4f0a-b5cd-d265dd437a77\") " Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.060168 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c85c6296-9423-4f0a-b5cd-d265dd437a77-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c85c6296-9423-4f0a-b5cd-d265dd437a77" (UID: "c85c6296-9423-4f0a-b5cd-d265dd437a77"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.060200 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c85c6296-9423-4f0a-b5cd-d265dd437a77-var-lock" (OuterVolumeSpecName: "var-lock") pod "c85c6296-9423-4f0a-b5cd-d265dd437a77" (UID: "c85c6296-9423-4f0a-b5cd-d265dd437a77"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.060246 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c85c6296-9423-4f0a-b5cd-d265dd437a77-kube-api-access\") pod \"c85c6296-9423-4f0a-b5cd-d265dd437a77\" (UID: \"c85c6296-9423-4f0a-b5cd-d265dd437a77\") " Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.060619 4886 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c85c6296-9423-4f0a-b5cd-d265dd437a77-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.060646 4886 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c85c6296-9423-4f0a-b5cd-d265dd437a77-var-lock\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.065525 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c85c6296-9423-4f0a-b5cd-d265dd437a77-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c85c6296-9423-4f0a-b5cd-d265dd437a77" (UID: "c85c6296-9423-4f0a-b5cd-d265dd437a77"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.169773 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c85c6296-9423-4f0a-b5cd-d265dd437a77-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.439358 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.440289 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.440860 4886 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.441075 4886 status_manager.go:851] "Failed to get status for pod" podUID="c85c6296-9423-4f0a-b5cd-d265dd437a77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.475594 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.475709 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.475729 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.475766 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.475767 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.475845 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.476187 4886 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.476212 4886 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.476227 4886 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.557439 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.558203 4886 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c" exitCode=0 Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.558284 4886 scope.go:117] "RemoveContainer" containerID="bf7a1c85d73654d4f738bf3a153bb99dd30e6a3fc651f22246d22f9d67054cd3" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.558295 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.560295 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c85c6296-9423-4f0a-b5cd-d265dd437a77","Type":"ContainerDied","Data":"6b5bab7d264a11a721749c8c104958c691deba16c49471bbf7d769457e1deeea"} Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.560330 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b5bab7d264a11a721749c8c104958c691deba16c49471bbf7d769457e1deeea" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.560437 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.565329 4886 status_manager.go:851] "Failed to get status for pod" podUID="c85c6296-9423-4f0a-b5cd-d265dd437a77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.565560 4886 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.571472 4886 status_manager.go:851] "Failed to get status for pod" podUID="c85c6296-9423-4f0a-b5cd-d265dd437a77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 
08:32:49.571969 4886 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.572072 4886 scope.go:117] "RemoveContainer" containerID="cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.585619 4886 scope.go:117] "RemoveContainer" containerID="4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.597437 4886 scope.go:117] "RemoveContainer" containerID="8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.607664 4886 scope.go:117] "RemoveContainer" containerID="27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.619926 4886 scope.go:117] "RemoveContainer" containerID="3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.636027 4886 scope.go:117] "RemoveContainer" containerID="bf7a1c85d73654d4f738bf3a153bb99dd30e6a3fc651f22246d22f9d67054cd3" Mar 14 08:32:49 crc kubenswrapper[4886]: E0314 08:32:49.636356 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf7a1c85d73654d4f738bf3a153bb99dd30e6a3fc651f22246d22f9d67054cd3\": container with ID starting with bf7a1c85d73654d4f738bf3a153bb99dd30e6a3fc651f22246d22f9d67054cd3 not found: ID does not exist" containerID="bf7a1c85d73654d4f738bf3a153bb99dd30e6a3fc651f22246d22f9d67054cd3" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.636388 4886 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"bf7a1c85d73654d4f738bf3a153bb99dd30e6a3fc651f22246d22f9d67054cd3"} err="failed to get container status \"bf7a1c85d73654d4f738bf3a153bb99dd30e6a3fc651f22246d22f9d67054cd3\": rpc error: code = NotFound desc = could not find container \"bf7a1c85d73654d4f738bf3a153bb99dd30e6a3fc651f22246d22f9d67054cd3\": container with ID starting with bf7a1c85d73654d4f738bf3a153bb99dd30e6a3fc651f22246d22f9d67054cd3 not found: ID does not exist" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.636411 4886 scope.go:117] "RemoveContainer" containerID="cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4" Mar 14 08:32:49 crc kubenswrapper[4886]: E0314 08:32:49.636690 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\": container with ID starting with cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4 not found: ID does not exist" containerID="cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.636725 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4"} err="failed to get container status \"cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\": rpc error: code = NotFound desc = could not find container \"cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4\": container with ID starting with cf11d87e6c97e7337347cd95138fd0941d8be40cfc6911c3e705ac884b99eef4 not found: ID does not exist" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.636739 4886 scope.go:117] "RemoveContainer" containerID="4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1" Mar 14 08:32:49 crc kubenswrapper[4886]: E0314 08:32:49.636968 4886 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\": container with ID starting with 4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1 not found: ID does not exist" containerID="4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.636991 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1"} err="failed to get container status \"4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\": rpc error: code = NotFound desc = could not find container \"4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1\": container with ID starting with 4828c05b732d2ee316831edf2915d23781be4b152282398224323fca860930c1 not found: ID does not exist" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.637004 4886 scope.go:117] "RemoveContainer" containerID="8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8" Mar 14 08:32:49 crc kubenswrapper[4886]: E0314 08:32:49.637278 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\": container with ID starting with 8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8 not found: ID does not exist" containerID="8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.637302 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8"} err="failed to get container status \"8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\": rpc error: code = NotFound desc = could not find container 
\"8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8\": container with ID starting with 8f4663524315099a63ca6a04dea270ee5127c6df2648a8ce891a35493e90dca8 not found: ID does not exist" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.637316 4886 scope.go:117] "RemoveContainer" containerID="27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c" Mar 14 08:32:49 crc kubenswrapper[4886]: E0314 08:32:49.637540 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\": container with ID starting with 27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c not found: ID does not exist" containerID="27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.637587 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c"} err="failed to get container status \"27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\": rpc error: code = NotFound desc = could not find container \"27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c\": container with ID starting with 27f089ba9e85f971ae4fe37d22147eb95a7015cdeb868c227a0dbd74561f912c not found: ID does not exist" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.637636 4886 scope.go:117] "RemoveContainer" containerID="3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b" Mar 14 08:32:49 crc kubenswrapper[4886]: E0314 08:32:49.637901 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\": container with ID starting with 3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b not found: ID does not exist" 
containerID="3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b" Mar 14 08:32:49 crc kubenswrapper[4886]: I0314 08:32:49.637925 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b"} err="failed to get container status \"3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\": rpc error: code = NotFound desc = could not find container \"3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b\": container with ID starting with 3b5eb7fabad8e17a1b1c53b2d1fb02f213d1d6d3c7908d290b43104e03c2ae0b not found: ID does not exist" Mar 14 08:32:50 crc kubenswrapper[4886]: E0314 08:32:50.719986 4886 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" Mar 14 08:32:50 crc kubenswrapper[4886]: E0314 08:32:50.720419 4886 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" Mar 14 08:32:50 crc kubenswrapper[4886]: E0314 08:32:50.720960 4886 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" Mar 14 08:32:50 crc kubenswrapper[4886]: E0314 08:32:50.721546 4886 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" Mar 14 08:32:50 crc kubenswrapper[4886]: E0314 08:32:50.721890 4886 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" Mar 14 08:32:50 crc kubenswrapper[4886]: I0314 08:32:50.721919 4886 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 14 08:32:50 crc kubenswrapper[4886]: E0314 08:32:50.722213 4886 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="200ms" Mar 14 08:32:50 crc kubenswrapper[4886]: E0314 08:32:50.923897 4886 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="400ms" Mar 14 08:32:51 crc kubenswrapper[4886]: E0314 08:32:51.325415 4886 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="800ms" Mar 14 08:32:51 crc kubenswrapper[4886]: I0314 08:32:51.426921 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 14 08:32:52 crc kubenswrapper[4886]: E0314 08:32:52.116355 4886 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.38:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 08:32:52 crc kubenswrapper[4886]: I0314 
08:32:52.116701 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 08:32:52 crc kubenswrapper[4886]: E0314 08:32:52.126246 4886 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="1.6s" Mar 14 08:32:52 crc kubenswrapper[4886]: E0314 08:32:52.149464 4886 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.38:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189ca8177f343e71 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:32:52.148977265 +0000 UTC m=+307.397428892,LastTimestamp:2026-03-14 08:32:52.148977265 +0000 UTC m=+307.397428892,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:32:52 crc kubenswrapper[4886]: I0314 08:32:52.578444 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"eb02a0483f76592e5a2c186501e140eac223a82358c926d89dd8246eaef79d49"} Mar 14 08:32:52 crc kubenswrapper[4886]: I0314 08:32:52.578929 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2ed8f09e9b39cf4eb8f0eae40046206f1c4066bf3d1c84d2e406e1dd0e29ae67"} Mar 14 08:32:52 crc kubenswrapper[4886]: E0314 08:32:52.579635 4886 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.38:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 08:32:52 crc kubenswrapper[4886]: I0314 08:32:52.579802 4886 status_manager.go:851] "Failed to get status for pod" podUID="c85c6296-9423-4f0a-b5cd-d265dd437a77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Mar 14 08:32:53 crc kubenswrapper[4886]: E0314 08:32:53.727591 4886 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="3.2s" Mar 14 08:32:55 crc kubenswrapper[4886]: I0314 08:32:55.425053 4886 status_manager.go:851] "Failed to get status for pod" podUID="c85c6296-9423-4f0a-b5cd-d265dd437a77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Mar 14 08:32:56 crc kubenswrapper[4886]: E0314 08:32:56.929796 4886 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.38:6443: connect: connection refused" interval="6.4s" Mar 14 08:32:58 crc kubenswrapper[4886]: E0314 08:32:58.207710 4886 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.38:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189ca8177f343e71 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-14 08:32:52.148977265 +0000 UTC m=+307.397428892,LastTimestamp:2026-03-14 08:32:52.148977265 +0000 UTC m=+307.397428892,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 14 08:32:59 crc kubenswrapper[4886]: E0314 08:32:59.507279 4886 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.38:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" 
volumeName="registry-storage" Mar 14 08:33:00 crc kubenswrapper[4886]: I0314 08:33:00.651191 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 14 08:33:00 crc kubenswrapper[4886]: I0314 08:33:00.652581 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 14 08:33:00 crc kubenswrapper[4886]: I0314 08:33:00.653931 4886 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="43cd795006e2735c82e25d7f0b8ccfae0749e5d80cc96068969329b677904578" exitCode=1 Mar 14 08:33:00 crc kubenswrapper[4886]: I0314 08:33:00.653987 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"43cd795006e2735c82e25d7f0b8ccfae0749e5d80cc96068969329b677904578"} Mar 14 08:33:00 crc kubenswrapper[4886]: I0314 08:33:00.654603 4886 scope.go:117] "RemoveContainer" containerID="43cd795006e2735c82e25d7f0b8ccfae0749e5d80cc96068969329b677904578" Mar 14 08:33:00 crc kubenswrapper[4886]: I0314 08:33:00.655341 4886 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Mar 14 08:33:00 crc kubenswrapper[4886]: I0314 08:33:00.656095 4886 status_manager.go:851] "Failed to get status for pod" podUID="c85c6296-9423-4f0a-b5cd-d265dd437a77" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Mar 14 08:33:01 crc kubenswrapper[4886]: I0314 08:33:01.420735 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:33:01 crc kubenswrapper[4886]: I0314 08:33:01.422319 4886 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Mar 14 08:33:01 crc kubenswrapper[4886]: I0314 08:33:01.423011 4886 status_manager.go:851] "Failed to get status for pod" podUID="c85c6296-9423-4f0a-b5cd-d265dd437a77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Mar 14 08:33:01 crc kubenswrapper[4886]: I0314 08:33:01.436525 4886 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e02166ba-9748-4add-8980-e4d799092d18" Mar 14 08:33:01 crc kubenswrapper[4886]: I0314 08:33:01.436563 4886 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e02166ba-9748-4add-8980-e4d799092d18" Mar 14 08:33:01 crc kubenswrapper[4886]: E0314 08:33:01.437275 4886 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:33:01 crc kubenswrapper[4886]: I0314 08:33:01.437986 4886 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:33:01 crc kubenswrapper[4886]: W0314 08:33:01.467783 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-895d2d7562cd3b5b9e2e4543186c5e6543858753cba20f50791204972ddee5ca WatchSource:0}: Error finding container 895d2d7562cd3b5b9e2e4543186c5e6543858753cba20f50791204972ddee5ca: Status 404 returned error can't find the container with id 895d2d7562cd3b5b9e2e4543186c5e6543858753cba20f50791204972ddee5ca Mar 14 08:33:01 crc kubenswrapper[4886]: I0314 08:33:01.664216 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"895d2d7562cd3b5b9e2e4543186c5e6543858753cba20f50791204972ddee5ca"} Mar 14 08:33:01 crc kubenswrapper[4886]: I0314 08:33:01.666452 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 14 08:33:01 crc kubenswrapper[4886]: I0314 08:33:01.666803 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 14 08:33:01 crc kubenswrapper[4886]: I0314 08:33:01.666846 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e00729b9b2a045348b731d83351bee9b593ecde7455ab1aac2e0b0cf556632db"} Mar 14 08:33:01 crc kubenswrapper[4886]: I0314 08:33:01.667624 4886 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Mar 14 08:33:01 crc kubenswrapper[4886]: I0314 08:33:01.667954 4886 status_manager.go:851] "Failed to get status for pod" podUID="c85c6296-9423-4f0a-b5cd-d265dd437a77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Mar 14 08:33:02 crc kubenswrapper[4886]: I0314 08:33:02.682296 4886 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="d6781e88000a8766e79b0c5e6b315dee116e939fdc64482d550ab1c3630fead7" exitCode=0 Mar 14 08:33:02 crc kubenswrapper[4886]: I0314 08:33:02.682373 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"d6781e88000a8766e79b0c5e6b315dee116e939fdc64482d550ab1c3630fead7"} Mar 14 08:33:02 crc kubenswrapper[4886]: I0314 08:33:02.682726 4886 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e02166ba-9748-4add-8980-e4d799092d18" Mar 14 08:33:02 crc kubenswrapper[4886]: I0314 08:33:02.682760 4886 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e02166ba-9748-4add-8980-e4d799092d18" Mar 14 08:33:02 crc kubenswrapper[4886]: I0314 08:33:02.683293 4886 status_manager.go:851] "Failed to get status for pod" podUID="c85c6296-9423-4f0a-b5cd-d265dd437a77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.38:6443: connect: connection 
refused" Mar 14 08:33:02 crc kubenswrapper[4886]: E0314 08:33:02.683329 4886 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:33:02 crc kubenswrapper[4886]: I0314 08:33:02.683913 4886 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.38:6443: connect: connection refused" Mar 14 08:33:03 crc kubenswrapper[4886]: I0314 08:33:03.692259 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"22713b4ab6e06f8dc4385ce4677c193aaa98fa63126206cdfc54b22a83e2f369"} Mar 14 08:33:03 crc kubenswrapper[4886]: I0314 08:33:03.692598 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"70b67fcbb765883bb327e59c71aa329be87b8fa64030d5539c33bd985d17c7fb"} Mar 14 08:33:03 crc kubenswrapper[4886]: I0314 08:33:03.692614 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b30040590de046006dfa3f47eec430d90802d76ace7beb439d482159905eb8e0"} Mar 14 08:33:04 crc kubenswrapper[4886]: I0314 08:33:04.482514 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:33:04 crc kubenswrapper[4886]: I0314 08:33:04.701540 4886 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c68e7c858293da4611ba41fe1d3da8be8e06d04c7c2eda5fd605ab440e6f3e9f"} Mar 14 08:33:04 crc kubenswrapper[4886]: I0314 08:33:04.701579 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"37055ded841a263ff55e1befd24e5b198a2562b59eeab6bab02a14826bbfe9f2"} Mar 14 08:33:04 crc kubenswrapper[4886]: I0314 08:33:04.701716 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:33:04 crc kubenswrapper[4886]: I0314 08:33:04.701837 4886 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e02166ba-9748-4add-8980-e4d799092d18" Mar 14 08:33:04 crc kubenswrapper[4886]: I0314 08:33:04.701863 4886 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e02166ba-9748-4add-8980-e4d799092d18" Mar 14 08:33:06 crc kubenswrapper[4886]: I0314 08:33:06.438223 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:33:06 crc kubenswrapper[4886]: I0314 08:33:06.438302 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:33:06 crc kubenswrapper[4886]: I0314 08:33:06.449617 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:33:09 crc kubenswrapper[4886]: I0314 08:33:09.713858 4886 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:33:09 crc kubenswrapper[4886]: I0314 08:33:09.750227 4886 kubelet.go:1909] "Trying to delete 
pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e02166ba-9748-4add-8980-e4d799092d18" Mar 14 08:33:09 crc kubenswrapper[4886]: I0314 08:33:09.750264 4886 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e02166ba-9748-4add-8980-e4d799092d18" Mar 14 08:33:09 crc kubenswrapper[4886]: I0314 08:33:09.755601 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:33:09 crc kubenswrapper[4886]: I0314 08:33:09.777940 4886 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="701b9c67-fb13-430c-a11b-17a3fb35faa4" Mar 14 08:33:10 crc kubenswrapper[4886]: I0314 08:33:10.198836 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:33:10 crc kubenswrapper[4886]: I0314 08:33:10.204695 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:33:10 crc kubenswrapper[4886]: I0314 08:33:10.759267 4886 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e02166ba-9748-4add-8980-e4d799092d18" Mar 14 08:33:10 crc kubenswrapper[4886]: I0314 08:33:10.759611 4886 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e02166ba-9748-4add-8980-e4d799092d18" Mar 14 08:33:10 crc kubenswrapper[4886]: I0314 08:33:10.763637 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 14 08:33:10 crc kubenswrapper[4886]: I0314 08:33:10.763663 4886 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" 
pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="701b9c67-fb13-430c-a11b-17a3fb35faa4" Mar 14 08:33:19 crc kubenswrapper[4886]: I0314 08:33:19.439658 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 14 08:33:20 crc kubenswrapper[4886]: I0314 08:33:20.004446 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 14 08:33:20 crc kubenswrapper[4886]: I0314 08:33:20.094332 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 14 08:33:20 crc kubenswrapper[4886]: I0314 08:33:20.214310 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 08:33:20 crc kubenswrapper[4886]: I0314 08:33:20.280477 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 14 08:33:20 crc kubenswrapper[4886]: I0314 08:33:20.449516 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 14 08:33:20 crc kubenswrapper[4886]: I0314 08:33:20.626320 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 14 08:33:20 crc kubenswrapper[4886]: I0314 08:33:20.687885 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 14 08:33:20 crc kubenswrapper[4886]: I0314 08:33:20.958323 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 14 08:33:20 crc kubenswrapper[4886]: I0314 08:33:20.971463 4886 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 14 08:33:21 crc kubenswrapper[4886]: I0314 08:33:21.048114 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 14 08:33:21 crc kubenswrapper[4886]: I0314 08:33:21.218869 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 14 08:33:21 crc kubenswrapper[4886]: I0314 08:33:21.649804 4886 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 14 08:33:21 crc kubenswrapper[4886]: I0314 08:33:21.861593 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 14 08:33:21 crc kubenswrapper[4886]: I0314 08:33:21.896790 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 14 08:33:21 crc kubenswrapper[4886]: I0314 08:33:21.937872 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 14 08:33:22 crc kubenswrapper[4886]: I0314 08:33:22.261512 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 14 08:33:22 crc kubenswrapper[4886]: I0314 08:33:22.267047 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 14 08:33:22 crc kubenswrapper[4886]: I0314 08:33:22.280899 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 14 08:33:22 crc kubenswrapper[4886]: I0314 08:33:22.441597 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 14 08:33:22 crc kubenswrapper[4886]: I0314 08:33:22.457332 
4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 14 08:33:22 crc kubenswrapper[4886]: I0314 08:33:22.484704 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 14 08:33:22 crc kubenswrapper[4886]: I0314 08:33:22.495808 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 14 08:33:22 crc kubenswrapper[4886]: I0314 08:33:22.540732 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 14 08:33:22 crc kubenswrapper[4886]: I0314 08:33:22.592736 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 14 08:33:22 crc kubenswrapper[4886]: I0314 08:33:22.927217 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 14 08:33:22 crc kubenswrapper[4886]: I0314 08:33:22.927692 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 14 08:33:22 crc kubenswrapper[4886]: I0314 08:33:22.962101 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 14 08:33:22 crc kubenswrapper[4886]: I0314 08:33:22.969024 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 14 08:33:23 crc kubenswrapper[4886]: I0314 08:33:23.032979 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 14 08:33:23 crc kubenswrapper[4886]: I0314 08:33:23.157747 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 14 08:33:23 crc 
kubenswrapper[4886]: I0314 08:33:23.281759 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 14 08:33:23 crc kubenswrapper[4886]: I0314 08:33:23.286536 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 14 08:33:23 crc kubenswrapper[4886]: I0314 08:33:23.489086 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 14 08:33:23 crc kubenswrapper[4886]: I0314 08:33:23.495642 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 14 08:33:23 crc kubenswrapper[4886]: I0314 08:33:23.499299 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 14 08:33:23 crc kubenswrapper[4886]: I0314 08:33:23.706990 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 14 08:33:23 crc kubenswrapper[4886]: I0314 08:33:23.770005 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 14 08:33:23 crc kubenswrapper[4886]: I0314 08:33:23.838499 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 14 08:33:23 crc kubenswrapper[4886]: I0314 08:33:23.857790 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 14 08:33:23 crc kubenswrapper[4886]: I0314 08:33:23.926192 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 14 08:33:24 crc kubenswrapper[4886]: I0314 08:33:24.037718 4886 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 14 08:33:24 crc kubenswrapper[4886]: I0314 08:33:24.109022 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 14 08:33:24 crc kubenswrapper[4886]: I0314 08:33:24.153888 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 14 08:33:24 crc kubenswrapper[4886]: I0314 08:33:24.214469 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 14 08:33:24 crc kubenswrapper[4886]: I0314 08:33:24.274737 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 14 08:33:24 crc kubenswrapper[4886]: I0314 08:33:24.470104 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 14 08:33:24 crc kubenswrapper[4886]: I0314 08:33:24.472563 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 14 08:33:24 crc kubenswrapper[4886]: I0314 08:33:24.573951 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 14 08:33:24 crc kubenswrapper[4886]: I0314 08:33:24.614235 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 14 08:33:24 crc kubenswrapper[4886]: I0314 08:33:24.679627 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 14 08:33:24 crc kubenswrapper[4886]: I0314 08:33:24.731528 4886 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"audit-1" Mar 14 08:33:24 crc kubenswrapper[4886]: I0314 08:33:24.765166 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 14 08:33:24 crc kubenswrapper[4886]: I0314 08:33:24.766503 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 14 08:33:24 crc kubenswrapper[4886]: I0314 08:33:24.816994 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 14 08:33:24 crc kubenswrapper[4886]: I0314 08:33:24.868873 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 14 08:33:24 crc kubenswrapper[4886]: I0314 08:33:24.925167 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 14 08:33:25 crc kubenswrapper[4886]: I0314 08:33:25.014945 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 14 08:33:25 crc kubenswrapper[4886]: I0314 08:33:25.026337 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 14 08:33:25 crc kubenswrapper[4886]: I0314 08:33:25.071018 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 14 08:33:25 crc kubenswrapper[4886]: I0314 08:33:25.144457 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 14 08:33:25 crc kubenswrapper[4886]: I0314 08:33:25.147929 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 14 08:33:25 crc kubenswrapper[4886]: I0314 08:33:25.150881 4886 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 14 08:33:25 crc kubenswrapper[4886]: I0314 08:33:25.185801 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 14 08:33:25 crc kubenswrapper[4886]: I0314 08:33:25.203151 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 14 08:33:25 crc kubenswrapper[4886]: I0314 08:33:25.205639 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 14 08:33:25 crc kubenswrapper[4886]: I0314 08:33:25.207002 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 14 08:33:25 crc kubenswrapper[4886]: I0314 08:33:25.259903 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 14 08:33:25 crc kubenswrapper[4886]: I0314 08:33:25.299060 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 14 08:33:25 crc kubenswrapper[4886]: I0314 08:33:25.321892 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 14 08:33:25 crc kubenswrapper[4886]: I0314 08:33:25.378786 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 14 08:33:25 crc kubenswrapper[4886]: I0314 08:33:25.382240 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 14 08:33:25 crc kubenswrapper[4886]: I0314 08:33:25.480588 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 14 08:33:25 crc 
kubenswrapper[4886]: I0314 08:33:25.516510 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 14 08:33:25 crc kubenswrapper[4886]: I0314 08:33:25.523003 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 14 08:33:25 crc kubenswrapper[4886]: I0314 08:33:25.548517 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 14 08:33:25 crc kubenswrapper[4886]: I0314 08:33:25.629256 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 14 08:33:25 crc kubenswrapper[4886]: I0314 08:33:25.631195 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 14 08:33:25 crc kubenswrapper[4886]: I0314 08:33:25.701843 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 14 08:33:25 crc kubenswrapper[4886]: I0314 08:33:25.790846 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 14 08:33:25 crc kubenswrapper[4886]: I0314 08:33:25.805528 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 14 08:33:25 crc kubenswrapper[4886]: I0314 08:33:25.805683 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 14 08:33:25 crc kubenswrapper[4886]: I0314 08:33:25.819439 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 14 08:33:25 crc kubenswrapper[4886]: I0314 08:33:25.839353 4886 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 14 08:33:25 crc kubenswrapper[4886]: I0314 08:33:25.853791 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 14 08:33:25 crc kubenswrapper[4886]: I0314 08:33:25.877832 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 14 08:33:25 crc kubenswrapper[4886]: I0314 08:33:25.951413 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 14 08:33:26 crc kubenswrapper[4886]: I0314 08:33:26.067409 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 14 08:33:26 crc kubenswrapper[4886]: I0314 08:33:26.260188 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 14 08:33:26 crc kubenswrapper[4886]: I0314 08:33:26.362436 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 14 08:33:26 crc kubenswrapper[4886]: I0314 08:33:26.440202 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 14 08:33:26 crc kubenswrapper[4886]: I0314 08:33:26.512025 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 14 08:33:26 crc kubenswrapper[4886]: I0314 08:33:26.581256 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 14 08:33:26 crc kubenswrapper[4886]: I0314 08:33:26.664389 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 14 08:33:26 crc kubenswrapper[4886]: 
I0314 08:33:26.719293 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 14 08:33:26 crc kubenswrapper[4886]: I0314 08:33:26.731013 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 14 08:33:26 crc kubenswrapper[4886]: I0314 08:33:26.750578 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 14 08:33:26 crc kubenswrapper[4886]: I0314 08:33:26.777678 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 14 08:33:26 crc kubenswrapper[4886]: I0314 08:33:26.814851 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 14 08:33:26 crc kubenswrapper[4886]: I0314 08:33:26.952884 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 14 08:33:27 crc kubenswrapper[4886]: I0314 08:33:27.028904 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 14 08:33:27 crc kubenswrapper[4886]: I0314 08:33:27.029294 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 14 08:33:27 crc kubenswrapper[4886]: I0314 08:33:27.036647 4886 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 14 08:33:27 crc kubenswrapper[4886]: I0314 08:33:27.044906 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 14 08:33:27 crc kubenswrapper[4886]: I0314 08:33:27.199835 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 14 
08:33:27 crc kubenswrapper[4886]: I0314 08:33:27.297812 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 14 08:33:27 crc kubenswrapper[4886]: I0314 08:33:27.305221 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 14 08:33:27 crc kubenswrapper[4886]: I0314 08:33:27.306639 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 14 08:33:27 crc kubenswrapper[4886]: I0314 08:33:27.320149 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 14 08:33:27 crc kubenswrapper[4886]: I0314 08:33:27.323259 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 14 08:33:27 crc kubenswrapper[4886]: I0314 08:33:27.393600 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 14 08:33:27 crc kubenswrapper[4886]: I0314 08:33:27.433020 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 14 08:33:27 crc kubenswrapper[4886]: I0314 08:33:27.480108 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 14 08:33:27 crc kubenswrapper[4886]: I0314 08:33:27.550456 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 14 08:33:27 crc kubenswrapper[4886]: I0314 08:33:27.583844 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 14 08:33:27 crc kubenswrapper[4886]: I0314 08:33:27.623408 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 14 
08:33:27 crc kubenswrapper[4886]: I0314 08:33:27.649455 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 14 08:33:27 crc kubenswrapper[4886]: I0314 08:33:27.722341 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 14 08:33:27 crc kubenswrapper[4886]: I0314 08:33:27.803649 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 14 08:33:27 crc kubenswrapper[4886]: I0314 08:33:27.853392 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 14 08:33:27 crc kubenswrapper[4886]: I0314 08:33:27.870817 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 14 08:33:27 crc kubenswrapper[4886]: I0314 08:33:27.952880 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 14 08:33:27 crc kubenswrapper[4886]: I0314 08:33:27.994991 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 14 08:33:28 crc kubenswrapper[4886]: I0314 08:33:28.037929 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 14 08:33:28 crc kubenswrapper[4886]: I0314 08:33:28.077160 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 14 08:33:28 crc kubenswrapper[4886]: I0314 08:33:28.102621 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 14 08:33:28 crc kubenswrapper[4886]: I0314 08:33:28.149169 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 14 08:33:28 crc 
kubenswrapper[4886]: I0314 08:33:28.161886 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 14 08:33:28 crc kubenswrapper[4886]: I0314 08:33:28.248901 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 14 08:33:28 crc kubenswrapper[4886]: I0314 08:33:28.252486 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 14 08:33:28 crc kubenswrapper[4886]: I0314 08:33:28.305814 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 14 08:33:28 crc kubenswrapper[4886]: I0314 08:33:28.395221 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 14 08:33:28 crc kubenswrapper[4886]: I0314 08:33:28.408691 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 14 08:33:28 crc kubenswrapper[4886]: I0314 08:33:28.427815 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 14 08:33:28 crc kubenswrapper[4886]: I0314 08:33:28.437876 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 14 08:33:28 crc kubenswrapper[4886]: I0314 08:33:28.538785 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 14 08:33:28 crc kubenswrapper[4886]: I0314 08:33:28.605603 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 14 08:33:28 crc kubenswrapper[4886]: I0314 08:33:28.781524 4886 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"service-ca" Mar 14 08:33:28 crc kubenswrapper[4886]: I0314 08:33:28.880566 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 14 08:33:28 crc kubenswrapper[4886]: I0314 08:33:28.913929 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 14 08:33:28 crc kubenswrapper[4886]: I0314 08:33:28.949864 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 14 08:33:29 crc kubenswrapper[4886]: I0314 08:33:29.021211 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 14 08:33:29 crc kubenswrapper[4886]: I0314 08:33:29.117099 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 14 08:33:29 crc kubenswrapper[4886]: I0314 08:33:29.259961 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 14 08:33:29 crc kubenswrapper[4886]: I0314 08:33:29.325651 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 14 08:33:29 crc kubenswrapper[4886]: I0314 08:33:29.341178 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 14 08:33:29 crc kubenswrapper[4886]: I0314 08:33:29.360413 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 14 08:33:29 crc kubenswrapper[4886]: I0314 08:33:29.370016 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 14 08:33:29 crc kubenswrapper[4886]: I0314 08:33:29.399372 4886 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 14 08:33:29 crc kubenswrapper[4886]: I0314 08:33:29.628364 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 14 08:33:29 crc kubenswrapper[4886]: I0314 08:33:29.654464 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 14 08:33:29 crc kubenswrapper[4886]: I0314 08:33:29.675598 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 14 08:33:29 crc kubenswrapper[4886]: I0314 08:33:29.721647 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 14 08:33:29 crc kubenswrapper[4886]: I0314 08:33:29.770285 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 14 08:33:29 crc kubenswrapper[4886]: I0314 08:33:29.775989 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 14 08:33:29 crc kubenswrapper[4886]: I0314 08:33:29.820345 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 14 08:33:29 crc kubenswrapper[4886]: I0314 08:33:29.851733 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 14 08:33:29 crc kubenswrapper[4886]: I0314 08:33:29.893074 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 14 08:33:29 crc kubenswrapper[4886]: I0314 08:33:29.989412 4886 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 14 08:33:30 crc kubenswrapper[4886]: I0314 08:33:30.007502 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 14 08:33:30 crc kubenswrapper[4886]: I0314 08:33:30.121070 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 14 08:33:30 crc kubenswrapper[4886]: I0314 08:33:30.138420 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 08:33:30 crc kubenswrapper[4886]: I0314 08:33:30.193948 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 14 08:33:30 crc kubenswrapper[4886]: I0314 08:33:30.213978 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 14 08:33:30 crc kubenswrapper[4886]: I0314 08:33:30.285669 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 14 08:33:30 crc kubenswrapper[4886]: I0314 08:33:30.347823 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 14 08:33:30 crc kubenswrapper[4886]: I0314 08:33:30.363900 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 14 08:33:30 crc kubenswrapper[4886]: I0314 08:33:30.378865 4886 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 14 08:33:30 crc kubenswrapper[4886]: I0314 08:33:30.443640 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 14 08:33:30 crc kubenswrapper[4886]: I0314 08:33:30.585460 4886 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 14 08:33:30 crc kubenswrapper[4886]: I0314 08:33:30.636837 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 14 08:33:30 crc kubenswrapper[4886]: I0314 08:33:30.667602 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 14 08:33:30 crc kubenswrapper[4886]: I0314 08:33:30.699079 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 14 08:33:30 crc kubenswrapper[4886]: I0314 08:33:30.780886 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 14 08:33:30 crc kubenswrapper[4886]: I0314 08:33:30.892029 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 14 08:33:30 crc kubenswrapper[4886]: I0314 08:33:30.982879 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 14 08:33:30 crc kubenswrapper[4886]: I0314 08:33:30.986466 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 14 08:33:31 crc kubenswrapper[4886]: I0314 08:33:31.030106 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 14 08:33:31 crc kubenswrapper[4886]: I0314 08:33:31.107674 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 14 08:33:31 crc kubenswrapper[4886]: I0314 08:33:31.131562 4886 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 14 08:33:31 crc kubenswrapper[4886]: I0314 08:33:31.169546 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 14 08:33:31 crc kubenswrapper[4886]: I0314 08:33:31.245302 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 14 08:33:31 crc kubenswrapper[4886]: I0314 08:33:31.273584 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 14 08:33:31 crc kubenswrapper[4886]: I0314 08:33:31.292624 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 14 08:33:31 crc kubenswrapper[4886]: I0314 08:33:31.301053 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 14 08:33:31 crc kubenswrapper[4886]: I0314 08:33:31.381788 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 14 08:33:31 crc kubenswrapper[4886]: I0314 08:33:31.388717 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 14 08:33:31 crc kubenswrapper[4886]: I0314 08:33:31.439493 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 14 08:33:31 crc kubenswrapper[4886]: I0314 08:33:31.490974 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 14 08:33:31 crc kubenswrapper[4886]: I0314 08:33:31.670711 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 14 08:33:31 crc 
kubenswrapper[4886]: I0314 08:33:31.685552 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 14 08:33:31 crc kubenswrapper[4886]: I0314 08:33:31.988751 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 14 08:33:32 crc kubenswrapper[4886]: I0314 08:33:32.045440 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 14 08:33:32 crc kubenswrapper[4886]: I0314 08:33:32.051472 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 14 08:33:32 crc kubenswrapper[4886]: I0314 08:33:32.058082 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 14 08:33:32 crc kubenswrapper[4886]: I0314 08:33:32.076354 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 14 08:33:32 crc kubenswrapper[4886]: I0314 08:33:32.193081 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 14 08:33:32 crc kubenswrapper[4886]: I0314 08:33:32.293715 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 14 08:33:32 crc kubenswrapper[4886]: I0314 08:33:32.414319 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 14 08:33:32 crc kubenswrapper[4886]: I0314 08:33:32.435044 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 14 08:33:32 crc kubenswrapper[4886]: I0314 08:33:32.443753 4886 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"iptables-alerter-script" Mar 14 08:33:32 crc kubenswrapper[4886]: I0314 08:33:32.481018 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 14 08:33:32 crc kubenswrapper[4886]: I0314 08:33:32.524800 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 14 08:33:32 crc kubenswrapper[4886]: I0314 08:33:32.577649 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 14 08:33:32 crc kubenswrapper[4886]: I0314 08:33:32.583550 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 14 08:33:32 crc kubenswrapper[4886]: I0314 08:33:32.683953 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 14 08:33:32 crc kubenswrapper[4886]: I0314 08:33:32.807955 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 14 08:33:32 crc kubenswrapper[4886]: I0314 08:33:32.896257 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 14 08:33:33 crc kubenswrapper[4886]: I0314 08:33:33.102217 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 14 08:33:33 crc kubenswrapper[4886]: I0314 08:33:33.108238 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 14 08:33:33 crc kubenswrapper[4886]: I0314 08:33:33.208237 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 14 08:33:33 crc kubenswrapper[4886]: I0314 
08:33:33.276872 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 14 08:33:33 crc kubenswrapper[4886]: I0314 08:33:33.312107 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 14 08:33:33 crc kubenswrapper[4886]: I0314 08:33:33.364872 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 14 08:33:33 crc kubenswrapper[4886]: I0314 08:33:33.395370 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 08:33:33 crc kubenswrapper[4886]: I0314 08:33:33.413276 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 14 08:33:33 crc kubenswrapper[4886]: I0314 08:33:33.639175 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 14 08:33:33 crc kubenswrapper[4886]: I0314 08:33:33.831075 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 14 08:33:33 crc kubenswrapper[4886]: I0314 08:33:33.843219 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 14 08:33:34 crc kubenswrapper[4886]: I0314 08:33:34.014256 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 14 08:33:34 crc kubenswrapper[4886]: I0314 08:33:34.015832 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 14 08:33:34 crc kubenswrapper[4886]: I0314 08:33:34.017645 4886 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 14 08:33:34 crc kubenswrapper[4886]: I0314 08:33:34.045661 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 14 08:33:34 crc kubenswrapper[4886]: I0314 08:33:34.070938 4886 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 14 08:33:34 crc kubenswrapper[4886]: I0314 08:33:34.177849 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 14 08:33:34 crc kubenswrapper[4886]: I0314 08:33:34.237345 4886 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 14 08:33:34 crc kubenswrapper[4886]: I0314 08:33:34.427630 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 14 08:33:34 crc kubenswrapper[4886]: I0314 08:33:34.740436 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 14 08:33:34 crc kubenswrapper[4886]: I0314 08:33:34.992170 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 14 08:33:35 crc kubenswrapper[4886]: I0314 08:33:35.062569 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 14 08:33:35 crc kubenswrapper[4886]: I0314 08:33:35.192480 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 14 08:33:35 crc kubenswrapper[4886]: I0314 08:33:35.381826 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 14 08:33:53 crc kubenswrapper[4886]: I0314 08:33:53.382774 4886 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 14 08:33:55 crc kubenswrapper[4886]: I0314 08:33:55.565026 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.057697 4886 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.063667 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.063714 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-66f8689f66-t5n86"] Mar 14 08:33:58 crc kubenswrapper[4886]: E0314 08:33:58.063952 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c85c6296-9423-4f0a-b5cd-d265dd437a77" containerName="installer" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.063969 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="c85c6296-9423-4f0a-b5cd-d265dd437a77" containerName="installer" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.064085 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="c85c6296-9423-4f0a-b5cd-d265dd437a77" containerName="installer" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.064575 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6dcd48867-cthq8","openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48"] Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.064623 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.064700 4886 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e02166ba-9748-4add-8980-e4d799092d18" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.064771 4886 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e02166ba-9748-4add-8980-e4d799092d18" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.064790 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6dcd48867-cthq8" podUID="d13d372b-e10b-4bf8-bf54-99836e06cc85" containerName="controller-manager" containerID="cri-o://5a7c246c7637779b5b7898a6da26f302a6fe3f8d8b1f8db28dda011178fe9a4e" gracePeriod=30 Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.064943 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48" podUID="07c02e5d-5af8-43a0-8485-d2c5f2344525" containerName="route-controller-manager" containerID="cri-o://86bb3a8ef83443128029b16b69816c6f0070793d097f44e956ef92b1230e88f4" gracePeriod=30 Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.070802 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.071016 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.070929 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.071264 4886 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.072250 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.077355 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.077708 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.077897 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.077718 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.077784 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.080573 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.080616 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.080706 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.083594 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 
08:33:58.093492 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.099568 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.112478 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=49.112457779 podStartE2EDuration="49.112457779s" podCreationTimestamp="2026-03-14 08:33:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:33:58.111006077 +0000 UTC m=+373.359457714" watchObservedRunningTime="2026-03-14 08:33:58.112457779 +0000 UTC m=+373.360909416" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.139572 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.139708 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.139741 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-user-template-login\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.139944 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.140030 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/64100150-0aa0-470f-b9ba-43bee08023ca-audit-dir\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.140085 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.140109 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-user-template-error\") pod 
\"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.140183 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djqj8\" (UniqueName: \"kubernetes.io/projected/64100150-0aa0-470f-b9ba-43bee08023ca-kube-api-access-djqj8\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.140209 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-system-router-certs\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.140280 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/64100150-0aa0-470f-b9ba-43bee08023ca-audit-policies\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.140297 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-system-session\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 
08:33:58.140350 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.140369 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-system-service-ca\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.140415 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.241262 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.241324 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-user-template-error\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.241343 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djqj8\" (UniqueName: \"kubernetes.io/projected/64100150-0aa0-470f-b9ba-43bee08023ca-kube-api-access-djqj8\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.241374 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-system-router-certs\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.241398 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/64100150-0aa0-470f-b9ba-43bee08023ca-audit-policies\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.241413 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-system-session\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " 
pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.241431 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.241452 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-system-service-ca\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.241470 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.241501 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.241523 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.241541 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-user-template-login\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.241569 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.241604 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/64100150-0aa0-470f-b9ba-43bee08023ca-audit-dir\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.241673 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/64100150-0aa0-470f-b9ba-43bee08023ca-audit-dir\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " 
pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.243325 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/64100150-0aa0-470f-b9ba-43bee08023ca-audit-policies\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.243374 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-system-service-ca\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.244074 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.244569 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.252208 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-user-template-error\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.252431 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-system-session\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.252623 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-user-template-login\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.253645 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.257733 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 
crc kubenswrapper[4886]: I0314 08:33:58.264417 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.268302 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-system-router-certs\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.272151 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djqj8\" (UniqueName: \"kubernetes.io/projected/64100150-0aa0-470f-b9ba-43bee08023ca-kube-api-access-djqj8\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.295654 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/64100150-0aa0-470f-b9ba-43bee08023ca-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66f8689f66-t5n86\" (UID: \"64100150-0aa0-470f-b9ba-43bee08023ca\") " pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.390234 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.480780 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dcd48867-cthq8" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.521603 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7fdc9849d6-df7m9"] Mar 14 08:33:58 crc kubenswrapper[4886]: E0314 08:33:58.521854 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d13d372b-e10b-4bf8-bf54-99836e06cc85" containerName="controller-manager" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.521869 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d13d372b-e10b-4bf8-bf54-99836e06cc85" containerName="controller-manager" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.522013 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="d13d372b-e10b-4bf8-bf54-99836e06cc85" containerName="controller-manager" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.522455 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7fdc9849d6-df7m9" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.526716 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7fdc9849d6-df7m9"] Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.545014 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d13d372b-e10b-4bf8-bf54-99836e06cc85-client-ca\") pod \"d13d372b-e10b-4bf8-bf54-99836e06cc85\" (UID: \"d13d372b-e10b-4bf8-bf54-99836e06cc85\") " Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.545077 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d13d372b-e10b-4bf8-bf54-99836e06cc85-config\") pod \"d13d372b-e10b-4bf8-bf54-99836e06cc85\" (UID: \"d13d372b-e10b-4bf8-bf54-99836e06cc85\") " Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.545189 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmg2r\" (UniqueName: \"kubernetes.io/projected/d13d372b-e10b-4bf8-bf54-99836e06cc85-kube-api-access-bmg2r\") pod \"d13d372b-e10b-4bf8-bf54-99836e06cc85\" (UID: \"d13d372b-e10b-4bf8-bf54-99836e06cc85\") " Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.545976 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d13d372b-e10b-4bf8-bf54-99836e06cc85-client-ca" (OuterVolumeSpecName: "client-ca") pod "d13d372b-e10b-4bf8-bf54-99836e06cc85" (UID: "d13d372b-e10b-4bf8-bf54-99836e06cc85"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.546022 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d13d372b-e10b-4bf8-bf54-99836e06cc85-proxy-ca-bundles\") pod \"d13d372b-e10b-4bf8-bf54-99836e06cc85\" (UID: \"d13d372b-e10b-4bf8-bf54-99836e06cc85\") " Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.546061 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d13d372b-e10b-4bf8-bf54-99836e06cc85-serving-cert\") pod \"d13d372b-e10b-4bf8-bf54-99836e06cc85\" (UID: \"d13d372b-e10b-4bf8-bf54-99836e06cc85\") " Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.546481 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d13d372b-e10b-4bf8-bf54-99836e06cc85-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d13d372b-e10b-4bf8-bf54-99836e06cc85" (UID: "d13d372b-e10b-4bf8-bf54-99836e06cc85"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.546555 4886 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d13d372b-e10b-4bf8-bf54-99836e06cc85-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.546573 4886 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d13d372b-e10b-4bf8-bf54-99836e06cc85-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.546613 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d13d372b-e10b-4bf8-bf54-99836e06cc85-config" (OuterVolumeSpecName: "config") pod "d13d372b-e10b-4bf8-bf54-99836e06cc85" (UID: "d13d372b-e10b-4bf8-bf54-99836e06cc85"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.549978 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d13d372b-e10b-4bf8-bf54-99836e06cc85-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d13d372b-e10b-4bf8-bf54-99836e06cc85" (UID: "d13d372b-e10b-4bf8-bf54-99836e06cc85"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.551471 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d13d372b-e10b-4bf8-bf54-99836e06cc85-kube-api-access-bmg2r" (OuterVolumeSpecName: "kube-api-access-bmg2r") pod "d13d372b-e10b-4bf8-bf54-99836e06cc85" (UID: "d13d372b-e10b-4bf8-bf54-99836e06cc85"). InnerVolumeSpecName "kube-api-access-bmg2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.554497 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.646861 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07c02e5d-5af8-43a0-8485-d2c5f2344525-serving-cert\") pod \"07c02e5d-5af8-43a0-8485-d2c5f2344525\" (UID: \"07c02e5d-5af8-43a0-8485-d2c5f2344525\") " Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.646934 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhflf\" (UniqueName: \"kubernetes.io/projected/07c02e5d-5af8-43a0-8485-d2c5f2344525-kube-api-access-nhflf\") pod \"07c02e5d-5af8-43a0-8485-d2c5f2344525\" (UID: \"07c02e5d-5af8-43a0-8485-d2c5f2344525\") " Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.646993 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07c02e5d-5af8-43a0-8485-d2c5f2344525-client-ca\") pod \"07c02e5d-5af8-43a0-8485-d2c5f2344525\" (UID: \"07c02e5d-5af8-43a0-8485-d2c5f2344525\") " Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.647014 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07c02e5d-5af8-43a0-8485-d2c5f2344525-config\") pod \"07c02e5d-5af8-43a0-8485-d2c5f2344525\" (UID: \"07c02e5d-5af8-43a0-8485-d2c5f2344525\") " Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.647219 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d3c670e-9026-4f83-8bfc-1c42774c7851-client-ca\") pod \"controller-manager-7fdc9849d6-df7m9\" (UID: 
\"3d3c670e-9026-4f83-8bfc-1c42774c7851\") " pod="openshift-controller-manager/controller-manager-7fdc9849d6-df7m9" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.647276 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jslzm\" (UniqueName: \"kubernetes.io/projected/3d3c670e-9026-4f83-8bfc-1c42774c7851-kube-api-access-jslzm\") pod \"controller-manager-7fdc9849d6-df7m9\" (UID: \"3d3c670e-9026-4f83-8bfc-1c42774c7851\") " pod="openshift-controller-manager/controller-manager-7fdc9849d6-df7m9" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.647321 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d3c670e-9026-4f83-8bfc-1c42774c7851-config\") pod \"controller-manager-7fdc9849d6-df7m9\" (UID: \"3d3c670e-9026-4f83-8bfc-1c42774c7851\") " pod="openshift-controller-manager/controller-manager-7fdc9849d6-df7m9" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.647370 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3d3c670e-9026-4f83-8bfc-1c42774c7851-proxy-ca-bundles\") pod \"controller-manager-7fdc9849d6-df7m9\" (UID: \"3d3c670e-9026-4f83-8bfc-1c42774c7851\") " pod="openshift-controller-manager/controller-manager-7fdc9849d6-df7m9" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.647397 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d3c670e-9026-4f83-8bfc-1c42774c7851-serving-cert\") pod \"controller-manager-7fdc9849d6-df7m9\" (UID: \"3d3c670e-9026-4f83-8bfc-1c42774c7851\") " pod="openshift-controller-manager/controller-manager-7fdc9849d6-df7m9" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.647447 4886 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/d13d372b-e10b-4bf8-bf54-99836e06cc85-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.647462 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmg2r\" (UniqueName: \"kubernetes.io/projected/d13d372b-e10b-4bf8-bf54-99836e06cc85-kube-api-access-bmg2r\") on node \"crc\" DevicePath \"\"" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.647475 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d13d372b-e10b-4bf8-bf54-99836e06cc85-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.655013 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07c02e5d-5af8-43a0-8485-d2c5f2344525-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "07c02e5d-5af8-43a0-8485-d2c5f2344525" (UID: "07c02e5d-5af8-43a0-8485-d2c5f2344525"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.656322 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07c02e5d-5af8-43a0-8485-d2c5f2344525-config" (OuterVolumeSpecName: "config") pod "07c02e5d-5af8-43a0-8485-d2c5f2344525" (UID: "07c02e5d-5af8-43a0-8485-d2c5f2344525"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.656576 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07c02e5d-5af8-43a0-8485-d2c5f2344525-client-ca" (OuterVolumeSpecName: "client-ca") pod "07c02e5d-5af8-43a0-8485-d2c5f2344525" (UID: "07c02e5d-5af8-43a0-8485-d2c5f2344525"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.658200 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07c02e5d-5af8-43a0-8485-d2c5f2344525-kube-api-access-nhflf" (OuterVolumeSpecName: "kube-api-access-nhflf") pod "07c02e5d-5af8-43a0-8485-d2c5f2344525" (UID: "07c02e5d-5af8-43a0-8485-d2c5f2344525"). InnerVolumeSpecName "kube-api-access-nhflf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.748264 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d3c670e-9026-4f83-8bfc-1c42774c7851-client-ca\") pod \"controller-manager-7fdc9849d6-df7m9\" (UID: \"3d3c670e-9026-4f83-8bfc-1c42774c7851\") " pod="openshift-controller-manager/controller-manager-7fdc9849d6-df7m9" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.748531 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jslzm\" (UniqueName: \"kubernetes.io/projected/3d3c670e-9026-4f83-8bfc-1c42774c7851-kube-api-access-jslzm\") pod \"controller-manager-7fdc9849d6-df7m9\" (UID: \"3d3c670e-9026-4f83-8bfc-1c42774c7851\") " pod="openshift-controller-manager/controller-manager-7fdc9849d6-df7m9" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.748567 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d3c670e-9026-4f83-8bfc-1c42774c7851-config\") pod \"controller-manager-7fdc9849d6-df7m9\" (UID: \"3d3c670e-9026-4f83-8bfc-1c42774c7851\") " pod="openshift-controller-manager/controller-manager-7fdc9849d6-df7m9" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.748603 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/3d3c670e-9026-4f83-8bfc-1c42774c7851-proxy-ca-bundles\") pod \"controller-manager-7fdc9849d6-df7m9\" (UID: \"3d3c670e-9026-4f83-8bfc-1c42774c7851\") " pod="openshift-controller-manager/controller-manager-7fdc9849d6-df7m9" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.748627 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d3c670e-9026-4f83-8bfc-1c42774c7851-serving-cert\") pod \"controller-manager-7fdc9849d6-df7m9\" (UID: \"3d3c670e-9026-4f83-8bfc-1c42774c7851\") " pod="openshift-controller-manager/controller-manager-7fdc9849d6-df7m9" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.748660 4886 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07c02e5d-5af8-43a0-8485-d2c5f2344525-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.748671 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07c02e5d-5af8-43a0-8485-d2c5f2344525-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.748681 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07c02e5d-5af8-43a0-8485-d2c5f2344525-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.748691 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhflf\" (UniqueName: \"kubernetes.io/projected/07c02e5d-5af8-43a0-8485-d2c5f2344525-kube-api-access-nhflf\") on node \"crc\" DevicePath \"\"" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.749094 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d3c670e-9026-4f83-8bfc-1c42774c7851-client-ca\") pod \"controller-manager-7fdc9849d6-df7m9\" 
(UID: \"3d3c670e-9026-4f83-8bfc-1c42774c7851\") " pod="openshift-controller-manager/controller-manager-7fdc9849d6-df7m9" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.752198 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3d3c670e-9026-4f83-8bfc-1c42774c7851-proxy-ca-bundles\") pod \"controller-manager-7fdc9849d6-df7m9\" (UID: \"3d3c670e-9026-4f83-8bfc-1c42774c7851\") " pod="openshift-controller-manager/controller-manager-7fdc9849d6-df7m9" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.758626 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d3c670e-9026-4f83-8bfc-1c42774c7851-config\") pod \"controller-manager-7fdc9849d6-df7m9\" (UID: \"3d3c670e-9026-4f83-8bfc-1c42774c7851\") " pod="openshift-controller-manager/controller-manager-7fdc9849d6-df7m9" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.765063 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d3c670e-9026-4f83-8bfc-1c42774c7851-serving-cert\") pod \"controller-manager-7fdc9849d6-df7m9\" (UID: \"3d3c670e-9026-4f83-8bfc-1c42774c7851\") " pod="openshift-controller-manager/controller-manager-7fdc9849d6-df7m9" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.768450 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jslzm\" (UniqueName: \"kubernetes.io/projected/3d3c670e-9026-4f83-8bfc-1c42774c7851-kube-api-access-jslzm\") pod \"controller-manager-7fdc9849d6-df7m9\" (UID: \"3d3c670e-9026-4f83-8bfc-1c42774c7851\") " pod="openshift-controller-manager/controller-manager-7fdc9849d6-df7m9" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.845901 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7fdc9849d6-df7m9" Mar 14 08:33:58 crc kubenswrapper[4886]: I0314 08:33:58.899299 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-66f8689f66-t5n86"] Mar 14 08:33:59 crc kubenswrapper[4886]: I0314 08:33:59.033300 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" event={"ID":"64100150-0aa0-470f-b9ba-43bee08023ca","Type":"ContainerStarted","Data":"07934e2dbc96e9a2a9255448d51f5b9666d871915c8bf13b266c1fc401af6940"} Mar 14 08:33:59 crc kubenswrapper[4886]: I0314 08:33:59.037405 4886 generic.go:334] "Generic (PLEG): container finished" podID="d13d372b-e10b-4bf8-bf54-99836e06cc85" containerID="5a7c246c7637779b5b7898a6da26f302a6fe3f8d8b1f8db28dda011178fe9a4e" exitCode=0 Mar 14 08:33:59 crc kubenswrapper[4886]: I0314 08:33:59.037451 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dcd48867-cthq8" event={"ID":"d13d372b-e10b-4bf8-bf54-99836e06cc85","Type":"ContainerDied","Data":"5a7c246c7637779b5b7898a6da26f302a6fe3f8d8b1f8db28dda011178fe9a4e"} Mar 14 08:33:59 crc kubenswrapper[4886]: I0314 08:33:59.037471 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dcd48867-cthq8" event={"ID":"d13d372b-e10b-4bf8-bf54-99836e06cc85","Type":"ContainerDied","Data":"e65f423afb73803c548464bf241167224d538e2943a90d8800230842bc7e3f4a"} Mar 14 08:33:59 crc kubenswrapper[4886]: I0314 08:33:59.037487 4886 scope.go:117] "RemoveContainer" containerID="5a7c246c7637779b5b7898a6da26f302a6fe3f8d8b1f8db28dda011178fe9a4e" Mar 14 08:33:59 crc kubenswrapper[4886]: I0314 08:33:59.037599 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6dcd48867-cthq8" Mar 14 08:33:59 crc kubenswrapper[4886]: I0314 08:33:59.042399 4886 generic.go:334] "Generic (PLEG): container finished" podID="07c02e5d-5af8-43a0-8485-d2c5f2344525" containerID="86bb3a8ef83443128029b16b69816c6f0070793d097f44e956ef92b1230e88f4" exitCode=0 Mar 14 08:33:59 crc kubenswrapper[4886]: I0314 08:33:59.042465 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48" Mar 14 08:33:59 crc kubenswrapper[4886]: I0314 08:33:59.042493 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48" event={"ID":"07c02e5d-5af8-43a0-8485-d2c5f2344525","Type":"ContainerDied","Data":"86bb3a8ef83443128029b16b69816c6f0070793d097f44e956ef92b1230e88f4"} Mar 14 08:33:59 crc kubenswrapper[4886]: I0314 08:33:59.042538 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48" event={"ID":"07c02e5d-5af8-43a0-8485-d2c5f2344525","Type":"ContainerDied","Data":"c523b126af487cfb6d6edd32f887b2f54282710d37930b111a88cfe257158bcb"} Mar 14 08:33:59 crc kubenswrapper[4886]: I0314 08:33:59.077200 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7fdc9849d6-df7m9"] Mar 14 08:33:59 crc kubenswrapper[4886]: I0314 08:33:59.084302 4886 scope.go:117] "RemoveContainer" containerID="5a7c246c7637779b5b7898a6da26f302a6fe3f8d8b1f8db28dda011178fe9a4e" Mar 14 08:33:59 crc kubenswrapper[4886]: E0314 08:33:59.085238 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a7c246c7637779b5b7898a6da26f302a6fe3f8d8b1f8db28dda011178fe9a4e\": container with ID starting with 
5a7c246c7637779b5b7898a6da26f302a6fe3f8d8b1f8db28dda011178fe9a4e not found: ID does not exist" containerID="5a7c246c7637779b5b7898a6da26f302a6fe3f8d8b1f8db28dda011178fe9a4e" Mar 14 08:33:59 crc kubenswrapper[4886]: I0314 08:33:59.085283 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a7c246c7637779b5b7898a6da26f302a6fe3f8d8b1f8db28dda011178fe9a4e"} err="failed to get container status \"5a7c246c7637779b5b7898a6da26f302a6fe3f8d8b1f8db28dda011178fe9a4e\": rpc error: code = NotFound desc = could not find container \"5a7c246c7637779b5b7898a6da26f302a6fe3f8d8b1f8db28dda011178fe9a4e\": container with ID starting with 5a7c246c7637779b5b7898a6da26f302a6fe3f8d8b1f8db28dda011178fe9a4e not found: ID does not exist" Mar 14 08:33:59 crc kubenswrapper[4886]: I0314 08:33:59.085314 4886 scope.go:117] "RemoveContainer" containerID="86bb3a8ef83443128029b16b69816c6f0070793d097f44e956ef92b1230e88f4" Mar 14 08:33:59 crc kubenswrapper[4886]: I0314 08:33:59.086894 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6dcd48867-cthq8"] Mar 14 08:33:59 crc kubenswrapper[4886]: I0314 08:33:59.091939 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6dcd48867-cthq8"] Mar 14 08:33:59 crc kubenswrapper[4886]: W0314 08:33:59.095555 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d3c670e_9026_4f83_8bfc_1c42774c7851.slice/crio-0f6207e24402d7b06426489629bff7f9795dbbcef54778b17b9a8fffa343ac0e WatchSource:0}: Error finding container 0f6207e24402d7b06426489629bff7f9795dbbcef54778b17b9a8fffa343ac0e: Status 404 returned error can't find the container with id 0f6207e24402d7b06426489629bff7f9795dbbcef54778b17b9a8fffa343ac0e Mar 14 08:33:59 crc kubenswrapper[4886]: I0314 08:33:59.095627 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48"] Mar 14 08:33:59 crc kubenswrapper[4886]: I0314 08:33:59.098948 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74d86c6ff7-kbd48"] Mar 14 08:33:59 crc kubenswrapper[4886]: I0314 08:33:59.106417 4886 scope.go:117] "RemoveContainer" containerID="86bb3a8ef83443128029b16b69816c6f0070793d097f44e956ef92b1230e88f4" Mar 14 08:33:59 crc kubenswrapper[4886]: E0314 08:33:59.108266 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86bb3a8ef83443128029b16b69816c6f0070793d097f44e956ef92b1230e88f4\": container with ID starting with 86bb3a8ef83443128029b16b69816c6f0070793d097f44e956ef92b1230e88f4 not found: ID does not exist" containerID="86bb3a8ef83443128029b16b69816c6f0070793d097f44e956ef92b1230e88f4" Mar 14 08:33:59 crc kubenswrapper[4886]: I0314 08:33:59.108354 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86bb3a8ef83443128029b16b69816c6f0070793d097f44e956ef92b1230e88f4"} err="failed to get container status \"86bb3a8ef83443128029b16b69816c6f0070793d097f44e956ef92b1230e88f4\": rpc error: code = NotFound desc = could not find container \"86bb3a8ef83443128029b16b69816c6f0070793d097f44e956ef92b1230e88f4\": container with ID starting with 86bb3a8ef83443128029b16b69816c6f0070793d097f44e956ef92b1230e88f4 not found: ID does not exist" Mar 14 08:33:59 crc kubenswrapper[4886]: I0314 08:33:59.427264 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07c02e5d-5af8-43a0-8485-d2c5f2344525" path="/var/lib/kubelet/pods/07c02e5d-5af8-43a0-8485-d2c5f2344525/volumes" Mar 14 08:33:59 crc kubenswrapper[4886]: I0314 08:33:59.427779 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d13d372b-e10b-4bf8-bf54-99836e06cc85" 
path="/var/lib/kubelet/pods/d13d372b-e10b-4bf8-bf54-99836e06cc85/volumes" Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.050682 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fdc9849d6-df7m9" event={"ID":"3d3c670e-9026-4f83-8bfc-1c42774c7851","Type":"ContainerStarted","Data":"3311029abf84f130118e4b36886a38b58bc28bf7f2270b64faf5a4a2574912b2"} Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.050999 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fdc9849d6-df7m9" event={"ID":"3d3c670e-9026-4f83-8bfc-1c42774c7851","Type":"ContainerStarted","Data":"0f6207e24402d7b06426489629bff7f9795dbbcef54778b17b9a8fffa343ac0e"} Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.051147 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7fdc9849d6-df7m9" Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.052391 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" event={"ID":"64100150-0aa0-470f-b9ba-43bee08023ca","Type":"ContainerStarted","Data":"69bc4125afe0d736588955fc8da6db969cb2654106485287af82afb358f5248a"} Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.055469 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7fdc9849d6-df7m9" Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.097740 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7fdc9849d6-df7m9" podStartSLOduration=13.097716925 podStartE2EDuration="13.097716925s" podCreationTimestamp="2026-03-14 08:33:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:34:00.073498958 +0000 UTC 
m=+375.321950605" watchObservedRunningTime="2026-03-14 08:34:00.097716925 +0000 UTC m=+375.346168572" Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.128995 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" podStartSLOduration=101.128978625 podStartE2EDuration="1m41.128978625s" podCreationTimestamp="2026-03-14 08:32:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:34:00.126934356 +0000 UTC m=+375.375385993" watchObservedRunningTime="2026-03-14 08:34:00.128978625 +0000 UTC m=+375.377430272" Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.191788 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557954-jq2zf"] Mar 14 08:34:00 crc kubenswrapper[4886]: E0314 08:34:00.192006 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07c02e5d-5af8-43a0-8485-d2c5f2344525" containerName="route-controller-manager" Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.192020 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="07c02e5d-5af8-43a0-8485-d2c5f2344525" containerName="route-controller-manager" Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.192113 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="07c02e5d-5af8-43a0-8485-d2c5f2344525" containerName="route-controller-manager" Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.192485 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557954-jq2zf" Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.194402 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.194490 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.195087 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.205644 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557954-jq2zf"] Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.269572 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbxcb\" (UniqueName: \"kubernetes.io/projected/4f2ea03d-a434-4da8-a1e1-18532cb7e0e8-kube-api-access-sbxcb\") pod \"auto-csr-approver-29557954-jq2zf\" (UID: \"4f2ea03d-a434-4da8-a1e1-18532cb7e0e8\") " pod="openshift-infra/auto-csr-approver-29557954-jq2zf" Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.371888 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbxcb\" (UniqueName: \"kubernetes.io/projected/4f2ea03d-a434-4da8-a1e1-18532cb7e0e8-kube-api-access-sbxcb\") pod \"auto-csr-approver-29557954-jq2zf\" (UID: \"4f2ea03d-a434-4da8-a1e1-18532cb7e0e8\") " pod="openshift-infra/auto-csr-approver-29557954-jq2zf" Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.404625 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbxcb\" (UniqueName: \"kubernetes.io/projected/4f2ea03d-a434-4da8-a1e1-18532cb7e0e8-kube-api-access-sbxcb\") pod \"auto-csr-approver-29557954-jq2zf\" (UID: \"4f2ea03d-a434-4da8-a1e1-18532cb7e0e8\") " 
pod="openshift-infra/auto-csr-approver-29557954-jq2zf" Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.516025 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557954-jq2zf" Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.707962 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557954-jq2zf"] Mar 14 08:34:00 crc kubenswrapper[4886]: W0314 08:34:00.713766 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f2ea03d_a434_4da8_a1e1_18532cb7e0e8.slice/crio-fa846d8406709fa4fa53d9d69a7ce9f8c1a7032a146905c99156982357eaac6c WatchSource:0}: Error finding container fa846d8406709fa4fa53d9d69a7ce9f8c1a7032a146905c99156982357eaac6c: Status 404 returned error can't find the container with id fa846d8406709fa4fa53d9d69a7ce9f8c1a7032a146905c99156982357eaac6c Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.824840 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn"] Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.825776 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn" Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.827900 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.828226 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.828250 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.828298 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.828411 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.829708 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.832942 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn"] Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.979518 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a52ef211-3ee8-4efd-9039-1a1383aaa347-client-ca\") pod \"route-controller-manager-cf466c567-lfqfn\" (UID: \"a52ef211-3ee8-4efd-9039-1a1383aaa347\") " pod="openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn" Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.979590 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a52ef211-3ee8-4efd-9039-1a1383aaa347-config\") pod \"route-controller-manager-cf466c567-lfqfn\" (UID: \"a52ef211-3ee8-4efd-9039-1a1383aaa347\") " pod="openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn" Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.979628 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a52ef211-3ee8-4efd-9039-1a1383aaa347-serving-cert\") pod \"route-controller-manager-cf466c567-lfqfn\" (UID: \"a52ef211-3ee8-4efd-9039-1a1383aaa347\") " pod="openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn" Mar 14 08:34:00 crc kubenswrapper[4886]: I0314 08:34:00.979680 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngcx9\" (UniqueName: \"kubernetes.io/projected/a52ef211-3ee8-4efd-9039-1a1383aaa347-kube-api-access-ngcx9\") pod \"route-controller-manager-cf466c567-lfqfn\" (UID: \"a52ef211-3ee8-4efd-9039-1a1383aaa347\") " pod="openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn" Mar 14 08:34:01 crc kubenswrapper[4886]: I0314 08:34:01.060521 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557954-jq2zf" event={"ID":"4f2ea03d-a434-4da8-a1e1-18532cb7e0e8","Type":"ContainerStarted","Data":"fa846d8406709fa4fa53d9d69a7ce9f8c1a7032a146905c99156982357eaac6c"} Mar 14 08:34:01 crc kubenswrapper[4886]: I0314 08:34:01.061403 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 08:34:01 crc kubenswrapper[4886]: I0314 08:34:01.069006 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-66f8689f66-t5n86" Mar 14 
08:34:01 crc kubenswrapper[4886]: I0314 08:34:01.081327 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a52ef211-3ee8-4efd-9039-1a1383aaa347-client-ca\") pod \"route-controller-manager-cf466c567-lfqfn\" (UID: \"a52ef211-3ee8-4efd-9039-1a1383aaa347\") " pod="openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn" Mar 14 08:34:01 crc kubenswrapper[4886]: I0314 08:34:01.081410 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a52ef211-3ee8-4efd-9039-1a1383aaa347-config\") pod \"route-controller-manager-cf466c567-lfqfn\" (UID: \"a52ef211-3ee8-4efd-9039-1a1383aaa347\") " pod="openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn" Mar 14 08:34:01 crc kubenswrapper[4886]: I0314 08:34:01.081465 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a52ef211-3ee8-4efd-9039-1a1383aaa347-serving-cert\") pod \"route-controller-manager-cf466c567-lfqfn\" (UID: \"a52ef211-3ee8-4efd-9039-1a1383aaa347\") " pod="openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn" Mar 14 08:34:01 crc kubenswrapper[4886]: I0314 08:34:01.081541 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngcx9\" (UniqueName: \"kubernetes.io/projected/a52ef211-3ee8-4efd-9039-1a1383aaa347-kube-api-access-ngcx9\") pod \"route-controller-manager-cf466c567-lfqfn\" (UID: \"a52ef211-3ee8-4efd-9039-1a1383aaa347\") " pod="openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn" Mar 14 08:34:01 crc kubenswrapper[4886]: I0314 08:34:01.082845 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a52ef211-3ee8-4efd-9039-1a1383aaa347-config\") pod 
\"route-controller-manager-cf466c567-lfqfn\" (UID: \"a52ef211-3ee8-4efd-9039-1a1383aaa347\") " pod="openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn" Mar 14 08:34:01 crc kubenswrapper[4886]: I0314 08:34:01.084527 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a52ef211-3ee8-4efd-9039-1a1383aaa347-client-ca\") pod \"route-controller-manager-cf466c567-lfqfn\" (UID: \"a52ef211-3ee8-4efd-9039-1a1383aaa347\") " pod="openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn" Mar 14 08:34:01 crc kubenswrapper[4886]: I0314 08:34:01.096064 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a52ef211-3ee8-4efd-9039-1a1383aaa347-serving-cert\") pod \"route-controller-manager-cf466c567-lfqfn\" (UID: \"a52ef211-3ee8-4efd-9039-1a1383aaa347\") " pod="openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn" Mar 14 08:34:01 crc kubenswrapper[4886]: I0314 08:34:01.101557 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngcx9\" (UniqueName: \"kubernetes.io/projected/a52ef211-3ee8-4efd-9039-1a1383aaa347-kube-api-access-ngcx9\") pod \"route-controller-manager-cf466c567-lfqfn\" (UID: \"a52ef211-3ee8-4efd-9039-1a1383aaa347\") " pod="openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn" Mar 14 08:34:01 crc kubenswrapper[4886]: I0314 08:34:01.146652 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn" Mar 14 08:34:01 crc kubenswrapper[4886]: I0314 08:34:01.634099 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn"] Mar 14 08:34:01 crc kubenswrapper[4886]: W0314 08:34:01.654017 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda52ef211_3ee8_4efd_9039_1a1383aaa347.slice/crio-95e5ab52ad627bedd40574afeedcb1573a66eba1ab3e8cbaee2bb64fbb88aa1c WatchSource:0}: Error finding container 95e5ab52ad627bedd40574afeedcb1573a66eba1ab3e8cbaee2bb64fbb88aa1c: Status 404 returned error can't find the container with id 95e5ab52ad627bedd40574afeedcb1573a66eba1ab3e8cbaee2bb64fbb88aa1c Mar 14 08:34:02 crc kubenswrapper[4886]: I0314 08:34:02.073214 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn" event={"ID":"a52ef211-3ee8-4efd-9039-1a1383aaa347","Type":"ContainerStarted","Data":"2b3eaa67965abada36e1892143d62b64f8b391ccc800dab31d9847ddeab9d852"} Mar 14 08:34:02 crc kubenswrapper[4886]: I0314 08:34:02.073266 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn" event={"ID":"a52ef211-3ee8-4efd-9039-1a1383aaa347","Type":"ContainerStarted","Data":"95e5ab52ad627bedd40574afeedcb1573a66eba1ab3e8cbaee2bb64fbb88aa1c"} Mar 14 08:34:02 crc kubenswrapper[4886]: I0314 08:34:02.073442 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn" Mar 14 08:34:02 crc kubenswrapper[4886]: I0314 08:34:02.075548 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557954-jq2zf" 
event={"ID":"4f2ea03d-a434-4da8-a1e1-18532cb7e0e8","Type":"ContainerStarted","Data":"60690f219ea99479fa8770d1fb6d2809f83423dbceee4a64502919be7a6ba2fd"} Mar 14 08:34:02 crc kubenswrapper[4886]: I0314 08:34:02.092645 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn" podStartSLOduration=15.092628428 podStartE2EDuration="15.092628428s" podCreationTimestamp="2026-03-14 08:33:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:34:02.092113243 +0000 UTC m=+377.340564890" watchObservedRunningTime="2026-03-14 08:34:02.092628428 +0000 UTC m=+377.341080065" Mar 14 08:34:02 crc kubenswrapper[4886]: I0314 08:34:02.109099 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557954-jq2zf" podStartSLOduration=1.34896455 podStartE2EDuration="2.109084692s" podCreationTimestamp="2026-03-14 08:34:00 +0000 UTC" firstStartedPulling="2026-03-14 08:34:00.715020568 +0000 UTC m=+375.963472195" lastFinishedPulling="2026-03-14 08:34:01.47514069 +0000 UTC m=+376.723592337" observedRunningTime="2026-03-14 08:34:02.105862319 +0000 UTC m=+377.354313966" watchObservedRunningTime="2026-03-14 08:34:02.109084692 +0000 UTC m=+377.357536329" Mar 14 08:34:02 crc kubenswrapper[4886]: I0314 08:34:02.287522 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn" Mar 14 08:34:03 crc kubenswrapper[4886]: I0314 08:34:03.091598 4886 generic.go:334] "Generic (PLEG): container finished" podID="4f2ea03d-a434-4da8-a1e1-18532cb7e0e8" containerID="60690f219ea99479fa8770d1fb6d2809f83423dbceee4a64502919be7a6ba2fd" exitCode=0 Mar 14 08:34:03 crc kubenswrapper[4886]: I0314 08:34:03.092778 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29557954-jq2zf" event={"ID":"4f2ea03d-a434-4da8-a1e1-18532cb7e0e8","Type":"ContainerDied","Data":"60690f219ea99479fa8770d1fb6d2809f83423dbceee4a64502919be7a6ba2fd"} Mar 14 08:34:04 crc kubenswrapper[4886]: I0314 08:34:04.173521 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 14 08:34:04 crc kubenswrapper[4886]: I0314 08:34:04.478312 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557954-jq2zf" Mar 14 08:34:04 crc kubenswrapper[4886]: I0314 08:34:04.529979 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbxcb\" (UniqueName: \"kubernetes.io/projected/4f2ea03d-a434-4da8-a1e1-18532cb7e0e8-kube-api-access-sbxcb\") pod \"4f2ea03d-a434-4da8-a1e1-18532cb7e0e8\" (UID: \"4f2ea03d-a434-4da8-a1e1-18532cb7e0e8\") " Mar 14 08:34:04 crc kubenswrapper[4886]: I0314 08:34:04.539426 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f2ea03d-a434-4da8-a1e1-18532cb7e0e8-kube-api-access-sbxcb" (OuterVolumeSpecName: "kube-api-access-sbxcb") pod "4f2ea03d-a434-4da8-a1e1-18532cb7e0e8" (UID: "4f2ea03d-a434-4da8-a1e1-18532cb7e0e8"). InnerVolumeSpecName "kube-api-access-sbxcb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:34:04 crc kubenswrapper[4886]: I0314 08:34:04.632491 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbxcb\" (UniqueName: \"kubernetes.io/projected/4f2ea03d-a434-4da8-a1e1-18532cb7e0e8-kube-api-access-sbxcb\") on node \"crc\" DevicePath \"\"" Mar 14 08:34:05 crc kubenswrapper[4886]: I0314 08:34:05.107064 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557954-jq2zf" event={"ID":"4f2ea03d-a434-4da8-a1e1-18532cb7e0e8","Type":"ContainerDied","Data":"fa846d8406709fa4fa53d9d69a7ce9f8c1a7032a146905c99156982357eaac6c"} Mar 14 08:34:05 crc kubenswrapper[4886]: I0314 08:34:05.107107 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa846d8406709fa4fa53d9d69a7ce9f8c1a7032a146905c99156982357eaac6c" Mar 14 08:34:05 crc kubenswrapper[4886]: I0314 08:34:05.107111 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557954-jq2zf" Mar 14 08:34:06 crc kubenswrapper[4886]: I0314 08:34:06.053512 4886 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 14 08:34:06 crc kubenswrapper[4886]: I0314 08:34:06.053719 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://eb02a0483f76592e5a2c186501e140eac223a82358c926d89dd8246eaef79d49" gracePeriod=5 Mar 14 08:34:06 crc kubenswrapper[4886]: I0314 08:34:06.487401 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 14 08:34:11 crc kubenswrapper[4886]: I0314 08:34:11.138376 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 14 08:34:11 crc kubenswrapper[4886]: I0314 08:34:11.138734 4886 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="eb02a0483f76592e5a2c186501e140eac223a82358c926d89dd8246eaef79d49" exitCode=137 Mar 14 08:34:11 crc kubenswrapper[4886]: I0314 08:34:11.621242 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 14 08:34:11 crc kubenswrapper[4886]: I0314 08:34:11.621702 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 08:34:11 crc kubenswrapper[4886]: I0314 08:34:11.729548 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 14 08:34:11 crc kubenswrapper[4886]: I0314 08:34:11.729628 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 14 08:34:11 crc kubenswrapper[4886]: I0314 08:34:11.729680 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 14 08:34:11 crc kubenswrapper[4886]: I0314 08:34:11.729729 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 14 08:34:11 crc kubenswrapper[4886]: I0314 08:34:11.729761 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:34:11 crc kubenswrapper[4886]: I0314 08:34:11.729824 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:34:11 crc kubenswrapper[4886]: I0314 08:34:11.729827 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:34:11 crc kubenswrapper[4886]: I0314 08:34:11.729851 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 14 08:34:11 crc kubenswrapper[4886]: I0314 08:34:11.729913 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:34:11 crc kubenswrapper[4886]: I0314 08:34:11.730379 4886 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 14 08:34:11 crc kubenswrapper[4886]: I0314 08:34:11.730421 4886 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 14 08:34:11 crc kubenswrapper[4886]: I0314 08:34:11.730440 4886 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 14 08:34:11 crc kubenswrapper[4886]: I0314 08:34:11.730457 4886 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 14 08:34:11 crc kubenswrapper[4886]: I0314 08:34:11.755491 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:34:11 crc kubenswrapper[4886]: I0314 08:34:11.832142 4886 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 14 08:34:12 crc kubenswrapper[4886]: I0314 08:34:12.145790 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 14 08:34:12 crc kubenswrapper[4886]: I0314 08:34:12.145861 4886 scope.go:117] "RemoveContainer" containerID="eb02a0483f76592e5a2c186501e140eac223a82358c926d89dd8246eaef79d49" Mar 14 08:34:12 crc kubenswrapper[4886]: I0314 08:34:12.146034 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 14 08:34:13 crc kubenswrapper[4886]: I0314 08:34:13.426141 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 14 08:34:26 crc kubenswrapper[4886]: I0314 08:34:26.066742 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:34:26 crc kubenswrapper[4886]: I0314 08:34:26.067457 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:34:27 crc kubenswrapper[4886]: I0314 08:34:27.249457 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn"] Mar 14 08:34:27 crc kubenswrapper[4886]: I0314 08:34:27.250045 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn" podUID="a52ef211-3ee8-4efd-9039-1a1383aaa347" containerName="route-controller-manager" containerID="cri-o://2b3eaa67965abada36e1892143d62b64f8b391ccc800dab31d9847ddeab9d852" gracePeriod=30 Mar 14 08:34:27 crc kubenswrapper[4886]: I0314 08:34:27.746736 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn" Mar 14 08:34:27 crc kubenswrapper[4886]: I0314 08:34:27.832619 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a52ef211-3ee8-4efd-9039-1a1383aaa347-config\") pod \"a52ef211-3ee8-4efd-9039-1a1383aaa347\" (UID: \"a52ef211-3ee8-4efd-9039-1a1383aaa347\") " Mar 14 08:34:27 crc kubenswrapper[4886]: I0314 08:34:27.832705 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a52ef211-3ee8-4efd-9039-1a1383aaa347-serving-cert\") pod \"a52ef211-3ee8-4efd-9039-1a1383aaa347\" (UID: \"a52ef211-3ee8-4efd-9039-1a1383aaa347\") " Mar 14 08:34:27 crc kubenswrapper[4886]: I0314 08:34:27.832733 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a52ef211-3ee8-4efd-9039-1a1383aaa347-client-ca\") pod \"a52ef211-3ee8-4efd-9039-1a1383aaa347\" (UID: \"a52ef211-3ee8-4efd-9039-1a1383aaa347\") " Mar 14 08:34:27 crc kubenswrapper[4886]: I0314 08:34:27.832812 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngcx9\" (UniqueName: \"kubernetes.io/projected/a52ef211-3ee8-4efd-9039-1a1383aaa347-kube-api-access-ngcx9\") pod \"a52ef211-3ee8-4efd-9039-1a1383aaa347\" (UID: \"a52ef211-3ee8-4efd-9039-1a1383aaa347\") " Mar 14 08:34:27 crc kubenswrapper[4886]: I0314 08:34:27.833449 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a52ef211-3ee8-4efd-9039-1a1383aaa347-client-ca" (OuterVolumeSpecName: "client-ca") pod "a52ef211-3ee8-4efd-9039-1a1383aaa347" (UID: "a52ef211-3ee8-4efd-9039-1a1383aaa347"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:34:27 crc kubenswrapper[4886]: I0314 08:34:27.833571 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a52ef211-3ee8-4efd-9039-1a1383aaa347-config" (OuterVolumeSpecName: "config") pod "a52ef211-3ee8-4efd-9039-1a1383aaa347" (UID: "a52ef211-3ee8-4efd-9039-1a1383aaa347"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:34:27 crc kubenswrapper[4886]: I0314 08:34:27.843246 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a52ef211-3ee8-4efd-9039-1a1383aaa347-kube-api-access-ngcx9" (OuterVolumeSpecName: "kube-api-access-ngcx9") pod "a52ef211-3ee8-4efd-9039-1a1383aaa347" (UID: "a52ef211-3ee8-4efd-9039-1a1383aaa347"). InnerVolumeSpecName "kube-api-access-ngcx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:34:27 crc kubenswrapper[4886]: I0314 08:34:27.843275 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a52ef211-3ee8-4efd-9039-1a1383aaa347-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a52ef211-3ee8-4efd-9039-1a1383aaa347" (UID: "a52ef211-3ee8-4efd-9039-1a1383aaa347"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:34:27 crc kubenswrapper[4886]: I0314 08:34:27.935373 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a52ef211-3ee8-4efd-9039-1a1383aaa347-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:34:27 crc kubenswrapper[4886]: I0314 08:34:27.935403 4886 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a52ef211-3ee8-4efd-9039-1a1383aaa347-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:34:27 crc kubenswrapper[4886]: I0314 08:34:27.935413 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngcx9\" (UniqueName: \"kubernetes.io/projected/a52ef211-3ee8-4efd-9039-1a1383aaa347-kube-api-access-ngcx9\") on node \"crc\" DevicePath \"\"" Mar 14 08:34:27 crc kubenswrapper[4886]: I0314 08:34:27.935425 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a52ef211-3ee8-4efd-9039-1a1383aaa347-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:34:28 crc kubenswrapper[4886]: I0314 08:34:28.234044 4886 generic.go:334] "Generic (PLEG): container finished" podID="a52ef211-3ee8-4efd-9039-1a1383aaa347" containerID="2b3eaa67965abada36e1892143d62b64f8b391ccc800dab31d9847ddeab9d852" exitCode=0 Mar 14 08:34:28 crc kubenswrapper[4886]: I0314 08:34:28.234101 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn" Mar 14 08:34:28 crc kubenswrapper[4886]: I0314 08:34:28.234145 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn" event={"ID":"a52ef211-3ee8-4efd-9039-1a1383aaa347","Type":"ContainerDied","Data":"2b3eaa67965abada36e1892143d62b64f8b391ccc800dab31d9847ddeab9d852"} Mar 14 08:34:28 crc kubenswrapper[4886]: I0314 08:34:28.234204 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn" event={"ID":"a52ef211-3ee8-4efd-9039-1a1383aaa347","Type":"ContainerDied","Data":"95e5ab52ad627bedd40574afeedcb1573a66eba1ab3e8cbaee2bb64fbb88aa1c"} Mar 14 08:34:28 crc kubenswrapper[4886]: I0314 08:34:28.234234 4886 scope.go:117] "RemoveContainer" containerID="2b3eaa67965abada36e1892143d62b64f8b391ccc800dab31d9847ddeab9d852" Mar 14 08:34:28 crc kubenswrapper[4886]: I0314 08:34:28.272577 4886 scope.go:117] "RemoveContainer" containerID="2b3eaa67965abada36e1892143d62b64f8b391ccc800dab31d9847ddeab9d852" Mar 14 08:34:28 crc kubenswrapper[4886]: E0314 08:34:28.273097 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b3eaa67965abada36e1892143d62b64f8b391ccc800dab31d9847ddeab9d852\": container with ID starting with 2b3eaa67965abada36e1892143d62b64f8b391ccc800dab31d9847ddeab9d852 not found: ID does not exist" containerID="2b3eaa67965abada36e1892143d62b64f8b391ccc800dab31d9847ddeab9d852" Mar 14 08:34:28 crc kubenswrapper[4886]: I0314 08:34:28.273174 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b3eaa67965abada36e1892143d62b64f8b391ccc800dab31d9847ddeab9d852"} err="failed to get container status \"2b3eaa67965abada36e1892143d62b64f8b391ccc800dab31d9847ddeab9d852\": rpc error: code = NotFound desc = 
could not find container \"2b3eaa67965abada36e1892143d62b64f8b391ccc800dab31d9847ddeab9d852\": container with ID starting with 2b3eaa67965abada36e1892143d62b64f8b391ccc800dab31d9847ddeab9d852 not found: ID does not exist" Mar 14 08:34:28 crc kubenswrapper[4886]: I0314 08:34:28.274268 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn"] Mar 14 08:34:28 crc kubenswrapper[4886]: I0314 08:34:28.277339 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cf466c567-lfqfn"] Mar 14 08:34:28 crc kubenswrapper[4886]: I0314 08:34:28.840834 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c6cc64466-t4qvf"] Mar 14 08:34:28 crc kubenswrapper[4886]: E0314 08:34:28.841082 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 14 08:34:28 crc kubenswrapper[4886]: I0314 08:34:28.841097 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 14 08:34:28 crc kubenswrapper[4886]: E0314 08:34:28.841139 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52ef211-3ee8-4efd-9039-1a1383aaa347" containerName="route-controller-manager" Mar 14 08:34:28 crc kubenswrapper[4886]: I0314 08:34:28.841153 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52ef211-3ee8-4efd-9039-1a1383aaa347" containerName="route-controller-manager" Mar 14 08:34:28 crc kubenswrapper[4886]: E0314 08:34:28.841170 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f2ea03d-a434-4da8-a1e1-18532cb7e0e8" containerName="oc" Mar 14 08:34:28 crc kubenswrapper[4886]: I0314 08:34:28.841179 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f2ea03d-a434-4da8-a1e1-18532cb7e0e8" containerName="oc" 
Mar 14 08:34:28 crc kubenswrapper[4886]: I0314 08:34:28.841301 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f2ea03d-a434-4da8-a1e1-18532cb7e0e8" containerName="oc" Mar 14 08:34:28 crc kubenswrapper[4886]: I0314 08:34:28.841317 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 14 08:34:28 crc kubenswrapper[4886]: I0314 08:34:28.841329 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="a52ef211-3ee8-4efd-9039-1a1383aaa347" containerName="route-controller-manager" Mar 14 08:34:28 crc kubenswrapper[4886]: I0314 08:34:28.841847 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c6cc64466-t4qvf" Mar 14 08:34:28 crc kubenswrapper[4886]: I0314 08:34:28.843459 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 14 08:34:28 crc kubenswrapper[4886]: I0314 08:34:28.844660 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 14 08:34:28 crc kubenswrapper[4886]: I0314 08:34:28.844845 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 14 08:34:28 crc kubenswrapper[4886]: I0314 08:34:28.845066 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 14 08:34:28 crc kubenswrapper[4886]: I0314 08:34:28.845275 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 14 08:34:28 crc kubenswrapper[4886]: I0314 08:34:28.845993 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 14 08:34:28 crc 
kubenswrapper[4886]: I0314 08:34:28.853728 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c6cc64466-t4qvf"] Mar 14 08:34:28 crc kubenswrapper[4886]: I0314 08:34:28.947069 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a5a5de19-1231-4ba4-9c2a-20d7f7bb3fd9-client-ca\") pod \"route-controller-manager-6c6cc64466-t4qvf\" (UID: \"a5a5de19-1231-4ba4-9c2a-20d7f7bb3fd9\") " pod="openshift-route-controller-manager/route-controller-manager-6c6cc64466-t4qvf" Mar 14 08:34:28 crc kubenswrapper[4886]: I0314 08:34:28.947265 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bfj5\" (UniqueName: \"kubernetes.io/projected/a5a5de19-1231-4ba4-9c2a-20d7f7bb3fd9-kube-api-access-7bfj5\") pod \"route-controller-manager-6c6cc64466-t4qvf\" (UID: \"a5a5de19-1231-4ba4-9c2a-20d7f7bb3fd9\") " pod="openshift-route-controller-manager/route-controller-manager-6c6cc64466-t4qvf" Mar 14 08:34:28 crc kubenswrapper[4886]: I0314 08:34:28.947341 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5a5de19-1231-4ba4-9c2a-20d7f7bb3fd9-config\") pod \"route-controller-manager-6c6cc64466-t4qvf\" (UID: \"a5a5de19-1231-4ba4-9c2a-20d7f7bb3fd9\") " pod="openshift-route-controller-manager/route-controller-manager-6c6cc64466-t4qvf" Mar 14 08:34:28 crc kubenswrapper[4886]: I0314 08:34:28.947482 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5a5de19-1231-4ba4-9c2a-20d7f7bb3fd9-serving-cert\") pod \"route-controller-manager-6c6cc64466-t4qvf\" (UID: \"a5a5de19-1231-4ba4-9c2a-20d7f7bb3fd9\") " pod="openshift-route-controller-manager/route-controller-manager-6c6cc64466-t4qvf" 
Mar 14 08:34:29 crc kubenswrapper[4886]: I0314 08:34:29.048345 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5a5de19-1231-4ba4-9c2a-20d7f7bb3fd9-serving-cert\") pod \"route-controller-manager-6c6cc64466-t4qvf\" (UID: \"a5a5de19-1231-4ba4-9c2a-20d7f7bb3fd9\") " pod="openshift-route-controller-manager/route-controller-manager-6c6cc64466-t4qvf" Mar 14 08:34:29 crc kubenswrapper[4886]: I0314 08:34:29.048433 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a5a5de19-1231-4ba4-9c2a-20d7f7bb3fd9-client-ca\") pod \"route-controller-manager-6c6cc64466-t4qvf\" (UID: \"a5a5de19-1231-4ba4-9c2a-20d7f7bb3fd9\") " pod="openshift-route-controller-manager/route-controller-manager-6c6cc64466-t4qvf" Mar 14 08:34:29 crc kubenswrapper[4886]: I0314 08:34:29.048463 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bfj5\" (UniqueName: \"kubernetes.io/projected/a5a5de19-1231-4ba4-9c2a-20d7f7bb3fd9-kube-api-access-7bfj5\") pod \"route-controller-manager-6c6cc64466-t4qvf\" (UID: \"a5a5de19-1231-4ba4-9c2a-20d7f7bb3fd9\") " pod="openshift-route-controller-manager/route-controller-manager-6c6cc64466-t4qvf" Mar 14 08:34:29 crc kubenswrapper[4886]: I0314 08:34:29.048493 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5a5de19-1231-4ba4-9c2a-20d7f7bb3fd9-config\") pod \"route-controller-manager-6c6cc64466-t4qvf\" (UID: \"a5a5de19-1231-4ba4-9c2a-20d7f7bb3fd9\") " pod="openshift-route-controller-manager/route-controller-manager-6c6cc64466-t4qvf" Mar 14 08:34:29 crc kubenswrapper[4886]: I0314 08:34:29.050261 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5a5de19-1231-4ba4-9c2a-20d7f7bb3fd9-config\") pod 
\"route-controller-manager-6c6cc64466-t4qvf\" (UID: \"a5a5de19-1231-4ba4-9c2a-20d7f7bb3fd9\") " pod="openshift-route-controller-manager/route-controller-manager-6c6cc64466-t4qvf" Mar 14 08:34:29 crc kubenswrapper[4886]: I0314 08:34:29.055100 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5a5de19-1231-4ba4-9c2a-20d7f7bb3fd9-serving-cert\") pod \"route-controller-manager-6c6cc64466-t4qvf\" (UID: \"a5a5de19-1231-4ba4-9c2a-20d7f7bb3fd9\") " pod="openshift-route-controller-manager/route-controller-manager-6c6cc64466-t4qvf" Mar 14 08:34:29 crc kubenswrapper[4886]: I0314 08:34:29.058223 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a5a5de19-1231-4ba4-9c2a-20d7f7bb3fd9-client-ca\") pod \"route-controller-manager-6c6cc64466-t4qvf\" (UID: \"a5a5de19-1231-4ba4-9c2a-20d7f7bb3fd9\") " pod="openshift-route-controller-manager/route-controller-manager-6c6cc64466-t4qvf" Mar 14 08:34:29 crc kubenswrapper[4886]: I0314 08:34:29.075073 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bfj5\" (UniqueName: \"kubernetes.io/projected/a5a5de19-1231-4ba4-9c2a-20d7f7bb3fd9-kube-api-access-7bfj5\") pod \"route-controller-manager-6c6cc64466-t4qvf\" (UID: \"a5a5de19-1231-4ba4-9c2a-20d7f7bb3fd9\") " pod="openshift-route-controller-manager/route-controller-manager-6c6cc64466-t4qvf" Mar 14 08:34:29 crc kubenswrapper[4886]: I0314 08:34:29.158424 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c6cc64466-t4qvf" Mar 14 08:34:29 crc kubenswrapper[4886]: I0314 08:34:29.427501 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a52ef211-3ee8-4efd-9039-1a1383aaa347" path="/var/lib/kubelet/pods/a52ef211-3ee8-4efd-9039-1a1383aaa347/volumes" Mar 14 08:34:29 crc kubenswrapper[4886]: I0314 08:34:29.576045 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c6cc64466-t4qvf"] Mar 14 08:34:30 crc kubenswrapper[4886]: I0314 08:34:30.254033 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c6cc64466-t4qvf" event={"ID":"a5a5de19-1231-4ba4-9c2a-20d7f7bb3fd9","Type":"ContainerStarted","Data":"c02c24b5d11905c7f30b6d326a3b6a546cb538046b32d023a733dd9c25ad6dfb"} Mar 14 08:34:30 crc kubenswrapper[4886]: I0314 08:34:30.254624 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6c6cc64466-t4qvf" Mar 14 08:34:30 crc kubenswrapper[4886]: I0314 08:34:30.254646 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c6cc64466-t4qvf" event={"ID":"a5a5de19-1231-4ba4-9c2a-20d7f7bb3fd9","Type":"ContainerStarted","Data":"042b1b4b251e7f68b8360841262f1bbeb65d933847687cb48f1eeca980f20b31"} Mar 14 08:34:30 crc kubenswrapper[4886]: I0314 08:34:30.260342 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6c6cc64466-t4qvf" Mar 14 08:34:30 crc kubenswrapper[4886]: I0314 08:34:30.288075 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6c6cc64466-t4qvf" podStartSLOduration=3.288063026 podStartE2EDuration="3.288063026s" 
podCreationTimestamp="2026-03-14 08:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:34:30.286554483 +0000 UTC m=+405.535006120" watchObservedRunningTime="2026-03-14 08:34:30.288063026 +0000 UTC m=+405.536514663" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.065868 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.066448 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.135040 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-brtwn"] Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.136138 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-brtwn" podUID="aeaec6eb-91cb-4e68-807f-994b4e9df360" containerName="registry-server" containerID="cri-o://341885f5e4d6c9bab49cc8788b107f1e04e61ba85e9472c750fb436d18d008df" gracePeriod=30 Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.147001 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dk6zm"] Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.147510 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dk6zm" 
podUID="333059fe-3e95-4e08-b70e-d7d95e1ed279" containerName="registry-server" containerID="cri-o://f7bb36dbc69b0696038358ed5757d50ba889fb73cea54fe439d610b9a75facd5" gracePeriod=30 Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.155938 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8lxpz"] Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.156256 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-8lxpz" podUID="ea7323d6-f41b-4251-ae88-aa34a5714182" containerName="marketplace-operator" containerID="cri-o://d94ba0cb504c18a1e91df4ef4675249b415e72659ca353fcbb769af633f56b26" gracePeriod=30 Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.166841 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fpr95"] Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.167100 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fpr95" podUID="98f142ad-f9c5-41ee-81ec-632938796964" containerName="registry-server" containerID="cri-o://093f7029785a049be2197520ad1eefec886f54abe36d623dbac5486932c2c74b" gracePeriod=30 Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.183167 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-v6gf2"] Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.185936 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-v6gf2" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.187721 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6gfbq"] Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.187910 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6gfbq" podUID="7c355048-396b-4f00-8ff6-1ffff1d9d62c" containerName="registry-server" containerID="cri-o://0c7b8ed5b1eae85061215c343c2815e0779f4d51b5129ff92afe3aaa3552bc5c" gracePeriod=30 Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.231144 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcfr5\" (UniqueName: \"kubernetes.io/projected/f0bdc4aa-1cef-4951-9c86-47f00e9bc18b-kube-api-access-vcfr5\") pod \"marketplace-operator-79b997595-v6gf2\" (UID: \"f0bdc4aa-1cef-4951-9c86-47f00e9bc18b\") " pod="openshift-marketplace/marketplace-operator-79b997595-v6gf2" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.231205 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f0bdc4aa-1cef-4951-9c86-47f00e9bc18b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-v6gf2\" (UID: \"f0bdc4aa-1cef-4951-9c86-47f00e9bc18b\") " pod="openshift-marketplace/marketplace-operator-79b997595-v6gf2" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.231228 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f0bdc4aa-1cef-4951-9c86-47f00e9bc18b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-v6gf2\" (UID: \"f0bdc4aa-1cef-4951-9c86-47f00e9bc18b\") " pod="openshift-marketplace/marketplace-operator-79b997595-v6gf2" Mar 14 
08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.236981 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-v6gf2"] Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.331596 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcfr5\" (UniqueName: \"kubernetes.io/projected/f0bdc4aa-1cef-4951-9c86-47f00e9bc18b-kube-api-access-vcfr5\") pod \"marketplace-operator-79b997595-v6gf2\" (UID: \"f0bdc4aa-1cef-4951-9c86-47f00e9bc18b\") " pod="openshift-marketplace/marketplace-operator-79b997595-v6gf2" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.331643 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f0bdc4aa-1cef-4951-9c86-47f00e9bc18b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-v6gf2\" (UID: \"f0bdc4aa-1cef-4951-9c86-47f00e9bc18b\") " pod="openshift-marketplace/marketplace-operator-79b997595-v6gf2" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.331670 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f0bdc4aa-1cef-4951-9c86-47f00e9bc18b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-v6gf2\" (UID: \"f0bdc4aa-1cef-4951-9c86-47f00e9bc18b\") " pod="openshift-marketplace/marketplace-operator-79b997595-v6gf2" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.333598 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f0bdc4aa-1cef-4951-9c86-47f00e9bc18b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-v6gf2\" (UID: \"f0bdc4aa-1cef-4951-9c86-47f00e9bc18b\") " pod="openshift-marketplace/marketplace-operator-79b997595-v6gf2" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.341050 4886 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f0bdc4aa-1cef-4951-9c86-47f00e9bc18b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-v6gf2\" (UID: \"f0bdc4aa-1cef-4951-9c86-47f00e9bc18b\") " pod="openshift-marketplace/marketplace-operator-79b997595-v6gf2" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.352723 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcfr5\" (UniqueName: \"kubernetes.io/projected/f0bdc4aa-1cef-4951-9c86-47f00e9bc18b-kube-api-access-vcfr5\") pod \"marketplace-operator-79b997595-v6gf2\" (UID: \"f0bdc4aa-1cef-4951-9c86-47f00e9bc18b\") " pod="openshift-marketplace/marketplace-operator-79b997595-v6gf2" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.420628 4886 generic.go:334] "Generic (PLEG): container finished" podID="98f142ad-f9c5-41ee-81ec-632938796964" containerID="093f7029785a049be2197520ad1eefec886f54abe36d623dbac5486932c2c74b" exitCode=0 Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.420695 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fpr95" event={"ID":"98f142ad-f9c5-41ee-81ec-632938796964","Type":"ContainerDied","Data":"093f7029785a049be2197520ad1eefec886f54abe36d623dbac5486932c2c74b"} Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.427359 4886 generic.go:334] "Generic (PLEG): container finished" podID="7c355048-396b-4f00-8ff6-1ffff1d9d62c" containerID="0c7b8ed5b1eae85061215c343c2815e0779f4d51b5129ff92afe3aaa3552bc5c" exitCode=0 Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.427421 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6gfbq" event={"ID":"7c355048-396b-4f00-8ff6-1ffff1d9d62c","Type":"ContainerDied","Data":"0c7b8ed5b1eae85061215c343c2815e0779f4d51b5129ff92afe3aaa3552bc5c"} Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 
08:34:56.452507 4886 generic.go:334] "Generic (PLEG): container finished" podID="333059fe-3e95-4e08-b70e-d7d95e1ed279" containerID="f7bb36dbc69b0696038358ed5757d50ba889fb73cea54fe439d610b9a75facd5" exitCode=0 Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.452633 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dk6zm" event={"ID":"333059fe-3e95-4e08-b70e-d7d95e1ed279","Type":"ContainerDied","Data":"f7bb36dbc69b0696038358ed5757d50ba889fb73cea54fe439d610b9a75facd5"} Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.494217 4886 generic.go:334] "Generic (PLEG): container finished" podID="aeaec6eb-91cb-4e68-807f-994b4e9df360" containerID="341885f5e4d6c9bab49cc8788b107f1e04e61ba85e9472c750fb436d18d008df" exitCode=0 Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.494349 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-brtwn" event={"ID":"aeaec6eb-91cb-4e68-807f-994b4e9df360","Type":"ContainerDied","Data":"341885f5e4d6c9bab49cc8788b107f1e04e61ba85e9472c750fb436d18d008df"} Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.497096 4886 generic.go:334] "Generic (PLEG): container finished" podID="ea7323d6-f41b-4251-ae88-aa34a5714182" containerID="d94ba0cb504c18a1e91df4ef4675249b415e72659ca353fcbb769af633f56b26" exitCode=0 Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.497177 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8lxpz" event={"ID":"ea7323d6-f41b-4251-ae88-aa34a5714182","Type":"ContainerDied","Data":"d94ba0cb504c18a1e91df4ef4675249b415e72659ca353fcbb769af633f56b26"} Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.614096 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-v6gf2" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.648356 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dk6zm" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.716718 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-brtwn" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.734747 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfrk9\" (UniqueName: \"kubernetes.io/projected/aeaec6eb-91cb-4e68-807f-994b4e9df360-kube-api-access-kfrk9\") pod \"aeaec6eb-91cb-4e68-807f-994b4e9df360\" (UID: \"aeaec6eb-91cb-4e68-807f-994b4e9df360\") " Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.734805 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j99w9\" (UniqueName: \"kubernetes.io/projected/333059fe-3e95-4e08-b70e-d7d95e1ed279-kube-api-access-j99w9\") pod \"333059fe-3e95-4e08-b70e-d7d95e1ed279\" (UID: \"333059fe-3e95-4e08-b70e-d7d95e1ed279\") " Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.734831 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeaec6eb-91cb-4e68-807f-994b4e9df360-catalog-content\") pod \"aeaec6eb-91cb-4e68-807f-994b4e9df360\" (UID: \"aeaec6eb-91cb-4e68-807f-994b4e9df360\") " Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.734906 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/333059fe-3e95-4e08-b70e-d7d95e1ed279-utilities\") pod \"333059fe-3e95-4e08-b70e-d7d95e1ed279\" (UID: \"333059fe-3e95-4e08-b70e-d7d95e1ed279\") " Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.734973 4886 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeaec6eb-91cb-4e68-807f-994b4e9df360-utilities\") pod \"aeaec6eb-91cb-4e68-807f-994b4e9df360\" (UID: \"aeaec6eb-91cb-4e68-807f-994b4e9df360\") " Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.734994 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/333059fe-3e95-4e08-b70e-d7d95e1ed279-catalog-content\") pod \"333059fe-3e95-4e08-b70e-d7d95e1ed279\" (UID: \"333059fe-3e95-4e08-b70e-d7d95e1ed279\") " Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.735843 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeaec6eb-91cb-4e68-807f-994b4e9df360-utilities" (OuterVolumeSpecName: "utilities") pod "aeaec6eb-91cb-4e68-807f-994b4e9df360" (UID: "aeaec6eb-91cb-4e68-807f-994b4e9df360"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.735935 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/333059fe-3e95-4e08-b70e-d7d95e1ed279-utilities" (OuterVolumeSpecName: "utilities") pod "333059fe-3e95-4e08-b70e-d7d95e1ed279" (UID: "333059fe-3e95-4e08-b70e-d7d95e1ed279"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.743824 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeaec6eb-91cb-4e68-807f-994b4e9df360-kube-api-access-kfrk9" (OuterVolumeSpecName: "kube-api-access-kfrk9") pod "aeaec6eb-91cb-4e68-807f-994b4e9df360" (UID: "aeaec6eb-91cb-4e68-807f-994b4e9df360"). InnerVolumeSpecName "kube-api-access-kfrk9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.750983 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/333059fe-3e95-4e08-b70e-d7d95e1ed279-kube-api-access-j99w9" (OuterVolumeSpecName: "kube-api-access-j99w9") pod "333059fe-3e95-4e08-b70e-d7d95e1ed279" (UID: "333059fe-3e95-4e08-b70e-d7d95e1ed279"). InnerVolumeSpecName "kube-api-access-j99w9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.767645 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fpr95" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.778993 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8lxpz" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.823715 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6gfbq" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.835475 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c355048-396b-4f00-8ff6-1ffff1d9d62c-utilities\") pod \"7c355048-396b-4f00-8ff6-1ffff1d9d62c\" (UID: \"7c355048-396b-4f00-8ff6-1ffff1d9d62c\") " Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.835524 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98f142ad-f9c5-41ee-81ec-632938796964-catalog-content\") pod \"98f142ad-f9c5-41ee-81ec-632938796964\" (UID: \"98f142ad-f9c5-41ee-81ec-632938796964\") " Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.835552 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ea7323d6-f41b-4251-ae88-aa34a5714182-marketplace-operator-metrics\") pod \"ea7323d6-f41b-4251-ae88-aa34a5714182\" (UID: \"ea7323d6-f41b-4251-ae88-aa34a5714182\") " Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.835596 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rb7d2\" (UniqueName: \"kubernetes.io/projected/98f142ad-f9c5-41ee-81ec-632938796964-kube-api-access-rb7d2\") pod \"98f142ad-f9c5-41ee-81ec-632938796964\" (UID: \"98f142ad-f9c5-41ee-81ec-632938796964\") " Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.835686 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85wdd\" (UniqueName: \"kubernetes.io/projected/ea7323d6-f41b-4251-ae88-aa34a5714182-kube-api-access-85wdd\") pod \"ea7323d6-f41b-4251-ae88-aa34a5714182\" (UID: \"ea7323d6-f41b-4251-ae88-aa34a5714182\") " Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.835740 4886 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c355048-396b-4f00-8ff6-1ffff1d9d62c-catalog-content\") pod \"7c355048-396b-4f00-8ff6-1ffff1d9d62c\" (UID: \"7c355048-396b-4f00-8ff6-1ffff1d9d62c\") " Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.835828 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98f142ad-f9c5-41ee-81ec-632938796964-utilities\") pod \"98f142ad-f9c5-41ee-81ec-632938796964\" (UID: \"98f142ad-f9c5-41ee-81ec-632938796964\") " Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.835920 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea7323d6-f41b-4251-ae88-aa34a5714182-marketplace-trusted-ca\") pod \"ea7323d6-f41b-4251-ae88-aa34a5714182\" (UID: \"ea7323d6-f41b-4251-ae88-aa34a5714182\") " Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.835995 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlzjm\" (UniqueName: \"kubernetes.io/projected/7c355048-396b-4f00-8ff6-1ffff1d9d62c-kube-api-access-vlzjm\") pod \"7c355048-396b-4f00-8ff6-1ffff1d9d62c\" (UID: \"7c355048-396b-4f00-8ff6-1ffff1d9d62c\") " Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.836265 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/333059fe-3e95-4e08-b70e-d7d95e1ed279-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.836283 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeaec6eb-91cb-4e68-807f-994b4e9df360-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.836323 4886 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-kfrk9\" (UniqueName: \"kubernetes.io/projected/aeaec6eb-91cb-4e68-807f-994b4e9df360-kube-api-access-kfrk9\") on node \"crc\" DevicePath \"\"" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.836341 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j99w9\" (UniqueName: \"kubernetes.io/projected/333059fe-3e95-4e08-b70e-d7d95e1ed279-kube-api-access-j99w9\") on node \"crc\" DevicePath \"\"" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.841553 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeaec6eb-91cb-4e68-807f-994b4e9df360-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aeaec6eb-91cb-4e68-807f-994b4e9df360" (UID: "aeaec6eb-91cb-4e68-807f-994b4e9df360"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.841758 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c355048-396b-4f00-8ff6-1ffff1d9d62c-kube-api-access-vlzjm" (OuterVolumeSpecName: "kube-api-access-vlzjm") pod "7c355048-396b-4f00-8ff6-1ffff1d9d62c" (UID: "7c355048-396b-4f00-8ff6-1ffff1d9d62c"). InnerVolumeSpecName "kube-api-access-vlzjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.844810 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98f142ad-f9c5-41ee-81ec-632938796964-utilities" (OuterVolumeSpecName: "utilities") pod "98f142ad-f9c5-41ee-81ec-632938796964" (UID: "98f142ad-f9c5-41ee-81ec-632938796964"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.845164 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea7323d6-f41b-4251-ae88-aa34a5714182-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "ea7323d6-f41b-4251-ae88-aa34a5714182" (UID: "ea7323d6-f41b-4251-ae88-aa34a5714182"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.845454 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea7323d6-f41b-4251-ae88-aa34a5714182-kube-api-access-85wdd" (OuterVolumeSpecName: "kube-api-access-85wdd") pod "ea7323d6-f41b-4251-ae88-aa34a5714182" (UID: "ea7323d6-f41b-4251-ae88-aa34a5714182"). InnerVolumeSpecName "kube-api-access-85wdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.861412 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c355048-396b-4f00-8ff6-1ffff1d9d62c-utilities" (OuterVolumeSpecName: "utilities") pod "7c355048-396b-4f00-8ff6-1ffff1d9d62c" (UID: "7c355048-396b-4f00-8ff6-1ffff1d9d62c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.862012 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea7323d6-f41b-4251-ae88-aa34a5714182-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "ea7323d6-f41b-4251-ae88-aa34a5714182" (UID: "ea7323d6-f41b-4251-ae88-aa34a5714182"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.865365 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98f142ad-f9c5-41ee-81ec-632938796964-kube-api-access-rb7d2" (OuterVolumeSpecName: "kube-api-access-rb7d2") pod "98f142ad-f9c5-41ee-81ec-632938796964" (UID: "98f142ad-f9c5-41ee-81ec-632938796964"). InnerVolumeSpecName "kube-api-access-rb7d2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.899788 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/333059fe-3e95-4e08-b70e-d7d95e1ed279-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "333059fe-3e95-4e08-b70e-d7d95e1ed279" (UID: "333059fe-3e95-4e08-b70e-d7d95e1ed279"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.924714 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98f142ad-f9c5-41ee-81ec-632938796964-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "98f142ad-f9c5-41ee-81ec-632938796964" (UID: "98f142ad-f9c5-41ee-81ec-632938796964"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.936845 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeaec6eb-91cb-4e68-807f-994b4e9df360-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.936881 4886 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea7323d6-f41b-4251-ae88-aa34a5714182-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.936896 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlzjm\" (UniqueName: \"kubernetes.io/projected/7c355048-396b-4f00-8ff6-1ffff1d9d62c-kube-api-access-vlzjm\") on node \"crc\" DevicePath \"\"" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.936906 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c355048-396b-4f00-8ff6-1ffff1d9d62c-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.936915 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98f142ad-f9c5-41ee-81ec-632938796964-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.936923 4886 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ea7323d6-f41b-4251-ae88-aa34a5714182-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.936933 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rb7d2\" (UniqueName: \"kubernetes.io/projected/98f142ad-f9c5-41ee-81ec-632938796964-kube-api-access-rb7d2\") on node \"crc\" DevicePath \"\"" Mar 
14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.936941 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85wdd\" (UniqueName: \"kubernetes.io/projected/ea7323d6-f41b-4251-ae88-aa34a5714182-kube-api-access-85wdd\") on node \"crc\" DevicePath \"\"" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.936949 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/333059fe-3e95-4e08-b70e-d7d95e1ed279-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:34:56 crc kubenswrapper[4886]: I0314 08:34:56.936957 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98f142ad-f9c5-41ee-81ec-632938796964-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.036398 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c355048-396b-4f00-8ff6-1ffff1d9d62c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c355048-396b-4f00-8ff6-1ffff1d9d62c" (UID: "7c355048-396b-4f00-8ff6-1ffff1d9d62c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.037979 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c355048-396b-4f00-8ff6-1ffff1d9d62c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.102881 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-v6gf2"] Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.508634 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dk6zm" event={"ID":"333059fe-3e95-4e08-b70e-d7d95e1ed279","Type":"ContainerDied","Data":"78278b60f8506a5adf01d07fb614619d230bf1489aba2cecfd9c1997eb1dab74"} Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.508744 4886 scope.go:117] "RemoveContainer" containerID="f7bb36dbc69b0696038358ed5757d50ba889fb73cea54fe439d610b9a75facd5" Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.508997 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dk6zm" Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.515766 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-brtwn" event={"ID":"aeaec6eb-91cb-4e68-807f-994b4e9df360","Type":"ContainerDied","Data":"de7195b6639958d8f38e9c07208bd9032aee68af4f66d36ee6b2923bd3222336"} Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.515823 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-brtwn" Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.520283 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8lxpz" event={"ID":"ea7323d6-f41b-4251-ae88-aa34a5714182","Type":"ContainerDied","Data":"57310ba521abc6995588c4e414e207667a42d6880f3f7e7d8ca7771a71e498aa"} Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.520552 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8lxpz" Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.524837 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-v6gf2" event={"ID":"f0bdc4aa-1cef-4951-9c86-47f00e9bc18b","Type":"ContainerStarted","Data":"de4456351a8dbc0249daeee1dc3d8ad43f44b994224b5731bb6de25faf2c8ab5"} Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.524909 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-v6gf2" event={"ID":"f0bdc4aa-1cef-4951-9c86-47f00e9bc18b","Type":"ContainerStarted","Data":"cfdbdbcf60d4c14aa2b5b94aae8822954fb2ca6cf149178792e29e6f519f768b"} Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.524937 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-v6gf2" Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.526156 4886 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-v6gf2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.72:8080/healthz\": dial tcp 10.217.0.72:8080: connect: connection refused" start-of-body= Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.526209 4886 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-v6gf2" podUID="f0bdc4aa-1cef-4951-9c86-47f00e9bc18b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.72:8080/healthz\": dial tcp 10.217.0.72:8080: connect: connection refused" Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.528936 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fpr95" event={"ID":"98f142ad-f9c5-41ee-81ec-632938796964","Type":"ContainerDied","Data":"7e2a5ebe2ad8e8f12833449807a247ced82971a38b19b39fb9b89952c232c182"} Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.529262 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fpr95" Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.531946 4886 scope.go:117] "RemoveContainer" containerID="e6e3e6058ed138078d9874adbc444746ad88235267c989589fa8a8a2f541ef1e" Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.539586 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6gfbq" event={"ID":"7c355048-396b-4f00-8ff6-1ffff1d9d62c","Type":"ContainerDied","Data":"2e1fead50a7ebf7adff0fbed317a8fd55c96e153d6ee602449ce84b0372d4a3e"} Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.539679 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6gfbq" Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.546575 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dk6zm"] Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.553075 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dk6zm"] Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.563971 4886 scope.go:117] "RemoveContainer" containerID="01a4e1c02a75c867b930eb8d20a4f916e28c505fda422a7af3c01dde2b8d77fd" Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.566234 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8lxpz"] Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.568497 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8lxpz"] Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.586408 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fpr95"] Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.588707 4886 scope.go:117] "RemoveContainer" containerID="341885f5e4d6c9bab49cc8788b107f1e04e61ba85e9472c750fb436d18d008df" Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.590150 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fpr95"] Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.601614 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-brtwn"] Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.615526 4886 scope.go:117] "RemoveContainer" containerID="cdaaaef5fcef86cbcca5d10d2bef1780d58f63a309a679e99c76b0974dcb888f" Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.635101 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-brtwn"] Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.640723 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-v6gf2" podStartSLOduration=1.6407004490000001 podStartE2EDuration="1.640700449s" podCreationTimestamp="2026-03-14 08:34:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:34:57.634064649 +0000 UTC m=+432.882516296" watchObservedRunningTime="2026-03-14 08:34:57.640700449 +0000 UTC m=+432.889152086" Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.666067 4886 scope.go:117] "RemoveContainer" containerID="d6d672f06fd68000fc57fdeee67e4ec0b65c7a90f9ce5e842fd8578a517d9fd3" Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.668421 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6gfbq"] Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.672581 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6gfbq"] Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.687176 4886 scope.go:117] "RemoveContainer" containerID="d94ba0cb504c18a1e91df4ef4675249b415e72659ca353fcbb769af633f56b26" Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.703197 4886 scope.go:117] "RemoveContainer" containerID="093f7029785a049be2197520ad1eefec886f54abe36d623dbac5486932c2c74b" Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.718937 4886 scope.go:117] "RemoveContainer" containerID="2c860203d6f40c73708c979c7452ac98020bf424da2e738ee19b4753e8e74c76" Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.746939 4886 scope.go:117] "RemoveContainer" containerID="303bb5cda821a79eff730f4d7df823fd10323f9ff64a6c848b3af17dea088d70" Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.762234 4886 scope.go:117] "RemoveContainer" 
containerID="0c7b8ed5b1eae85061215c343c2815e0779f4d51b5129ff92afe3aaa3552bc5c" Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.775783 4886 scope.go:117] "RemoveContainer" containerID="90f215602c3b76fcbd4f3a2960d65747d178e02efa98b3b2b17a6383e525a6cc" Mar 14 08:34:57 crc kubenswrapper[4886]: I0314 08:34:57.791283 4886 scope.go:117] "RemoveContainer" containerID="cdf18872eed6ea8a85d4637daa0b53df7bd9e584c31e7752fb38917a73610bae" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.368269 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k6gs4"] Mar 14 08:34:58 crc kubenswrapper[4886]: E0314 08:34:58.369716 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="333059fe-3e95-4e08-b70e-d7d95e1ed279" containerName="extract-utilities" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.369911 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="333059fe-3e95-4e08-b70e-d7d95e1ed279" containerName="extract-utilities" Mar 14 08:34:58 crc kubenswrapper[4886]: E0314 08:34:58.370084 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeaec6eb-91cb-4e68-807f-994b4e9df360" containerName="registry-server" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.370257 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeaec6eb-91cb-4e68-807f-994b4e9df360" containerName="registry-server" Mar 14 08:34:58 crc kubenswrapper[4886]: E0314 08:34:58.370385 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeaec6eb-91cb-4e68-807f-994b4e9df360" containerName="extract-content" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.370514 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeaec6eb-91cb-4e68-807f-994b4e9df360" containerName="extract-content" Mar 14 08:34:58 crc kubenswrapper[4886]: E0314 08:34:58.370699 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="333059fe-3e95-4e08-b70e-d7d95e1ed279" 
containerName="extract-content" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.370861 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="333059fe-3e95-4e08-b70e-d7d95e1ed279" containerName="extract-content" Mar 14 08:34:58 crc kubenswrapper[4886]: E0314 08:34:58.371070 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98f142ad-f9c5-41ee-81ec-632938796964" containerName="registry-server" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.371228 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f142ad-f9c5-41ee-81ec-632938796964" containerName="registry-server" Mar 14 08:34:58 crc kubenswrapper[4886]: E0314 08:34:58.371389 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c355048-396b-4f00-8ff6-1ffff1d9d62c" containerName="registry-server" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.371502 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c355048-396b-4f00-8ff6-1ffff1d9d62c" containerName="registry-server" Mar 14 08:34:58 crc kubenswrapper[4886]: E0314 08:34:58.372709 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeaec6eb-91cb-4e68-807f-994b4e9df360" containerName="extract-utilities" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.372781 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeaec6eb-91cb-4e68-807f-994b4e9df360" containerName="extract-utilities" Mar 14 08:34:58 crc kubenswrapper[4886]: E0314 08:34:58.372843 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98f142ad-f9c5-41ee-81ec-632938796964" containerName="extract-content" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.372863 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f142ad-f9c5-41ee-81ec-632938796964" containerName="extract-content" Mar 14 08:34:58 crc kubenswrapper[4886]: E0314 08:34:58.372889 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="333059fe-3e95-4e08-b70e-d7d95e1ed279" 
containerName="registry-server" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.372906 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="333059fe-3e95-4e08-b70e-d7d95e1ed279" containerName="registry-server" Mar 14 08:34:58 crc kubenswrapper[4886]: E0314 08:34:58.372925 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c355048-396b-4f00-8ff6-1ffff1d9d62c" containerName="extract-content" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.372937 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c355048-396b-4f00-8ff6-1ffff1d9d62c" containerName="extract-content" Mar 14 08:34:58 crc kubenswrapper[4886]: E0314 08:34:58.372960 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c355048-396b-4f00-8ff6-1ffff1d9d62c" containerName="extract-utilities" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.372974 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c355048-396b-4f00-8ff6-1ffff1d9d62c" containerName="extract-utilities" Mar 14 08:34:58 crc kubenswrapper[4886]: E0314 08:34:58.372997 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea7323d6-f41b-4251-ae88-aa34a5714182" containerName="marketplace-operator" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.373011 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea7323d6-f41b-4251-ae88-aa34a5714182" containerName="marketplace-operator" Mar 14 08:34:58 crc kubenswrapper[4886]: E0314 08:34:58.373029 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98f142ad-f9c5-41ee-81ec-632938796964" containerName="extract-utilities" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.373042 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f142ad-f9c5-41ee-81ec-632938796964" containerName="extract-utilities" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.373463 4886 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="333059fe-3e95-4e08-b70e-d7d95e1ed279" containerName="registry-server" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.373488 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeaec6eb-91cb-4e68-807f-994b4e9df360" containerName="registry-server" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.373506 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c355048-396b-4f00-8ff6-1ffff1d9d62c" containerName="registry-server" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.373536 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea7323d6-f41b-4251-ae88-aa34a5714182" containerName="marketplace-operator" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.373551 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="98f142ad-f9c5-41ee-81ec-632938796964" containerName="registry-server" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.375353 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k6gs4" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.380505 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.397059 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k6gs4"] Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.459182 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed6613ed-adc1-4752-8bbd-e01372ddae6d-catalog-content\") pod \"redhat-marketplace-k6gs4\" (UID: \"ed6613ed-adc1-4752-8bbd-e01372ddae6d\") " pod="openshift-marketplace/redhat-marketplace-k6gs4" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.459227 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knqc5\" (UniqueName: \"kubernetes.io/projected/ed6613ed-adc1-4752-8bbd-e01372ddae6d-kube-api-access-knqc5\") pod \"redhat-marketplace-k6gs4\" (UID: \"ed6613ed-adc1-4752-8bbd-e01372ddae6d\") " pod="openshift-marketplace/redhat-marketplace-k6gs4" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.459277 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed6613ed-adc1-4752-8bbd-e01372ddae6d-utilities\") pod \"redhat-marketplace-k6gs4\" (UID: \"ed6613ed-adc1-4752-8bbd-e01372ddae6d\") " pod="openshift-marketplace/redhat-marketplace-k6gs4" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.559873 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed6613ed-adc1-4752-8bbd-e01372ddae6d-utilities\") pod \"redhat-marketplace-k6gs4\" (UID: 
\"ed6613ed-adc1-4752-8bbd-e01372ddae6d\") " pod="openshift-marketplace/redhat-marketplace-k6gs4" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.559937 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-v6gf2" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.559965 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed6613ed-adc1-4752-8bbd-e01372ddae6d-catalog-content\") pod \"redhat-marketplace-k6gs4\" (UID: \"ed6613ed-adc1-4752-8bbd-e01372ddae6d\") " pod="openshift-marketplace/redhat-marketplace-k6gs4" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.559990 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knqc5\" (UniqueName: \"kubernetes.io/projected/ed6613ed-adc1-4752-8bbd-e01372ddae6d-kube-api-access-knqc5\") pod \"redhat-marketplace-k6gs4\" (UID: \"ed6613ed-adc1-4752-8bbd-e01372ddae6d\") " pod="openshift-marketplace/redhat-marketplace-k6gs4" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.561030 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed6613ed-adc1-4752-8bbd-e01372ddae6d-catalog-content\") pod \"redhat-marketplace-k6gs4\" (UID: \"ed6613ed-adc1-4752-8bbd-e01372ddae6d\") " pod="openshift-marketplace/redhat-marketplace-k6gs4" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.562022 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mwvfs"] Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.563901 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed6613ed-adc1-4752-8bbd-e01372ddae6d-utilities\") pod \"redhat-marketplace-k6gs4\" (UID: \"ed6613ed-adc1-4752-8bbd-e01372ddae6d\") " 
pod="openshift-marketplace/redhat-marketplace-k6gs4" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.564243 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mwvfs" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.568784 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.575001 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mwvfs"] Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.618076 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knqc5\" (UniqueName: \"kubernetes.io/projected/ed6613ed-adc1-4752-8bbd-e01372ddae6d-kube-api-access-knqc5\") pod \"redhat-marketplace-k6gs4\" (UID: \"ed6613ed-adc1-4752-8bbd-e01372ddae6d\") " pod="openshift-marketplace/redhat-marketplace-k6gs4" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.661444 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw92f\" (UniqueName: \"kubernetes.io/projected/1586ae9d-89e0-4d92-8fbc-99a6d4ec3111-kube-api-access-cw92f\") pod \"redhat-operators-mwvfs\" (UID: \"1586ae9d-89e0-4d92-8fbc-99a6d4ec3111\") " pod="openshift-marketplace/redhat-operators-mwvfs" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.661644 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1586ae9d-89e0-4d92-8fbc-99a6d4ec3111-utilities\") pod \"redhat-operators-mwvfs\" (UID: \"1586ae9d-89e0-4d92-8fbc-99a6d4ec3111\") " pod="openshift-marketplace/redhat-operators-mwvfs" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.661843 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1586ae9d-89e0-4d92-8fbc-99a6d4ec3111-catalog-content\") pod \"redhat-operators-mwvfs\" (UID: \"1586ae9d-89e0-4d92-8fbc-99a6d4ec3111\") " pod="openshift-marketplace/redhat-operators-mwvfs" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.710013 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k6gs4" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.762620 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw92f\" (UniqueName: \"kubernetes.io/projected/1586ae9d-89e0-4d92-8fbc-99a6d4ec3111-kube-api-access-cw92f\") pod \"redhat-operators-mwvfs\" (UID: \"1586ae9d-89e0-4d92-8fbc-99a6d4ec3111\") " pod="openshift-marketplace/redhat-operators-mwvfs" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.762706 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1586ae9d-89e0-4d92-8fbc-99a6d4ec3111-utilities\") pod \"redhat-operators-mwvfs\" (UID: \"1586ae9d-89e0-4d92-8fbc-99a6d4ec3111\") " pod="openshift-marketplace/redhat-operators-mwvfs" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.762749 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1586ae9d-89e0-4d92-8fbc-99a6d4ec3111-catalog-content\") pod \"redhat-operators-mwvfs\" (UID: \"1586ae9d-89e0-4d92-8fbc-99a6d4ec3111\") " pod="openshift-marketplace/redhat-operators-mwvfs" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.763829 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1586ae9d-89e0-4d92-8fbc-99a6d4ec3111-catalog-content\") pod \"redhat-operators-mwvfs\" (UID: \"1586ae9d-89e0-4d92-8fbc-99a6d4ec3111\") " pod="openshift-marketplace/redhat-operators-mwvfs" Mar 
14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.764453 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1586ae9d-89e0-4d92-8fbc-99a6d4ec3111-utilities\") pod \"redhat-operators-mwvfs\" (UID: \"1586ae9d-89e0-4d92-8fbc-99a6d4ec3111\") " pod="openshift-marketplace/redhat-operators-mwvfs" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.789249 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw92f\" (UniqueName: \"kubernetes.io/projected/1586ae9d-89e0-4d92-8fbc-99a6d4ec3111-kube-api-access-cw92f\") pod \"redhat-operators-mwvfs\" (UID: \"1586ae9d-89e0-4d92-8fbc-99a6d4ec3111\") " pod="openshift-marketplace/redhat-operators-mwvfs" Mar 14 08:34:58 crc kubenswrapper[4886]: I0314 08:34:58.893833 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mwvfs" Mar 14 08:34:59 crc kubenswrapper[4886]: I0314 08:34:59.159731 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mwvfs"] Mar 14 08:34:59 crc kubenswrapper[4886]: I0314 08:34:59.180802 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k6gs4"] Mar 14 08:34:59 crc kubenswrapper[4886]: I0314 08:34:59.429282 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="333059fe-3e95-4e08-b70e-d7d95e1ed279" path="/var/lib/kubelet/pods/333059fe-3e95-4e08-b70e-d7d95e1ed279/volumes" Mar 14 08:34:59 crc kubenswrapper[4886]: I0314 08:34:59.432418 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c355048-396b-4f00-8ff6-1ffff1d9d62c" path="/var/lib/kubelet/pods/7c355048-396b-4f00-8ff6-1ffff1d9d62c/volumes" Mar 14 08:34:59 crc kubenswrapper[4886]: I0314 08:34:59.434099 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98f142ad-f9c5-41ee-81ec-632938796964" 
path="/var/lib/kubelet/pods/98f142ad-f9c5-41ee-81ec-632938796964/volumes" Mar 14 08:34:59 crc kubenswrapper[4886]: I0314 08:34:59.436725 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeaec6eb-91cb-4e68-807f-994b4e9df360" path="/var/lib/kubelet/pods/aeaec6eb-91cb-4e68-807f-994b4e9df360/volumes" Mar 14 08:34:59 crc kubenswrapper[4886]: I0314 08:34:59.438058 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea7323d6-f41b-4251-ae88-aa34a5714182" path="/var/lib/kubelet/pods/ea7323d6-f41b-4251-ae88-aa34a5714182/volumes" Mar 14 08:34:59 crc kubenswrapper[4886]: E0314 08:34:59.440160 4886 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1586ae9d_89e0_4d92_8fbc_99a6d4ec3111.slice/crio-90d3b0843a93ad45680a3820a58090a827ca53c201db0eb53fab8fc0765dbc6d.scope\": RecentStats: unable to find data in memory cache]" Mar 14 08:34:59 crc kubenswrapper[4886]: I0314 08:34:59.564355 4886 generic.go:334] "Generic (PLEG): container finished" podID="ed6613ed-adc1-4752-8bbd-e01372ddae6d" containerID="c592fd73d1635e4bbf79ef537a979faa0c5ef6969e3f11892bd135064c675418" exitCode=0 Mar 14 08:34:59 crc kubenswrapper[4886]: I0314 08:34:59.564607 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k6gs4" event={"ID":"ed6613ed-adc1-4752-8bbd-e01372ddae6d","Type":"ContainerDied","Data":"c592fd73d1635e4bbf79ef537a979faa0c5ef6969e3f11892bd135064c675418"} Mar 14 08:34:59 crc kubenswrapper[4886]: I0314 08:34:59.564717 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k6gs4" event={"ID":"ed6613ed-adc1-4752-8bbd-e01372ddae6d","Type":"ContainerStarted","Data":"fbceca761b7632452f1e7f0bbf77b8517bebb7514b578aed3e3e8df09acbf3c8"} Mar 14 08:34:59 crc kubenswrapper[4886]: I0314 08:34:59.570209 4886 generic.go:334] "Generic (PLEG): container 
finished" podID="1586ae9d-89e0-4d92-8fbc-99a6d4ec3111" containerID="90d3b0843a93ad45680a3820a58090a827ca53c201db0eb53fab8fc0765dbc6d" exitCode=0 Mar 14 08:34:59 crc kubenswrapper[4886]: I0314 08:34:59.570422 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mwvfs" event={"ID":"1586ae9d-89e0-4d92-8fbc-99a6d4ec3111","Type":"ContainerDied","Data":"90d3b0843a93ad45680a3820a58090a827ca53c201db0eb53fab8fc0765dbc6d"} Mar 14 08:34:59 crc kubenswrapper[4886]: I0314 08:34:59.570811 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mwvfs" event={"ID":"1586ae9d-89e0-4d92-8fbc-99a6d4ec3111","Type":"ContainerStarted","Data":"f5afc19edf43c5d0a836af4e2bbc5290192c8ffe49e0a3b0dd3c8b775aaa7c04"} Mar 14 08:35:00 crc kubenswrapper[4886]: I0314 08:35:00.577390 4886 generic.go:334] "Generic (PLEG): container finished" podID="ed6613ed-adc1-4752-8bbd-e01372ddae6d" containerID="5648feff81191eae5ae4489e4ccf578220fe57faae1943299c7980002d554bd6" exitCode=0 Mar 14 08:35:00 crc kubenswrapper[4886]: I0314 08:35:00.577540 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k6gs4" event={"ID":"ed6613ed-adc1-4752-8bbd-e01372ddae6d","Type":"ContainerDied","Data":"5648feff81191eae5ae4489e4ccf578220fe57faae1943299c7980002d554bd6"} Mar 14 08:35:00 crc kubenswrapper[4886]: I0314 08:35:00.750715 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g5kmf"] Mar 14 08:35:00 crc kubenswrapper[4886]: I0314 08:35:00.751705 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g5kmf" Mar 14 08:35:00 crc kubenswrapper[4886]: I0314 08:35:00.753990 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 14 08:35:00 crc kubenswrapper[4886]: I0314 08:35:00.769011 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g5kmf"] Mar 14 08:35:00 crc kubenswrapper[4886]: I0314 08:35:00.901137 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54007e16-87f1-4fed-a62b-d4936c61998c-catalog-content\") pod \"certified-operators-g5kmf\" (UID: \"54007e16-87f1-4fed-a62b-d4936c61998c\") " pod="openshift-marketplace/certified-operators-g5kmf" Mar 14 08:35:00 crc kubenswrapper[4886]: I0314 08:35:00.901231 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xql66\" (UniqueName: \"kubernetes.io/projected/54007e16-87f1-4fed-a62b-d4936c61998c-kube-api-access-xql66\") pod \"certified-operators-g5kmf\" (UID: \"54007e16-87f1-4fed-a62b-d4936c61998c\") " pod="openshift-marketplace/certified-operators-g5kmf" Mar 14 08:35:00 crc kubenswrapper[4886]: I0314 08:35:00.901350 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54007e16-87f1-4fed-a62b-d4936c61998c-utilities\") pod \"certified-operators-g5kmf\" (UID: \"54007e16-87f1-4fed-a62b-d4936c61998c\") " pod="openshift-marketplace/certified-operators-g5kmf" Mar 14 08:35:00 crc kubenswrapper[4886]: I0314 08:35:00.952770 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ld96m"] Mar 14 08:35:00 crc kubenswrapper[4886]: I0314 08:35:00.953847 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ld96m" Mar 14 08:35:00 crc kubenswrapper[4886]: I0314 08:35:00.956475 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 14 08:35:00 crc kubenswrapper[4886]: I0314 08:35:00.966630 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ld96m"] Mar 14 08:35:01 crc kubenswrapper[4886]: I0314 08:35:01.002740 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xql66\" (UniqueName: \"kubernetes.io/projected/54007e16-87f1-4fed-a62b-d4936c61998c-kube-api-access-xql66\") pod \"certified-operators-g5kmf\" (UID: \"54007e16-87f1-4fed-a62b-d4936c61998c\") " pod="openshift-marketplace/certified-operators-g5kmf" Mar 14 08:35:01 crc kubenswrapper[4886]: I0314 08:35:01.002880 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54007e16-87f1-4fed-a62b-d4936c61998c-utilities\") pod \"certified-operators-g5kmf\" (UID: \"54007e16-87f1-4fed-a62b-d4936c61998c\") " pod="openshift-marketplace/certified-operators-g5kmf" Mar 14 08:35:01 crc kubenswrapper[4886]: I0314 08:35:01.002986 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54007e16-87f1-4fed-a62b-d4936c61998c-catalog-content\") pod \"certified-operators-g5kmf\" (UID: \"54007e16-87f1-4fed-a62b-d4936c61998c\") " pod="openshift-marketplace/certified-operators-g5kmf" Mar 14 08:35:01 crc kubenswrapper[4886]: I0314 08:35:01.003471 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54007e16-87f1-4fed-a62b-d4936c61998c-utilities\") pod \"certified-operators-g5kmf\" (UID: \"54007e16-87f1-4fed-a62b-d4936c61998c\") " 
pod="openshift-marketplace/certified-operators-g5kmf" Mar 14 08:35:01 crc kubenswrapper[4886]: I0314 08:35:01.003487 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54007e16-87f1-4fed-a62b-d4936c61998c-catalog-content\") pod \"certified-operators-g5kmf\" (UID: \"54007e16-87f1-4fed-a62b-d4936c61998c\") " pod="openshift-marketplace/certified-operators-g5kmf" Mar 14 08:35:01 crc kubenswrapper[4886]: I0314 08:35:01.028514 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xql66\" (UniqueName: \"kubernetes.io/projected/54007e16-87f1-4fed-a62b-d4936c61998c-kube-api-access-xql66\") pod \"certified-operators-g5kmf\" (UID: \"54007e16-87f1-4fed-a62b-d4936c61998c\") " pod="openshift-marketplace/certified-operators-g5kmf" Mar 14 08:35:01 crc kubenswrapper[4886]: I0314 08:35:01.103945 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dfv4\" (UniqueName: \"kubernetes.io/projected/c3268114-3be9-4c4c-b225-1c9024a5b341-kube-api-access-2dfv4\") pod \"community-operators-ld96m\" (UID: \"c3268114-3be9-4c4c-b225-1c9024a5b341\") " pod="openshift-marketplace/community-operators-ld96m" Mar 14 08:35:01 crc kubenswrapper[4886]: I0314 08:35:01.104038 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3268114-3be9-4c4c-b225-1c9024a5b341-catalog-content\") pod \"community-operators-ld96m\" (UID: \"c3268114-3be9-4c4c-b225-1c9024a5b341\") " pod="openshift-marketplace/community-operators-ld96m" Mar 14 08:35:01 crc kubenswrapper[4886]: I0314 08:35:01.104086 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3268114-3be9-4c4c-b225-1c9024a5b341-utilities\") pod \"community-operators-ld96m\" (UID: 
\"c3268114-3be9-4c4c-b225-1c9024a5b341\") " pod="openshift-marketplace/community-operators-ld96m" Mar 14 08:35:01 crc kubenswrapper[4886]: I0314 08:35:01.106761 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g5kmf" Mar 14 08:35:01 crc kubenswrapper[4886]: I0314 08:35:01.205595 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dfv4\" (UniqueName: \"kubernetes.io/projected/c3268114-3be9-4c4c-b225-1c9024a5b341-kube-api-access-2dfv4\") pod \"community-operators-ld96m\" (UID: \"c3268114-3be9-4c4c-b225-1c9024a5b341\") " pod="openshift-marketplace/community-operators-ld96m" Mar 14 08:35:01 crc kubenswrapper[4886]: I0314 08:35:01.205872 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3268114-3be9-4c4c-b225-1c9024a5b341-catalog-content\") pod \"community-operators-ld96m\" (UID: \"c3268114-3be9-4c4c-b225-1c9024a5b341\") " pod="openshift-marketplace/community-operators-ld96m" Mar 14 08:35:01 crc kubenswrapper[4886]: I0314 08:35:01.205918 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3268114-3be9-4c4c-b225-1c9024a5b341-utilities\") pod \"community-operators-ld96m\" (UID: \"c3268114-3be9-4c4c-b225-1c9024a5b341\") " pod="openshift-marketplace/community-operators-ld96m" Mar 14 08:35:01 crc kubenswrapper[4886]: I0314 08:35:01.206541 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3268114-3be9-4c4c-b225-1c9024a5b341-utilities\") pod \"community-operators-ld96m\" (UID: \"c3268114-3be9-4c4c-b225-1c9024a5b341\") " pod="openshift-marketplace/community-operators-ld96m" Mar 14 08:35:01 crc kubenswrapper[4886]: I0314 08:35:01.209368 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3268114-3be9-4c4c-b225-1c9024a5b341-catalog-content\") pod \"community-operators-ld96m\" (UID: \"c3268114-3be9-4c4c-b225-1c9024a5b341\") " pod="openshift-marketplace/community-operators-ld96m" Mar 14 08:35:01 crc kubenswrapper[4886]: I0314 08:35:01.223754 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dfv4\" (UniqueName: \"kubernetes.io/projected/c3268114-3be9-4c4c-b225-1c9024a5b341-kube-api-access-2dfv4\") pod \"community-operators-ld96m\" (UID: \"c3268114-3be9-4c4c-b225-1c9024a5b341\") " pod="openshift-marketplace/community-operators-ld96m" Mar 14 08:35:01 crc kubenswrapper[4886]: I0314 08:35:01.274614 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ld96m" Mar 14 08:35:01 crc kubenswrapper[4886]: I0314 08:35:01.319365 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g5kmf"] Mar 14 08:35:01 crc kubenswrapper[4886]: W0314 08:35:01.331504 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54007e16_87f1_4fed_a62b_d4936c61998c.slice/crio-ab9afd23f5c90d2f5e4da65300e6cccc26815288d8b166e3cf32e6d18c7a530b WatchSource:0}: Error finding container ab9afd23f5c90d2f5e4da65300e6cccc26815288d8b166e3cf32e6d18c7a530b: Status 404 returned error can't find the container with id ab9afd23f5c90d2f5e4da65300e6cccc26815288d8b166e3cf32e6d18c7a530b Mar 14 08:35:01 crc kubenswrapper[4886]: I0314 08:35:01.492885 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ld96m"] Mar 14 08:35:01 crc kubenswrapper[4886]: W0314 08:35:01.577924 4886 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3268114_3be9_4c4c_b225_1c9024a5b341.slice/crio-e5a69b533c289ebbd08810823f8b6ed76c5b6b0b825a1f3d2eff71551f39db4f WatchSource:0}: Error finding container e5a69b533c289ebbd08810823f8b6ed76c5b6b0b825a1f3d2eff71551f39db4f: Status 404 returned error can't find the container with id e5a69b533c289ebbd08810823f8b6ed76c5b6b0b825a1f3d2eff71551f39db4f Mar 14 08:35:01 crc kubenswrapper[4886]: I0314 08:35:01.585195 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k6gs4" event={"ID":"ed6613ed-adc1-4752-8bbd-e01372ddae6d","Type":"ContainerStarted","Data":"b4f72b798bd69fb2ae7881685aaab55ab94268b7f3ad0cc3a5aeb2552a5e20a1"} Mar 14 08:35:01 crc kubenswrapper[4886]: I0314 08:35:01.586331 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ld96m" event={"ID":"c3268114-3be9-4c4c-b225-1c9024a5b341","Type":"ContainerStarted","Data":"e5a69b533c289ebbd08810823f8b6ed76c5b6b0b825a1f3d2eff71551f39db4f"} Mar 14 08:35:01 crc kubenswrapper[4886]: I0314 08:35:01.588469 4886 generic.go:334] "Generic (PLEG): container finished" podID="1586ae9d-89e0-4d92-8fbc-99a6d4ec3111" containerID="3411156fa2d6b04f51e045b5e7576df89535aba316767ecddedfdbb96892430d" exitCode=0 Mar 14 08:35:01 crc kubenswrapper[4886]: I0314 08:35:01.588511 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mwvfs" event={"ID":"1586ae9d-89e0-4d92-8fbc-99a6d4ec3111","Type":"ContainerDied","Data":"3411156fa2d6b04f51e045b5e7576df89535aba316767ecddedfdbb96892430d"} Mar 14 08:35:01 crc kubenswrapper[4886]: I0314 08:35:01.591849 4886 generic.go:334] "Generic (PLEG): container finished" podID="54007e16-87f1-4fed-a62b-d4936c61998c" containerID="40d5d57136451d31554fcfbb462cb66adb841a9ab9e56bd5a684c8b9ecd1494f" exitCode=0 Mar 14 08:35:01 crc kubenswrapper[4886]: I0314 08:35:01.591888 4886 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-g5kmf" event={"ID":"54007e16-87f1-4fed-a62b-d4936c61998c","Type":"ContainerDied","Data":"40d5d57136451d31554fcfbb462cb66adb841a9ab9e56bd5a684c8b9ecd1494f"} Mar 14 08:35:01 crc kubenswrapper[4886]: I0314 08:35:01.591915 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g5kmf" event={"ID":"54007e16-87f1-4fed-a62b-d4936c61998c","Type":"ContainerStarted","Data":"ab9afd23f5c90d2f5e4da65300e6cccc26815288d8b166e3cf32e6d18c7a530b"} Mar 14 08:35:01 crc kubenswrapper[4886]: I0314 08:35:01.618455 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k6gs4" podStartSLOduration=2.046806362 podStartE2EDuration="3.618433171s" podCreationTimestamp="2026-03-14 08:34:58 +0000 UTC" firstStartedPulling="2026-03-14 08:34:59.567243857 +0000 UTC m=+434.815695484" lastFinishedPulling="2026-03-14 08:35:01.138870656 +0000 UTC m=+436.387322293" observedRunningTime="2026-03-14 08:35:01.600645751 +0000 UTC m=+436.849097388" watchObservedRunningTime="2026-03-14 08:35:01.618433171 +0000 UTC m=+436.866884808" Mar 14 08:35:02 crc kubenswrapper[4886]: I0314 08:35:02.609422 4886 generic.go:334] "Generic (PLEG): container finished" podID="54007e16-87f1-4fed-a62b-d4936c61998c" containerID="ad2e316fa22de3ed6411bcb453f01bdb5480a2bf40fab104c13ba1a70e5ba4bf" exitCode=0 Mar 14 08:35:02 crc kubenswrapper[4886]: I0314 08:35:02.609516 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g5kmf" event={"ID":"54007e16-87f1-4fed-a62b-d4936c61998c","Type":"ContainerDied","Data":"ad2e316fa22de3ed6411bcb453f01bdb5480a2bf40fab104c13ba1a70e5ba4bf"} Mar 14 08:35:02 crc kubenswrapper[4886]: I0314 08:35:02.611546 4886 generic.go:334] "Generic (PLEG): container finished" podID="c3268114-3be9-4c4c-b225-1c9024a5b341" containerID="cd5ed4d1f6cbe917e0591e13bb08a6e93146349f4741a0ed4b8e87b79b6e4d28" exitCode=0 
Mar 14 08:35:02 crc kubenswrapper[4886]: I0314 08:35:02.611672 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ld96m" event={"ID":"c3268114-3be9-4c4c-b225-1c9024a5b341","Type":"ContainerDied","Data":"cd5ed4d1f6cbe917e0591e13bb08a6e93146349f4741a0ed4b8e87b79b6e4d28"} Mar 14 08:35:02 crc kubenswrapper[4886]: I0314 08:35:02.621634 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mwvfs" event={"ID":"1586ae9d-89e0-4d92-8fbc-99a6d4ec3111","Type":"ContainerStarted","Data":"aa192b30550e5c3216043aebe14d6dde66611fffbc91ead7fad3625b202eb39c"} Mar 14 08:35:02 crc kubenswrapper[4886]: I0314 08:35:02.647926 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mwvfs" podStartSLOduration=2.192807058 podStartE2EDuration="4.647912688s" podCreationTimestamp="2026-03-14 08:34:58 +0000 UTC" firstStartedPulling="2026-03-14 08:34:59.573501976 +0000 UTC m=+434.821953613" lastFinishedPulling="2026-03-14 08:35:02.028607606 +0000 UTC m=+437.277059243" observedRunningTime="2026-03-14 08:35:02.642206785 +0000 UTC m=+437.890658422" watchObservedRunningTime="2026-03-14 08:35:02.647912688 +0000 UTC m=+437.896364325" Mar 14 08:35:03 crc kubenswrapper[4886]: I0314 08:35:03.627139 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g5kmf" event={"ID":"54007e16-87f1-4fed-a62b-d4936c61998c","Type":"ContainerStarted","Data":"fd919796768734597ccb87a2d7705508514c7d3dd4d582efe51ffec72532a1b8"} Mar 14 08:35:03 crc kubenswrapper[4886]: I0314 08:35:03.629409 4886 generic.go:334] "Generic (PLEG): container finished" podID="c3268114-3be9-4c4c-b225-1c9024a5b341" containerID="b36c33f5f198fbc9d398d735b8682497a5b8939fd7891f41665a0c25bff9999c" exitCode=0 Mar 14 08:35:03 crc kubenswrapper[4886]: I0314 08:35:03.630172 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-ld96m" event={"ID":"c3268114-3be9-4c4c-b225-1c9024a5b341","Type":"ContainerDied","Data":"b36c33f5f198fbc9d398d735b8682497a5b8939fd7891f41665a0c25bff9999c"} Mar 14 08:35:03 crc kubenswrapper[4886]: I0314 08:35:03.665705 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g5kmf" podStartSLOduration=2.183321114 podStartE2EDuration="3.665684621s" podCreationTimestamp="2026-03-14 08:35:00 +0000 UTC" firstStartedPulling="2026-03-14 08:35:01.593781704 +0000 UTC m=+436.842233351" lastFinishedPulling="2026-03-14 08:35:03.076145221 +0000 UTC m=+438.324596858" observedRunningTime="2026-03-14 08:35:03.648694584 +0000 UTC m=+438.897146231" watchObservedRunningTime="2026-03-14 08:35:03.665684621 +0000 UTC m=+438.914136258" Mar 14 08:35:04 crc kubenswrapper[4886]: I0314 08:35:04.637684 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ld96m" event={"ID":"c3268114-3be9-4c4c-b225-1c9024a5b341","Type":"ContainerStarted","Data":"35bd080bdf0ea1bfb0757af79192a560192db75db4aebf3f34771f48455eb02d"} Mar 14 08:35:08 crc kubenswrapper[4886]: I0314 08:35:08.710748 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k6gs4" Mar 14 08:35:08 crc kubenswrapper[4886]: I0314 08:35:08.711169 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k6gs4" Mar 14 08:35:08 crc kubenswrapper[4886]: I0314 08:35:08.772824 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k6gs4" Mar 14 08:35:08 crc kubenswrapper[4886]: I0314 08:35:08.795640 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ld96m" podStartSLOduration=7.356762781 podStartE2EDuration="8.795613891s" 
podCreationTimestamp="2026-03-14 08:35:00 +0000 UTC" firstStartedPulling="2026-03-14 08:35:02.619095422 +0000 UTC m=+437.867547059" lastFinishedPulling="2026-03-14 08:35:04.057946512 +0000 UTC m=+439.306398169" observedRunningTime="2026-03-14 08:35:04.661676399 +0000 UTC m=+439.910128046" watchObservedRunningTime="2026-03-14 08:35:08.795613891 +0000 UTC m=+444.044065548" Mar 14 08:35:08 crc kubenswrapper[4886]: I0314 08:35:08.895159 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mwvfs" Mar 14 08:35:08 crc kubenswrapper[4886]: I0314 08:35:08.895205 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mwvfs" Mar 14 08:35:09 crc kubenswrapper[4886]: I0314 08:35:09.724184 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k6gs4" Mar 14 08:35:09 crc kubenswrapper[4886]: I0314 08:35:09.946690 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mwvfs" podUID="1586ae9d-89e0-4d92-8fbc-99a6d4ec3111" containerName="registry-server" probeResult="failure" output=< Mar 14 08:35:09 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s Mar 14 08:35:09 crc kubenswrapper[4886]: > Mar 14 08:35:11 crc kubenswrapper[4886]: I0314 08:35:11.107518 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g5kmf" Mar 14 08:35:11 crc kubenswrapper[4886]: I0314 08:35:11.107567 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g5kmf" Mar 14 08:35:11 crc kubenswrapper[4886]: I0314 08:35:11.165309 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g5kmf" Mar 14 08:35:11 crc kubenswrapper[4886]: I0314 08:35:11.274999 4886 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ld96m" Mar 14 08:35:11 crc kubenswrapper[4886]: I0314 08:35:11.275048 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ld96m" Mar 14 08:35:11 crc kubenswrapper[4886]: I0314 08:35:11.331983 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ld96m" Mar 14 08:35:11 crc kubenswrapper[4886]: I0314 08:35:11.722666 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g5kmf" Mar 14 08:35:11 crc kubenswrapper[4886]: I0314 08:35:11.733812 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ld96m" Mar 14 08:35:18 crc kubenswrapper[4886]: I0314 08:35:18.942714 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mwvfs" Mar 14 08:35:19 crc kubenswrapper[4886]: I0314 08:35:19.000268 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mwvfs" Mar 14 08:35:26 crc kubenswrapper[4886]: I0314 08:35:26.066005 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:35:26 crc kubenswrapper[4886]: I0314 08:35:26.066491 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 14 08:35:26 crc kubenswrapper[4886]: I0314 08:35:26.066534 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 08:35:26 crc kubenswrapper[4886]: I0314 08:35:26.067086 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"424638e056b0f6bf732f3fa83ff43260b56297a10248482736a5bafa61119a1d"} pod="openshift-machine-config-operator/machine-config-daemon-ddctv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 08:35:26 crc kubenswrapper[4886]: I0314 08:35:26.067164 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" containerID="cri-o://424638e056b0f6bf732f3fa83ff43260b56297a10248482736a5bafa61119a1d" gracePeriod=600 Mar 14 08:35:26 crc kubenswrapper[4886]: I0314 08:35:26.761923 4886 generic.go:334] "Generic (PLEG): container finished" podID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerID="424638e056b0f6bf732f3fa83ff43260b56297a10248482736a5bafa61119a1d" exitCode=0 Mar 14 08:35:26 crc kubenswrapper[4886]: I0314 08:35:26.762001 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerDied","Data":"424638e056b0f6bf732f3fa83ff43260b56297a10248482736a5bafa61119a1d"} Mar 14 08:35:26 crc kubenswrapper[4886]: I0314 08:35:26.762557 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerStarted","Data":"b6fcb3fc79e936b9bc2b8e43a35854890b02c11cd8b127b894daab5c52af2a2e"} Mar 14 08:35:26 crc 
kubenswrapper[4886]: I0314 08:35:26.762601 4886 scope.go:117] "RemoveContainer" containerID="701c6f56134d398fe162df1c3747bdbc98df4c6bf9f73f59acde819d374d239a" Mar 14 08:35:27 crc kubenswrapper[4886]: I0314 08:35:27.219486 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7fdc9849d6-df7m9"] Mar 14 08:35:27 crc kubenswrapper[4886]: I0314 08:35:27.220207 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7fdc9849d6-df7m9" podUID="3d3c670e-9026-4f83-8bfc-1c42774c7851" containerName="controller-manager" containerID="cri-o://3311029abf84f130118e4b36886a38b58bc28bf7f2270b64faf5a4a2574912b2" gracePeriod=30 Mar 14 08:35:27 crc kubenswrapper[4886]: I0314 08:35:27.589792 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7fdc9849d6-df7m9" Mar 14 08:35:27 crc kubenswrapper[4886]: I0314 08:35:27.644510 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jslzm\" (UniqueName: \"kubernetes.io/projected/3d3c670e-9026-4f83-8bfc-1c42774c7851-kube-api-access-jslzm\") pod \"3d3c670e-9026-4f83-8bfc-1c42774c7851\" (UID: \"3d3c670e-9026-4f83-8bfc-1c42774c7851\") " Mar 14 08:35:27 crc kubenswrapper[4886]: I0314 08:35:27.644578 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d3c670e-9026-4f83-8bfc-1c42774c7851-config\") pod \"3d3c670e-9026-4f83-8bfc-1c42774c7851\" (UID: \"3d3c670e-9026-4f83-8bfc-1c42774c7851\") " Mar 14 08:35:27 crc kubenswrapper[4886]: I0314 08:35:27.644630 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3d3c670e-9026-4f83-8bfc-1c42774c7851-proxy-ca-bundles\") pod \"3d3c670e-9026-4f83-8bfc-1c42774c7851\" (UID: 
\"3d3c670e-9026-4f83-8bfc-1c42774c7851\") " Mar 14 08:35:27 crc kubenswrapper[4886]: I0314 08:35:27.644700 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d3c670e-9026-4f83-8bfc-1c42774c7851-client-ca\") pod \"3d3c670e-9026-4f83-8bfc-1c42774c7851\" (UID: \"3d3c670e-9026-4f83-8bfc-1c42774c7851\") " Mar 14 08:35:27 crc kubenswrapper[4886]: I0314 08:35:27.645342 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d3c670e-9026-4f83-8bfc-1c42774c7851-client-ca" (OuterVolumeSpecName: "client-ca") pod "3d3c670e-9026-4f83-8bfc-1c42774c7851" (UID: "3d3c670e-9026-4f83-8bfc-1c42774c7851"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:35:27 crc kubenswrapper[4886]: I0314 08:35:27.645415 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d3c670e-9026-4f83-8bfc-1c42774c7851-config" (OuterVolumeSpecName: "config") pod "3d3c670e-9026-4f83-8bfc-1c42774c7851" (UID: "3d3c670e-9026-4f83-8bfc-1c42774c7851"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:35:27 crc kubenswrapper[4886]: I0314 08:35:27.645500 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d3c670e-9026-4f83-8bfc-1c42774c7851-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3d3c670e-9026-4f83-8bfc-1c42774c7851" (UID: "3d3c670e-9026-4f83-8bfc-1c42774c7851"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:35:27 crc kubenswrapper[4886]: I0314 08:35:27.645639 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d3c670e-9026-4f83-8bfc-1c42774c7851-serving-cert\") pod \"3d3c670e-9026-4f83-8bfc-1c42774c7851\" (UID: \"3d3c670e-9026-4f83-8bfc-1c42774c7851\") " Mar 14 08:35:27 crc kubenswrapper[4886]: I0314 08:35:27.646697 4886 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3d3c670e-9026-4f83-8bfc-1c42774c7851-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 14 08:35:27 crc kubenswrapper[4886]: I0314 08:35:27.646720 4886 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3d3c670e-9026-4f83-8bfc-1c42774c7851-client-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:35:27 crc kubenswrapper[4886]: I0314 08:35:27.646733 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d3c670e-9026-4f83-8bfc-1c42774c7851-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:35:27 crc kubenswrapper[4886]: I0314 08:35:27.652579 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d3c670e-9026-4f83-8bfc-1c42774c7851-kube-api-access-jslzm" (OuterVolumeSpecName: "kube-api-access-jslzm") pod "3d3c670e-9026-4f83-8bfc-1c42774c7851" (UID: "3d3c670e-9026-4f83-8bfc-1c42774c7851"). InnerVolumeSpecName "kube-api-access-jslzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:35:27 crc kubenswrapper[4886]: I0314 08:35:27.652646 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d3c670e-9026-4f83-8bfc-1c42774c7851-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3d3c670e-9026-4f83-8bfc-1c42774c7851" (UID: "3d3c670e-9026-4f83-8bfc-1c42774c7851"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:35:27 crc kubenswrapper[4886]: I0314 08:35:27.748167 4886 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d3c670e-9026-4f83-8bfc-1c42774c7851-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:35:27 crc kubenswrapper[4886]: I0314 08:35:27.748204 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jslzm\" (UniqueName: \"kubernetes.io/projected/3d3c670e-9026-4f83-8bfc-1c42774c7851-kube-api-access-jslzm\") on node \"crc\" DevicePath \"\"" Mar 14 08:35:27 crc kubenswrapper[4886]: I0314 08:35:27.772227 4886 generic.go:334] "Generic (PLEG): container finished" podID="3d3c670e-9026-4f83-8bfc-1c42774c7851" containerID="3311029abf84f130118e4b36886a38b58bc28bf7f2270b64faf5a4a2574912b2" exitCode=0 Mar 14 08:35:27 crc kubenswrapper[4886]: I0314 08:35:27.772292 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7fdc9849d6-df7m9" Mar 14 08:35:27 crc kubenswrapper[4886]: I0314 08:35:27.772305 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fdc9849d6-df7m9" event={"ID":"3d3c670e-9026-4f83-8bfc-1c42774c7851","Type":"ContainerDied","Data":"3311029abf84f130118e4b36886a38b58bc28bf7f2270b64faf5a4a2574912b2"} Mar 14 08:35:27 crc kubenswrapper[4886]: I0314 08:35:27.772331 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fdc9849d6-df7m9" event={"ID":"3d3c670e-9026-4f83-8bfc-1c42774c7851","Type":"ContainerDied","Data":"0f6207e24402d7b06426489629bff7f9795dbbcef54778b17b9a8fffa343ac0e"} Mar 14 08:35:27 crc kubenswrapper[4886]: I0314 08:35:27.772348 4886 scope.go:117] "RemoveContainer" containerID="3311029abf84f130118e4b36886a38b58bc28bf7f2270b64faf5a4a2574912b2" Mar 14 08:35:27 crc kubenswrapper[4886]: I0314 08:35:27.801947 4886 scope.go:117] "RemoveContainer" containerID="3311029abf84f130118e4b36886a38b58bc28bf7f2270b64faf5a4a2574912b2" Mar 14 08:35:27 crc kubenswrapper[4886]: E0314 08:35:27.802939 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3311029abf84f130118e4b36886a38b58bc28bf7f2270b64faf5a4a2574912b2\": container with ID starting with 3311029abf84f130118e4b36886a38b58bc28bf7f2270b64faf5a4a2574912b2 not found: ID does not exist" containerID="3311029abf84f130118e4b36886a38b58bc28bf7f2270b64faf5a4a2574912b2" Mar 14 08:35:27 crc kubenswrapper[4886]: I0314 08:35:27.802981 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3311029abf84f130118e4b36886a38b58bc28bf7f2270b64faf5a4a2574912b2"} err="failed to get container status \"3311029abf84f130118e4b36886a38b58bc28bf7f2270b64faf5a4a2574912b2\": rpc error: code = NotFound desc = could not find container 
\"3311029abf84f130118e4b36886a38b58bc28bf7f2270b64faf5a4a2574912b2\": container with ID starting with 3311029abf84f130118e4b36886a38b58bc28bf7f2270b64faf5a4a2574912b2 not found: ID does not exist" Mar 14 08:35:27 crc kubenswrapper[4886]: I0314 08:35:27.804221 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7fdc9849d6-df7m9"] Mar 14 08:35:27 crc kubenswrapper[4886]: I0314 08:35:27.806470 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7fdc9849d6-df7m9"] Mar 14 08:35:28 crc kubenswrapper[4886]: I0314 08:35:28.879862 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-66cf58567-vgs2d"] Mar 14 08:35:28 crc kubenswrapper[4886]: E0314 08:35:28.880379 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d3c670e-9026-4f83-8bfc-1c42774c7851" containerName="controller-manager" Mar 14 08:35:28 crc kubenswrapper[4886]: I0314 08:35:28.880392 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d3c670e-9026-4f83-8bfc-1c42774c7851" containerName="controller-manager" Mar 14 08:35:28 crc kubenswrapper[4886]: I0314 08:35:28.880489 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d3c670e-9026-4f83-8bfc-1c42774c7851" containerName="controller-manager" Mar 14 08:35:28 crc kubenswrapper[4886]: I0314 08:35:28.880832 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66cf58567-vgs2d" Mar 14 08:35:28 crc kubenswrapper[4886]: I0314 08:35:28.886050 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 14 08:35:28 crc kubenswrapper[4886]: I0314 08:35:28.886301 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 14 08:35:28 crc kubenswrapper[4886]: I0314 08:35:28.886371 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 14 08:35:28 crc kubenswrapper[4886]: I0314 08:35:28.893016 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 14 08:35:28 crc kubenswrapper[4886]: I0314 08:35:28.893360 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 14 08:35:28 crc kubenswrapper[4886]: I0314 08:35:28.893595 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 14 08:35:28 crc kubenswrapper[4886]: I0314 08:35:28.903654 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66cf58567-vgs2d"] Mar 14 08:35:28 crc kubenswrapper[4886]: I0314 08:35:28.909249 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 14 08:35:28 crc kubenswrapper[4886]: I0314 08:35:28.962017 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5qdp\" (UniqueName: \"kubernetes.io/projected/1ff77274-c0cf-4ef7-8a62-5ca93de936ac-kube-api-access-v5qdp\") pod \"controller-manager-66cf58567-vgs2d\" (UID: \"1ff77274-c0cf-4ef7-8a62-5ca93de936ac\") " 
pod="openshift-controller-manager/controller-manager-66cf58567-vgs2d" Mar 14 08:35:28 crc kubenswrapper[4886]: I0314 08:35:28.962081 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1ff77274-c0cf-4ef7-8a62-5ca93de936ac-proxy-ca-bundles\") pod \"controller-manager-66cf58567-vgs2d\" (UID: \"1ff77274-c0cf-4ef7-8a62-5ca93de936ac\") " pod="openshift-controller-manager/controller-manager-66cf58567-vgs2d" Mar 14 08:35:28 crc kubenswrapper[4886]: I0314 08:35:28.962107 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ff77274-c0cf-4ef7-8a62-5ca93de936ac-client-ca\") pod \"controller-manager-66cf58567-vgs2d\" (UID: \"1ff77274-c0cf-4ef7-8a62-5ca93de936ac\") " pod="openshift-controller-manager/controller-manager-66cf58567-vgs2d" Mar 14 08:35:28 crc kubenswrapper[4886]: I0314 08:35:28.962311 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ff77274-c0cf-4ef7-8a62-5ca93de936ac-serving-cert\") pod \"controller-manager-66cf58567-vgs2d\" (UID: \"1ff77274-c0cf-4ef7-8a62-5ca93de936ac\") " pod="openshift-controller-manager/controller-manager-66cf58567-vgs2d" Mar 14 08:35:28 crc kubenswrapper[4886]: I0314 08:35:28.962345 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ff77274-c0cf-4ef7-8a62-5ca93de936ac-config\") pod \"controller-manager-66cf58567-vgs2d\" (UID: \"1ff77274-c0cf-4ef7-8a62-5ca93de936ac\") " pod="openshift-controller-manager/controller-manager-66cf58567-vgs2d" Mar 14 08:35:29 crc kubenswrapper[4886]: I0314 08:35:29.063363 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5qdp\" (UniqueName: 
\"kubernetes.io/projected/1ff77274-c0cf-4ef7-8a62-5ca93de936ac-kube-api-access-v5qdp\") pod \"controller-manager-66cf58567-vgs2d\" (UID: \"1ff77274-c0cf-4ef7-8a62-5ca93de936ac\") " pod="openshift-controller-manager/controller-manager-66cf58567-vgs2d" Mar 14 08:35:29 crc kubenswrapper[4886]: I0314 08:35:29.063463 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1ff77274-c0cf-4ef7-8a62-5ca93de936ac-proxy-ca-bundles\") pod \"controller-manager-66cf58567-vgs2d\" (UID: \"1ff77274-c0cf-4ef7-8a62-5ca93de936ac\") " pod="openshift-controller-manager/controller-manager-66cf58567-vgs2d" Mar 14 08:35:29 crc kubenswrapper[4886]: I0314 08:35:29.063499 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ff77274-c0cf-4ef7-8a62-5ca93de936ac-client-ca\") pod \"controller-manager-66cf58567-vgs2d\" (UID: \"1ff77274-c0cf-4ef7-8a62-5ca93de936ac\") " pod="openshift-controller-manager/controller-manager-66cf58567-vgs2d" Mar 14 08:35:29 crc kubenswrapper[4886]: I0314 08:35:29.063543 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ff77274-c0cf-4ef7-8a62-5ca93de936ac-serving-cert\") pod \"controller-manager-66cf58567-vgs2d\" (UID: \"1ff77274-c0cf-4ef7-8a62-5ca93de936ac\") " pod="openshift-controller-manager/controller-manager-66cf58567-vgs2d" Mar 14 08:35:29 crc kubenswrapper[4886]: I0314 08:35:29.063570 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ff77274-c0cf-4ef7-8a62-5ca93de936ac-config\") pod \"controller-manager-66cf58567-vgs2d\" (UID: \"1ff77274-c0cf-4ef7-8a62-5ca93de936ac\") " pod="openshift-controller-manager/controller-manager-66cf58567-vgs2d" Mar 14 08:35:29 crc kubenswrapper[4886]: I0314 08:35:29.064678 4886 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ff77274-c0cf-4ef7-8a62-5ca93de936ac-client-ca\") pod \"controller-manager-66cf58567-vgs2d\" (UID: \"1ff77274-c0cf-4ef7-8a62-5ca93de936ac\") " pod="openshift-controller-manager/controller-manager-66cf58567-vgs2d" Mar 14 08:35:29 crc kubenswrapper[4886]: I0314 08:35:29.064813 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1ff77274-c0cf-4ef7-8a62-5ca93de936ac-proxy-ca-bundles\") pod \"controller-manager-66cf58567-vgs2d\" (UID: \"1ff77274-c0cf-4ef7-8a62-5ca93de936ac\") " pod="openshift-controller-manager/controller-manager-66cf58567-vgs2d" Mar 14 08:35:29 crc kubenswrapper[4886]: I0314 08:35:29.065194 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ff77274-c0cf-4ef7-8a62-5ca93de936ac-config\") pod \"controller-manager-66cf58567-vgs2d\" (UID: \"1ff77274-c0cf-4ef7-8a62-5ca93de936ac\") " pod="openshift-controller-manager/controller-manager-66cf58567-vgs2d" Mar 14 08:35:29 crc kubenswrapper[4886]: I0314 08:35:29.069295 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ff77274-c0cf-4ef7-8a62-5ca93de936ac-serving-cert\") pod \"controller-manager-66cf58567-vgs2d\" (UID: \"1ff77274-c0cf-4ef7-8a62-5ca93de936ac\") " pod="openshift-controller-manager/controller-manager-66cf58567-vgs2d" Mar 14 08:35:29 crc kubenswrapper[4886]: I0314 08:35:29.082958 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5qdp\" (UniqueName: \"kubernetes.io/projected/1ff77274-c0cf-4ef7-8a62-5ca93de936ac-kube-api-access-v5qdp\") pod \"controller-manager-66cf58567-vgs2d\" (UID: \"1ff77274-c0cf-4ef7-8a62-5ca93de936ac\") " pod="openshift-controller-manager/controller-manager-66cf58567-vgs2d" Mar 14 08:35:29 crc kubenswrapper[4886]: I0314 
08:35:29.220276 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66cf58567-vgs2d" Mar 14 08:35:29 crc kubenswrapper[4886]: I0314 08:35:29.427793 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d3c670e-9026-4f83-8bfc-1c42774c7851" path="/var/lib/kubelet/pods/3d3c670e-9026-4f83-8bfc-1c42774c7851/volumes" Mar 14 08:35:29 crc kubenswrapper[4886]: I0314 08:35:29.493519 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66cf58567-vgs2d"] Mar 14 08:35:29 crc kubenswrapper[4886]: I0314 08:35:29.788525 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66cf58567-vgs2d" event={"ID":"1ff77274-c0cf-4ef7-8a62-5ca93de936ac","Type":"ContainerStarted","Data":"7c3da2b0eab01ea67b90481ca598cfba372844c636b5d2e93c44dedc33abb11c"} Mar 14 08:35:29 crc kubenswrapper[4886]: I0314 08:35:29.788932 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-66cf58567-vgs2d" Mar 14 08:35:29 crc kubenswrapper[4886]: I0314 08:35:29.788945 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66cf58567-vgs2d" event={"ID":"1ff77274-c0cf-4ef7-8a62-5ca93de936ac","Type":"ContainerStarted","Data":"a97fee4f79ff79530191f90cf3eff4fbf8c70a24b7d087c238f5117057b95699"} Mar 14 08:35:29 crc kubenswrapper[4886]: I0314 08:35:29.796045 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-66cf58567-vgs2d" Mar 14 08:35:29 crc kubenswrapper[4886]: I0314 08:35:29.814941 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-66cf58567-vgs2d" podStartSLOduration=2.814908073 podStartE2EDuration="2.814908073s" podCreationTimestamp="2026-03-14 
08:35:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:35:29.807722654 +0000 UTC m=+465.056174291" watchObservedRunningTime="2026-03-14 08:35:29.814908073 +0000 UTC m=+465.063359720" Mar 14 08:36:00 crc kubenswrapper[4886]: I0314 08:36:00.149924 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557956-rckf9"] Mar 14 08:36:00 crc kubenswrapper[4886]: I0314 08:36:00.152148 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557956-rckf9" Mar 14 08:36:00 crc kubenswrapper[4886]: I0314 08:36:00.155762 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:36:00 crc kubenswrapper[4886]: I0314 08:36:00.156287 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:36:00 crc kubenswrapper[4886]: I0314 08:36:00.156441 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 08:36:00 crc kubenswrapper[4886]: I0314 08:36:00.165197 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557956-rckf9"] Mar 14 08:36:00 crc kubenswrapper[4886]: I0314 08:36:00.320159 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhtp4\" (UniqueName: \"kubernetes.io/projected/ee3db8c9-7bdc-438d-b2cc-a5b5c43624ef-kube-api-access-jhtp4\") pod \"auto-csr-approver-29557956-rckf9\" (UID: \"ee3db8c9-7bdc-438d-b2cc-a5b5c43624ef\") " pod="openshift-infra/auto-csr-approver-29557956-rckf9" Mar 14 08:36:00 crc kubenswrapper[4886]: I0314 08:36:00.421111 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhtp4\" (UniqueName: 
\"kubernetes.io/projected/ee3db8c9-7bdc-438d-b2cc-a5b5c43624ef-kube-api-access-jhtp4\") pod \"auto-csr-approver-29557956-rckf9\" (UID: \"ee3db8c9-7bdc-438d-b2cc-a5b5c43624ef\") " pod="openshift-infra/auto-csr-approver-29557956-rckf9" Mar 14 08:36:00 crc kubenswrapper[4886]: I0314 08:36:00.446671 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhtp4\" (UniqueName: \"kubernetes.io/projected/ee3db8c9-7bdc-438d-b2cc-a5b5c43624ef-kube-api-access-jhtp4\") pod \"auto-csr-approver-29557956-rckf9\" (UID: \"ee3db8c9-7bdc-438d-b2cc-a5b5c43624ef\") " pod="openshift-infra/auto-csr-approver-29557956-rckf9" Mar 14 08:36:00 crc kubenswrapper[4886]: I0314 08:36:00.485954 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557956-rckf9" Mar 14 08:36:00 crc kubenswrapper[4886]: I0314 08:36:00.995359 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557956-rckf9"] Mar 14 08:36:01 crc kubenswrapper[4886]: I0314 08:36:01.993680 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557956-rckf9" event={"ID":"ee3db8c9-7bdc-438d-b2cc-a5b5c43624ef","Type":"ContainerStarted","Data":"ec1df00f9f87c6466a70b59308e8ca2a9dfb02da5c6bd79bf706d464dda15f86"} Mar 14 08:36:03 crc kubenswrapper[4886]: I0314 08:36:03.003344 4886 generic.go:334] "Generic (PLEG): container finished" podID="ee3db8c9-7bdc-438d-b2cc-a5b5c43624ef" containerID="21611403d80763832725c4e747af0ad04200c7a1b805a1c99549755ac18830c4" exitCode=0 Mar 14 08:36:03 crc kubenswrapper[4886]: I0314 08:36:03.003457 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557956-rckf9" event={"ID":"ee3db8c9-7bdc-438d-b2cc-a5b5c43624ef","Type":"ContainerDied","Data":"21611403d80763832725c4e747af0ad04200c7a1b805a1c99549755ac18830c4"} Mar 14 08:36:04 crc kubenswrapper[4886]: I0314 08:36:04.422518 4886 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557956-rckf9" Mar 14 08:36:04 crc kubenswrapper[4886]: I0314 08:36:04.575869 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhtp4\" (UniqueName: \"kubernetes.io/projected/ee3db8c9-7bdc-438d-b2cc-a5b5c43624ef-kube-api-access-jhtp4\") pod \"ee3db8c9-7bdc-438d-b2cc-a5b5c43624ef\" (UID: \"ee3db8c9-7bdc-438d-b2cc-a5b5c43624ef\") " Mar 14 08:36:04 crc kubenswrapper[4886]: I0314 08:36:04.584382 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee3db8c9-7bdc-438d-b2cc-a5b5c43624ef-kube-api-access-jhtp4" (OuterVolumeSpecName: "kube-api-access-jhtp4") pod "ee3db8c9-7bdc-438d-b2cc-a5b5c43624ef" (UID: "ee3db8c9-7bdc-438d-b2cc-a5b5c43624ef"). InnerVolumeSpecName "kube-api-access-jhtp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:36:04 crc kubenswrapper[4886]: I0314 08:36:04.678742 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhtp4\" (UniqueName: \"kubernetes.io/projected/ee3db8c9-7bdc-438d-b2cc-a5b5c43624ef-kube-api-access-jhtp4\") on node \"crc\" DevicePath \"\"" Mar 14 08:36:05 crc kubenswrapper[4886]: I0314 08:36:05.019826 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557956-rckf9" event={"ID":"ee3db8c9-7bdc-438d-b2cc-a5b5c43624ef","Type":"ContainerDied","Data":"ec1df00f9f87c6466a70b59308e8ca2a9dfb02da5c6bd79bf706d464dda15f86"} Mar 14 08:36:05 crc kubenswrapper[4886]: I0314 08:36:05.019889 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec1df00f9f87c6466a70b59308e8ca2a9dfb02da5c6bd79bf706d464dda15f86" Mar 14 08:36:05 crc kubenswrapper[4886]: I0314 08:36:05.019893 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557956-rckf9" Mar 14 08:36:05 crc kubenswrapper[4886]: I0314 08:36:05.490506 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557950-fd7np"] Mar 14 08:36:05 crc kubenswrapper[4886]: I0314 08:36:05.494323 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557950-fd7np"] Mar 14 08:36:07 crc kubenswrapper[4886]: I0314 08:36:07.427175 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="266a4a60-8fb5-4685-b4ac-621f93829611" path="/var/lib/kubelet/pods/266a4a60-8fb5-4685-b4ac-621f93829611/volumes" Mar 14 08:37:26 crc kubenswrapper[4886]: I0314 08:37:26.066489 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:37:26 crc kubenswrapper[4886]: I0314 08:37:26.067029 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:37:56 crc kubenswrapper[4886]: I0314 08:37:56.066107 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:37:56 crc kubenswrapper[4886]: I0314 08:37:56.066795 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" 
podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:38:00 crc kubenswrapper[4886]: I0314 08:38:00.132243 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557958-8p7xl"] Mar 14 08:38:00 crc kubenswrapper[4886]: E0314 08:38:00.132845 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee3db8c9-7bdc-438d-b2cc-a5b5c43624ef" containerName="oc" Mar 14 08:38:00 crc kubenswrapper[4886]: I0314 08:38:00.132861 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee3db8c9-7bdc-438d-b2cc-a5b5c43624ef" containerName="oc" Mar 14 08:38:00 crc kubenswrapper[4886]: I0314 08:38:00.132989 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee3db8c9-7bdc-438d-b2cc-a5b5c43624ef" containerName="oc" Mar 14 08:38:00 crc kubenswrapper[4886]: I0314 08:38:00.133446 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557958-8p7xl" Mar 14 08:38:00 crc kubenswrapper[4886]: I0314 08:38:00.135295 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557958-8p7xl"] Mar 14 08:38:00 crc kubenswrapper[4886]: I0314 08:38:00.136705 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:38:00 crc kubenswrapper[4886]: I0314 08:38:00.136950 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:38:00 crc kubenswrapper[4886]: I0314 08:38:00.137134 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 08:38:00 crc kubenswrapper[4886]: I0314 08:38:00.250075 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7gvc\" (UniqueName: \"kubernetes.io/projected/553eb1d2-6e3f-4bb9-9440-33eda99f5ec4-kube-api-access-m7gvc\") pod \"auto-csr-approver-29557958-8p7xl\" (UID: \"553eb1d2-6e3f-4bb9-9440-33eda99f5ec4\") " pod="openshift-infra/auto-csr-approver-29557958-8p7xl" Mar 14 08:38:00 crc kubenswrapper[4886]: I0314 08:38:00.351077 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7gvc\" (UniqueName: \"kubernetes.io/projected/553eb1d2-6e3f-4bb9-9440-33eda99f5ec4-kube-api-access-m7gvc\") pod \"auto-csr-approver-29557958-8p7xl\" (UID: \"553eb1d2-6e3f-4bb9-9440-33eda99f5ec4\") " pod="openshift-infra/auto-csr-approver-29557958-8p7xl" Mar 14 08:38:00 crc kubenswrapper[4886]: I0314 08:38:00.375193 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7gvc\" (UniqueName: \"kubernetes.io/projected/553eb1d2-6e3f-4bb9-9440-33eda99f5ec4-kube-api-access-m7gvc\") pod \"auto-csr-approver-29557958-8p7xl\" (UID: \"553eb1d2-6e3f-4bb9-9440-33eda99f5ec4\") " 
pod="openshift-infra/auto-csr-approver-29557958-8p7xl" Mar 14 08:38:00 crc kubenswrapper[4886]: I0314 08:38:00.452414 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557958-8p7xl" Mar 14 08:38:00 crc kubenswrapper[4886]: I0314 08:38:00.671209 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557958-8p7xl"] Mar 14 08:38:00 crc kubenswrapper[4886]: W0314 08:38:00.676603 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod553eb1d2_6e3f_4bb9_9440_33eda99f5ec4.slice/crio-ed514beed7ce4ed828f46b3c766c585d3941102e39fceb14a3b886ccc0c61f72 WatchSource:0}: Error finding container ed514beed7ce4ed828f46b3c766c585d3941102e39fceb14a3b886ccc0c61f72: Status 404 returned error can't find the container with id ed514beed7ce4ed828f46b3c766c585d3941102e39fceb14a3b886ccc0c61f72 Mar 14 08:38:00 crc kubenswrapper[4886]: I0314 08:38:00.679087 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 08:38:00 crc kubenswrapper[4886]: I0314 08:38:00.740598 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557958-8p7xl" event={"ID":"553eb1d2-6e3f-4bb9-9440-33eda99f5ec4","Type":"ContainerStarted","Data":"ed514beed7ce4ed828f46b3c766c585d3941102e39fceb14a3b886ccc0c61f72"} Mar 14 08:38:02 crc kubenswrapper[4886]: I0314 08:38:02.754906 4886 generic.go:334] "Generic (PLEG): container finished" podID="553eb1d2-6e3f-4bb9-9440-33eda99f5ec4" containerID="aa80faec0fa261e4d6eb9c6f15c0dc7ac436eaf928f475ac030fe28cdd944b3f" exitCode=0 Mar 14 08:38:02 crc kubenswrapper[4886]: I0314 08:38:02.755008 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557958-8p7xl" 
event={"ID":"553eb1d2-6e3f-4bb9-9440-33eda99f5ec4","Type":"ContainerDied","Data":"aa80faec0fa261e4d6eb9c6f15c0dc7ac436eaf928f475ac030fe28cdd944b3f"} Mar 14 08:38:04 crc kubenswrapper[4886]: I0314 08:38:04.079078 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557958-8p7xl" Mar 14 08:38:04 crc kubenswrapper[4886]: I0314 08:38:04.207089 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7gvc\" (UniqueName: \"kubernetes.io/projected/553eb1d2-6e3f-4bb9-9440-33eda99f5ec4-kube-api-access-m7gvc\") pod \"553eb1d2-6e3f-4bb9-9440-33eda99f5ec4\" (UID: \"553eb1d2-6e3f-4bb9-9440-33eda99f5ec4\") " Mar 14 08:38:04 crc kubenswrapper[4886]: I0314 08:38:04.217625 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/553eb1d2-6e3f-4bb9-9440-33eda99f5ec4-kube-api-access-m7gvc" (OuterVolumeSpecName: "kube-api-access-m7gvc") pod "553eb1d2-6e3f-4bb9-9440-33eda99f5ec4" (UID: "553eb1d2-6e3f-4bb9-9440-33eda99f5ec4"). InnerVolumeSpecName "kube-api-access-m7gvc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:38:04 crc kubenswrapper[4886]: I0314 08:38:04.308789 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7gvc\" (UniqueName: \"kubernetes.io/projected/553eb1d2-6e3f-4bb9-9440-33eda99f5ec4-kube-api-access-m7gvc\") on node \"crc\" DevicePath \"\"" Mar 14 08:38:04 crc kubenswrapper[4886]: I0314 08:38:04.770953 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557958-8p7xl" event={"ID":"553eb1d2-6e3f-4bb9-9440-33eda99f5ec4","Type":"ContainerDied","Data":"ed514beed7ce4ed828f46b3c766c585d3941102e39fceb14a3b886ccc0c61f72"} Mar 14 08:38:04 crc kubenswrapper[4886]: I0314 08:38:04.771005 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed514beed7ce4ed828f46b3c766c585d3941102e39fceb14a3b886ccc0c61f72" Mar 14 08:38:04 crc kubenswrapper[4886]: I0314 08:38:04.771038 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557958-8p7xl" Mar 14 08:38:05 crc kubenswrapper[4886]: I0314 08:38:05.155094 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557952-fzb8n"] Mar 14 08:38:05 crc kubenswrapper[4886]: I0314 08:38:05.158826 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557952-fzb8n"] Mar 14 08:38:05 crc kubenswrapper[4886]: I0314 08:38:05.433388 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2d7e1cf-cd9a-4bf9-8f6b-1fccc137d1f9" path="/var/lib/kubelet/pods/e2d7e1cf-cd9a-4bf9-8f6b-1fccc137d1f9/volumes" Mar 14 08:38:26 crc kubenswrapper[4886]: I0314 08:38:26.066762 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 14 08:38:26 crc kubenswrapper[4886]: I0314 08:38:26.067411 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:38:26 crc kubenswrapper[4886]: I0314 08:38:26.067475 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 08:38:26 crc kubenswrapper[4886]: I0314 08:38:26.068204 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b6fcb3fc79e936b9bc2b8e43a35854890b02c11cd8b127b894daab5c52af2a2e"} pod="openshift-machine-config-operator/machine-config-daemon-ddctv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 08:38:26 crc kubenswrapper[4886]: I0314 08:38:26.068378 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" containerID="cri-o://b6fcb3fc79e936b9bc2b8e43a35854890b02c11cd8b127b894daab5c52af2a2e" gracePeriod=600 Mar 14 08:38:26 crc kubenswrapper[4886]: I0314 08:38:26.964981 4886 generic.go:334] "Generic (PLEG): container finished" podID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerID="b6fcb3fc79e936b9bc2b8e43a35854890b02c11cd8b127b894daab5c52af2a2e" exitCode=0 Mar 14 08:38:26 crc kubenswrapper[4886]: I0314 08:38:26.965065 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" 
event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerDied","Data":"b6fcb3fc79e936b9bc2b8e43a35854890b02c11cd8b127b894daab5c52af2a2e"} Mar 14 08:38:26 crc kubenswrapper[4886]: I0314 08:38:26.965396 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerStarted","Data":"f6954217ba64ea552ccd815951f515123e9e534013eb3eb6220b2286262b3047"} Mar 14 08:38:26 crc kubenswrapper[4886]: I0314 08:38:26.965428 4886 scope.go:117] "RemoveContainer" containerID="424638e056b0f6bf732f3fa83ff43260b56297a10248482736a5bafa61119a1d" Mar 14 08:39:08 crc kubenswrapper[4886]: I0314 08:39:08.345646 4886 scope.go:117] "RemoveContainer" containerID="4e58c47bdef3c8efcdad128728f90a3a67cbe7a1ed838239d371cf9fcf1740b6" Mar 14 08:39:08 crc kubenswrapper[4886]: I0314 08:39:08.391802 4886 scope.go:117] "RemoveContainer" containerID="91db4c5db896d5d251e855e38cd92bd06ffbb91e495f2340eb566f133910aeed" Mar 14 08:39:08 crc kubenswrapper[4886]: I0314 08:39:08.409503 4886 scope.go:117] "RemoveContainer" containerID="fc2c9d407431805b749a61dec6c02dc9adbdbaeca0ebcaca31a7ba39a5980b81" Mar 14 08:39:08 crc kubenswrapper[4886]: I0314 08:39:08.445066 4886 scope.go:117] "RemoveContainer" containerID="ee4da1d0123e2e993efd5e9d103b57de872c1411580c838c2cd361e78267f00b" Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.181806 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-r6ndz"] Mar 14 08:39:17 crc kubenswrapper[4886]: E0314 08:39:17.183411 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="553eb1d2-6e3f-4bb9-9440-33eda99f5ec4" containerName="oc" Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.183481 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="553eb1d2-6e3f-4bb9-9440-33eda99f5ec4" containerName="oc" Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.183614 4886 
memory_manager.go:354] "RemoveStaleState removing state" podUID="553eb1d2-6e3f-4bb9-9440-33eda99f5ec4" containerName="oc" Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.184042 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.194262 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-r6ndz"] Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.261562 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/990999f1-c984-45d5-908a-5054f8f10304-registry-tls\") pod \"image-registry-66df7c8f76-r6ndz\" (UID: \"990999f1-c984-45d5-908a-5054f8f10304\") " pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.261622 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/990999f1-c984-45d5-908a-5054f8f10304-bound-sa-token\") pod \"image-registry-66df7c8f76-r6ndz\" (UID: \"990999f1-c984-45d5-908a-5054f8f10304\") " pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.261649 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/990999f1-c984-45d5-908a-5054f8f10304-installation-pull-secrets\") pod \"image-registry-66df7c8f76-r6ndz\" (UID: \"990999f1-c984-45d5-908a-5054f8f10304\") " pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.261682 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4sv4\" 
(UniqueName: \"kubernetes.io/projected/990999f1-c984-45d5-908a-5054f8f10304-kube-api-access-c4sv4\") pod \"image-registry-66df7c8f76-r6ndz\" (UID: \"990999f1-c984-45d5-908a-5054f8f10304\") " pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.261736 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/990999f1-c984-45d5-908a-5054f8f10304-registry-certificates\") pod \"image-registry-66df7c8f76-r6ndz\" (UID: \"990999f1-c984-45d5-908a-5054f8f10304\") " pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.261770 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/990999f1-c984-45d5-908a-5054f8f10304-trusted-ca\") pod \"image-registry-66df7c8f76-r6ndz\" (UID: \"990999f1-c984-45d5-908a-5054f8f10304\") " pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.261796 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/990999f1-c984-45d5-908a-5054f8f10304-ca-trust-extracted\") pod \"image-registry-66df7c8f76-r6ndz\" (UID: \"990999f1-c984-45d5-908a-5054f8f10304\") " pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.261826 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-r6ndz\" (UID: \"990999f1-c984-45d5-908a-5054f8f10304\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.288889 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-r6ndz\" (UID: \"990999f1-c984-45d5-908a-5054f8f10304\") " pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.363387 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4sv4\" (UniqueName: \"kubernetes.io/projected/990999f1-c984-45d5-908a-5054f8f10304-kube-api-access-c4sv4\") pod \"image-registry-66df7c8f76-r6ndz\" (UID: \"990999f1-c984-45d5-908a-5054f8f10304\") " pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.363609 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/990999f1-c984-45d5-908a-5054f8f10304-registry-certificates\") pod \"image-registry-66df7c8f76-r6ndz\" (UID: \"990999f1-c984-45d5-908a-5054f8f10304\") " pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.363761 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/990999f1-c984-45d5-908a-5054f8f10304-trusted-ca\") pod \"image-registry-66df7c8f76-r6ndz\" (UID: \"990999f1-c984-45d5-908a-5054f8f10304\") " pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.363895 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/990999f1-c984-45d5-908a-5054f8f10304-ca-trust-extracted\") pod \"image-registry-66df7c8f76-r6ndz\" (UID: \"990999f1-c984-45d5-908a-5054f8f10304\") " pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.364086 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/990999f1-c984-45d5-908a-5054f8f10304-registry-tls\") pod \"image-registry-66df7c8f76-r6ndz\" (UID: \"990999f1-c984-45d5-908a-5054f8f10304\") " pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.364220 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/990999f1-c984-45d5-908a-5054f8f10304-bound-sa-token\") pod \"image-registry-66df7c8f76-r6ndz\" (UID: \"990999f1-c984-45d5-908a-5054f8f10304\") " pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.364348 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/990999f1-c984-45d5-908a-5054f8f10304-installation-pull-secrets\") pod \"image-registry-66df7c8f76-r6ndz\" (UID: \"990999f1-c984-45d5-908a-5054f8f10304\") " pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.364220 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/990999f1-c984-45d5-908a-5054f8f10304-ca-trust-extracted\") pod \"image-registry-66df7c8f76-r6ndz\" (UID: \"990999f1-c984-45d5-908a-5054f8f10304\") " pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.365006 4886 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/990999f1-c984-45d5-908a-5054f8f10304-registry-certificates\") pod \"image-registry-66df7c8f76-r6ndz\" (UID: \"990999f1-c984-45d5-908a-5054f8f10304\") " pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.365228 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/990999f1-c984-45d5-908a-5054f8f10304-trusted-ca\") pod \"image-registry-66df7c8f76-r6ndz\" (UID: \"990999f1-c984-45d5-908a-5054f8f10304\") " pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.369613 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/990999f1-c984-45d5-908a-5054f8f10304-installation-pull-secrets\") pod \"image-registry-66df7c8f76-r6ndz\" (UID: \"990999f1-c984-45d5-908a-5054f8f10304\") " pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.369659 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/990999f1-c984-45d5-908a-5054f8f10304-registry-tls\") pod \"image-registry-66df7c8f76-r6ndz\" (UID: \"990999f1-c984-45d5-908a-5054f8f10304\") " pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.377782 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/990999f1-c984-45d5-908a-5054f8f10304-bound-sa-token\") pod \"image-registry-66df7c8f76-r6ndz\" (UID: \"990999f1-c984-45d5-908a-5054f8f10304\") " pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.380725 
4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4sv4\" (UniqueName: \"kubernetes.io/projected/990999f1-c984-45d5-908a-5054f8f10304-kube-api-access-c4sv4\") pod \"image-registry-66df7c8f76-r6ndz\" (UID: \"990999f1-c984-45d5-908a-5054f8f10304\") " pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.504269 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" Mar 14 08:39:17 crc kubenswrapper[4886]: I0314 08:39:17.722101 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-r6ndz"] Mar 14 08:39:18 crc kubenswrapper[4886]: I0314 08:39:18.290986 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" event={"ID":"990999f1-c984-45d5-908a-5054f8f10304","Type":"ContainerStarted","Data":"e35f92b5579f3f44134b7256b12bca10556fb41abf10ee436a0064ead0806e89"} Mar 14 08:39:18 crc kubenswrapper[4886]: I0314 08:39:18.291028 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" event={"ID":"990999f1-c984-45d5-908a-5054f8f10304","Type":"ContainerStarted","Data":"bd5804afb1534308f8686fc3eca3322384bded9278d0fe2206d6286f6ea3d919"} Mar 14 08:39:18 crc kubenswrapper[4886]: I0314 08:39:18.291133 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" Mar 14 08:39:18 crc kubenswrapper[4886]: I0314 08:39:18.320793 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" podStartSLOduration=1.320770557 podStartE2EDuration="1.320770557s" podCreationTimestamp="2026-03-14 08:39:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:39:18.314914837 +0000 UTC m=+693.563366484" watchObservedRunningTime="2026-03-14 08:39:18.320770557 +0000 UTC m=+693.569222224" Mar 14 08:39:37 crc kubenswrapper[4886]: I0314 08:39:37.511027 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-r6ndz" Mar 14 08:39:37 crc kubenswrapper[4886]: I0314 08:39:37.581894 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wscrd"] Mar 14 08:40:00 crc kubenswrapper[4886]: I0314 08:40:00.128063 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557960-gjsxp"] Mar 14 08:40:00 crc kubenswrapper[4886]: I0314 08:40:00.130011 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557960-gjsxp" Mar 14 08:40:00 crc kubenswrapper[4886]: I0314 08:40:00.132740 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:40:00 crc kubenswrapper[4886]: I0314 08:40:00.132803 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:40:00 crc kubenswrapper[4886]: I0314 08:40:00.132973 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 08:40:00 crc kubenswrapper[4886]: I0314 08:40:00.133835 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557960-gjsxp"] Mar 14 08:40:00 crc kubenswrapper[4886]: I0314 08:40:00.171642 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcdnd\" (UniqueName: \"kubernetes.io/projected/9be99ded-dcd2-4773-902b-10b10955e202-kube-api-access-lcdnd\") pod \"auto-csr-approver-29557960-gjsxp\" 
(UID: \"9be99ded-dcd2-4773-902b-10b10955e202\") " pod="openshift-infra/auto-csr-approver-29557960-gjsxp" Mar 14 08:40:00 crc kubenswrapper[4886]: I0314 08:40:00.272530 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcdnd\" (UniqueName: \"kubernetes.io/projected/9be99ded-dcd2-4773-902b-10b10955e202-kube-api-access-lcdnd\") pod \"auto-csr-approver-29557960-gjsxp\" (UID: \"9be99ded-dcd2-4773-902b-10b10955e202\") " pod="openshift-infra/auto-csr-approver-29557960-gjsxp" Mar 14 08:40:00 crc kubenswrapper[4886]: I0314 08:40:00.294179 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcdnd\" (UniqueName: \"kubernetes.io/projected/9be99ded-dcd2-4773-902b-10b10955e202-kube-api-access-lcdnd\") pod \"auto-csr-approver-29557960-gjsxp\" (UID: \"9be99ded-dcd2-4773-902b-10b10955e202\") " pod="openshift-infra/auto-csr-approver-29557960-gjsxp" Mar 14 08:40:00 crc kubenswrapper[4886]: I0314 08:40:00.447739 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557960-gjsxp" Mar 14 08:40:00 crc kubenswrapper[4886]: I0314 08:40:00.865803 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557960-gjsxp"] Mar 14 08:40:01 crc kubenswrapper[4886]: I0314 08:40:01.540910 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557960-gjsxp" event={"ID":"9be99ded-dcd2-4773-902b-10b10955e202","Type":"ContainerStarted","Data":"ed3e17d5e5193d30265881acbb17b02e203ee8c8efcc845790206242d28aeb04"} Mar 14 08:40:02 crc kubenswrapper[4886]: I0314 08:40:02.547421 4886 generic.go:334] "Generic (PLEG): container finished" podID="9be99ded-dcd2-4773-902b-10b10955e202" containerID="8d3591afc96490593c6cc94d08d44608aa6a7bfd61ac0a75838464a9c63dcd54" exitCode=0 Mar 14 08:40:02 crc kubenswrapper[4886]: I0314 08:40:02.547499 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557960-gjsxp" event={"ID":"9be99ded-dcd2-4773-902b-10b10955e202","Type":"ContainerDied","Data":"8d3591afc96490593c6cc94d08d44608aa6a7bfd61ac0a75838464a9c63dcd54"} Mar 14 08:40:02 crc kubenswrapper[4886]: I0314 08:40:02.627850 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" podUID="cfbeb8db-2612-468d-8354-32ee6373f57e" containerName="registry" containerID="cri-o://f5ce5db1086d5a69af2c14f50e734cb03f96d132c43af7acf7f2195aead66915" gracePeriod=30 Mar 14 08:40:02 crc kubenswrapper[4886]: I0314 08:40:02.975473 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.111725 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cfbeb8db-2612-468d-8354-32ee6373f57e-registry-tls\") pod \"cfbeb8db-2612-468d-8354-32ee6373f57e\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.111794 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfbeb8db-2612-468d-8354-32ee6373f57e-bound-sa-token\") pod \"cfbeb8db-2612-468d-8354-32ee6373f57e\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.111988 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"cfbeb8db-2612-468d-8354-32ee6373f57e\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.112060 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cfbeb8db-2612-468d-8354-32ee6373f57e-ca-trust-extracted\") pod \"cfbeb8db-2612-468d-8354-32ee6373f57e\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.112164 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfbeb8db-2612-468d-8354-32ee6373f57e-trusted-ca\") pod \"cfbeb8db-2612-468d-8354-32ee6373f57e\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.112208 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cfbeb8db-2612-468d-8354-32ee6373f57e-installation-pull-secrets\") pod \"cfbeb8db-2612-468d-8354-32ee6373f57e\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.112247 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cfbeb8db-2612-468d-8354-32ee6373f57e-registry-certificates\") pod \"cfbeb8db-2612-468d-8354-32ee6373f57e\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.112282 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h2dz\" (UniqueName: \"kubernetes.io/projected/cfbeb8db-2612-468d-8354-32ee6373f57e-kube-api-access-7h2dz\") pod \"cfbeb8db-2612-468d-8354-32ee6373f57e\" (UID: \"cfbeb8db-2612-468d-8354-32ee6373f57e\") " Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.112785 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfbeb8db-2612-468d-8354-32ee6373f57e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "cfbeb8db-2612-468d-8354-32ee6373f57e" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.112972 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfbeb8db-2612-468d-8354-32ee6373f57e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "cfbeb8db-2612-468d-8354-32ee6373f57e" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.113239 4886 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cfbeb8db-2612-468d-8354-32ee6373f57e-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.113444 4886 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfbeb8db-2612-468d-8354-32ee6373f57e-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.121273 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfbeb8db-2612-468d-8354-32ee6373f57e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "cfbeb8db-2612-468d-8354-32ee6373f57e" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.121283 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfbeb8db-2612-468d-8354-32ee6373f57e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "cfbeb8db-2612-468d-8354-32ee6373f57e" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.121337 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfbeb8db-2612-468d-8354-32ee6373f57e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "cfbeb8db-2612-468d-8354-32ee6373f57e" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.125649 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "cfbeb8db-2612-468d-8354-32ee6373f57e" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.126209 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfbeb8db-2612-468d-8354-32ee6373f57e-kube-api-access-7h2dz" (OuterVolumeSpecName: "kube-api-access-7h2dz") pod "cfbeb8db-2612-468d-8354-32ee6373f57e" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e"). InnerVolumeSpecName "kube-api-access-7h2dz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.128545 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfbeb8db-2612-468d-8354-32ee6373f57e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "cfbeb8db-2612-468d-8354-32ee6373f57e" (UID: "cfbeb8db-2612-468d-8354-32ee6373f57e"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.213905 4886 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cfbeb8db-2612-468d-8354-32ee6373f57e-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.213935 4886 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfbeb8db-2612-468d-8354-32ee6373f57e-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.213944 4886 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cfbeb8db-2612-468d-8354-32ee6373f57e-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.213954 4886 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cfbeb8db-2612-468d-8354-32ee6373f57e-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.213964 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h2dz\" (UniqueName: \"kubernetes.io/projected/cfbeb8db-2612-468d-8354-32ee6373f57e-kube-api-access-7h2dz\") on node \"crc\" DevicePath \"\"" Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.559076 4886 generic.go:334] "Generic (PLEG): container finished" podID="cfbeb8db-2612-468d-8354-32ee6373f57e" containerID="f5ce5db1086d5a69af2c14f50e734cb03f96d132c43af7acf7f2195aead66915" exitCode=0 Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.559311 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.559980 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" event={"ID":"cfbeb8db-2612-468d-8354-32ee6373f57e","Type":"ContainerDied","Data":"f5ce5db1086d5a69af2c14f50e734cb03f96d132c43af7acf7f2195aead66915"} Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.560011 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wscrd" event={"ID":"cfbeb8db-2612-468d-8354-32ee6373f57e","Type":"ContainerDied","Data":"d3ee87034513ebbef9d3bc79bbd7eca0d02de1190860b50d3a519b11e7b9a62b"} Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.560030 4886 scope.go:117] "RemoveContainer" containerID="f5ce5db1086d5a69af2c14f50e734cb03f96d132c43af7acf7f2195aead66915" Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.581162 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wscrd"] Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.587367 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wscrd"] Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.594442 4886 scope.go:117] "RemoveContainer" containerID="f5ce5db1086d5a69af2c14f50e734cb03f96d132c43af7acf7f2195aead66915" Mar 14 08:40:03 crc kubenswrapper[4886]: E0314 08:40:03.594926 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5ce5db1086d5a69af2c14f50e734cb03f96d132c43af7acf7f2195aead66915\": container with ID starting with f5ce5db1086d5a69af2c14f50e734cb03f96d132c43af7acf7f2195aead66915 not found: ID does not exist" containerID="f5ce5db1086d5a69af2c14f50e734cb03f96d132c43af7acf7f2195aead66915" Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.594963 
4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5ce5db1086d5a69af2c14f50e734cb03f96d132c43af7acf7f2195aead66915"} err="failed to get container status \"f5ce5db1086d5a69af2c14f50e734cb03f96d132c43af7acf7f2195aead66915\": rpc error: code = NotFound desc = could not find container \"f5ce5db1086d5a69af2c14f50e734cb03f96d132c43af7acf7f2195aead66915\": container with ID starting with f5ce5db1086d5a69af2c14f50e734cb03f96d132c43af7acf7f2195aead66915 not found: ID does not exist" Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.751422 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557960-gjsxp" Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.920546 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcdnd\" (UniqueName: \"kubernetes.io/projected/9be99ded-dcd2-4773-902b-10b10955e202-kube-api-access-lcdnd\") pod \"9be99ded-dcd2-4773-902b-10b10955e202\" (UID: \"9be99ded-dcd2-4773-902b-10b10955e202\") " Mar 14 08:40:03 crc kubenswrapper[4886]: I0314 08:40:03.924769 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9be99ded-dcd2-4773-902b-10b10955e202-kube-api-access-lcdnd" (OuterVolumeSpecName: "kube-api-access-lcdnd") pod "9be99ded-dcd2-4773-902b-10b10955e202" (UID: "9be99ded-dcd2-4773-902b-10b10955e202"). InnerVolumeSpecName "kube-api-access-lcdnd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:40:04 crc kubenswrapper[4886]: I0314 08:40:04.022113 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcdnd\" (UniqueName: \"kubernetes.io/projected/9be99ded-dcd2-4773-902b-10b10955e202-kube-api-access-lcdnd\") on node \"crc\" DevicePath \"\"" Mar 14 08:40:04 crc kubenswrapper[4886]: I0314 08:40:04.565685 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557960-gjsxp" event={"ID":"9be99ded-dcd2-4773-902b-10b10955e202","Type":"ContainerDied","Data":"ed3e17d5e5193d30265881acbb17b02e203ee8c8efcc845790206242d28aeb04"} Mar 14 08:40:04 crc kubenswrapper[4886]: I0314 08:40:04.565725 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed3e17d5e5193d30265881acbb17b02e203ee8c8efcc845790206242d28aeb04" Mar 14 08:40:04 crc kubenswrapper[4886]: I0314 08:40:04.565789 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557960-gjsxp" Mar 14 08:40:04 crc kubenswrapper[4886]: I0314 08:40:04.810900 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557954-jq2zf"] Mar 14 08:40:04 crc kubenswrapper[4886]: I0314 08:40:04.817662 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557954-jq2zf"] Mar 14 08:40:05 crc kubenswrapper[4886]: I0314 08:40:05.435668 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f2ea03d-a434-4da8-a1e1-18532cb7e0e8" path="/var/lib/kubelet/pods/4f2ea03d-a434-4da8-a1e1-18532cb7e0e8/volumes" Mar 14 08:40:05 crc kubenswrapper[4886]: I0314 08:40:05.437076 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfbeb8db-2612-468d-8354-32ee6373f57e" path="/var/lib/kubelet/pods/cfbeb8db-2612-468d-8354-32ee6373f57e/volumes" Mar 14 08:40:08 crc kubenswrapper[4886]: I0314 08:40:08.499513 4886 
scope.go:117] "RemoveContainer" containerID="60690f219ea99479fa8770d1fb6d2809f83423dbceee4a64502919be7a6ba2fd" Mar 14 08:40:26 crc kubenswrapper[4886]: I0314 08:40:26.066489 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:40:26 crc kubenswrapper[4886]: I0314 08:40:26.067656 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:40:56 crc kubenswrapper[4886]: I0314 08:40:56.066539 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:40:56 crc kubenswrapper[4886]: I0314 08:40:56.067299 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.171612 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-4gn2q"] Mar 14 08:41:15 crc kubenswrapper[4886]: E0314 08:41:15.172436 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfbeb8db-2612-468d-8354-32ee6373f57e" containerName="registry" Mar 14 08:41:15 crc 
kubenswrapper[4886]: I0314 08:41:15.172457 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfbeb8db-2612-468d-8354-32ee6373f57e" containerName="registry" Mar 14 08:41:15 crc kubenswrapper[4886]: E0314 08:41:15.172469 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9be99ded-dcd2-4773-902b-10b10955e202" containerName="oc" Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.172476 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="9be99ded-dcd2-4773-902b-10b10955e202" containerName="oc" Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.172593 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="9be99ded-dcd2-4773-902b-10b10955e202" containerName="oc" Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.172616 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfbeb8db-2612-468d-8354-32ee6373f57e" containerName="registry" Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.173069 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-4gn2q" Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.179639 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.179894 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.179920 4886 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-4tgb9" Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.187246 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-4gn2q"] Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.196446 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-jhphs"] Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.197292 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-jhphs" Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.206357 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-wzgzv"] Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.207174 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-wzgzv" Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.208909 4886 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-9jfgf" Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.209155 4886 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-dt764" Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.215511 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-jhphs"] Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.225979 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-wzgzv"] Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.268344 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz5p5\" (UniqueName: \"kubernetes.io/projected/dd81d1ee-7b0d-49e7-955b-3b48b78ed81d-kube-api-access-vz5p5\") pod \"cert-manager-webhook-687f57d79b-wzgzv\" (UID: \"dd81d1ee-7b0d-49e7-955b-3b48b78ed81d\") " pod="cert-manager/cert-manager-webhook-687f57d79b-wzgzv" Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.268409 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27sbj\" (UniqueName: \"kubernetes.io/projected/f2da9034-3cb3-453f-acba-8e2c65138035-kube-api-access-27sbj\") pod \"cert-manager-cainjector-cf98fcc89-4gn2q\" (UID: \"f2da9034-3cb3-453f-acba-8e2c65138035\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-4gn2q" Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.268475 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc9zj\" (UniqueName: \"kubernetes.io/projected/a94da143-3ce9-4783-9a37-51dc56105745-kube-api-access-cc9zj\") pod 
\"cert-manager-858654f9db-jhphs\" (UID: \"a94da143-3ce9-4783-9a37-51dc56105745\") " pod="cert-manager/cert-manager-858654f9db-jhphs" Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.369781 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc9zj\" (UniqueName: \"kubernetes.io/projected/a94da143-3ce9-4783-9a37-51dc56105745-kube-api-access-cc9zj\") pod \"cert-manager-858654f9db-jhphs\" (UID: \"a94da143-3ce9-4783-9a37-51dc56105745\") " pod="cert-manager/cert-manager-858654f9db-jhphs" Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.369851 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz5p5\" (UniqueName: \"kubernetes.io/projected/dd81d1ee-7b0d-49e7-955b-3b48b78ed81d-kube-api-access-vz5p5\") pod \"cert-manager-webhook-687f57d79b-wzgzv\" (UID: \"dd81d1ee-7b0d-49e7-955b-3b48b78ed81d\") " pod="cert-manager/cert-manager-webhook-687f57d79b-wzgzv" Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.369893 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27sbj\" (UniqueName: \"kubernetes.io/projected/f2da9034-3cb3-453f-acba-8e2c65138035-kube-api-access-27sbj\") pod \"cert-manager-cainjector-cf98fcc89-4gn2q\" (UID: \"f2da9034-3cb3-453f-acba-8e2c65138035\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-4gn2q" Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.389970 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc9zj\" (UniqueName: \"kubernetes.io/projected/a94da143-3ce9-4783-9a37-51dc56105745-kube-api-access-cc9zj\") pod \"cert-manager-858654f9db-jhphs\" (UID: \"a94da143-3ce9-4783-9a37-51dc56105745\") " pod="cert-manager/cert-manager-858654f9db-jhphs" Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.394815 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz5p5\" (UniqueName: 
\"kubernetes.io/projected/dd81d1ee-7b0d-49e7-955b-3b48b78ed81d-kube-api-access-vz5p5\") pod \"cert-manager-webhook-687f57d79b-wzgzv\" (UID: \"dd81d1ee-7b0d-49e7-955b-3b48b78ed81d\") " pod="cert-manager/cert-manager-webhook-687f57d79b-wzgzv" Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.395376 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27sbj\" (UniqueName: \"kubernetes.io/projected/f2da9034-3cb3-453f-acba-8e2c65138035-kube-api-access-27sbj\") pod \"cert-manager-cainjector-cf98fcc89-4gn2q\" (UID: \"f2da9034-3cb3-453f-acba-8e2c65138035\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-4gn2q" Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.490712 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-4gn2q" Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.519787 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-jhphs" Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.530875 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-wzgzv" Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.759696 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-jhphs"] Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.892338 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-4gn2q"] Mar 14 08:41:15 crc kubenswrapper[4886]: I0314 08:41:15.991309 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-4gn2q" event={"ID":"f2da9034-3cb3-453f-acba-8e2c65138035","Type":"ContainerStarted","Data":"424db92b22c26c79fd3dfcfd33bcbdd558c4f8dc3d4ba8dfe85762c6074ad5d2"} Mar 14 08:41:16 crc kubenswrapper[4886]: I0314 08:41:16.000061 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-jhphs" event={"ID":"a94da143-3ce9-4783-9a37-51dc56105745","Type":"ContainerStarted","Data":"2c2c470d4f67a998cf66f28951c3e414393a75b5047439ee591aebd06aef1d96"} Mar 14 08:41:16 crc kubenswrapper[4886]: I0314 08:41:16.005619 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-wzgzv"] Mar 14 08:41:17 crc kubenswrapper[4886]: I0314 08:41:17.005481 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-wzgzv" event={"ID":"dd81d1ee-7b0d-49e7-955b-3b48b78ed81d","Type":"ContainerStarted","Data":"2e34311ad8f298b742cf4eba5de2b0eff327b3bf98d3a9413f44fb1948f20f04"} Mar 14 08:41:20 crc kubenswrapper[4886]: I0314 08:41:20.022924 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-jhphs" event={"ID":"a94da143-3ce9-4783-9a37-51dc56105745","Type":"ContainerStarted","Data":"500b241a1c1791adc3cd88d7e4c6ef6af3acd2c916940c758e93510df2a8a878"} Mar 14 08:41:20 crc kubenswrapper[4886]: I0314 08:41:20.024394 4886 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-wzgzv" event={"ID":"dd81d1ee-7b0d-49e7-955b-3b48b78ed81d","Type":"ContainerStarted","Data":"f49ea36a94e0e840ed3bfa6736eded280f6e495135c41bb4fa718f4f71931bd5"} Mar 14 08:41:20 crc kubenswrapper[4886]: I0314 08:41:20.024519 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-wzgzv" Mar 14 08:41:20 crc kubenswrapper[4886]: I0314 08:41:20.025642 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-4gn2q" event={"ID":"f2da9034-3cb3-453f-acba-8e2c65138035","Type":"ContainerStarted","Data":"064491da38dae5cc5628f85b4ffc40933784a6dfdf2e789cd2778bf5117d287f"} Mar 14 08:41:20 crc kubenswrapper[4886]: I0314 08:41:20.055175 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-jhphs" podStartSLOduration=1.656120147 podStartE2EDuration="5.055159515s" podCreationTimestamp="2026-03-14 08:41:15 +0000 UTC" firstStartedPulling="2026-03-14 08:41:15.773658929 +0000 UTC m=+811.022110566" lastFinishedPulling="2026-03-14 08:41:19.172698297 +0000 UTC m=+814.421149934" observedRunningTime="2026-03-14 08:41:20.040518771 +0000 UTC m=+815.288970408" watchObservedRunningTime="2026-03-14 08:41:20.055159515 +0000 UTC m=+815.303611152" Mar 14 08:41:20 crc kubenswrapper[4886]: I0314 08:41:20.056544 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-4gn2q" podStartSLOduration=1.7809829270000002 podStartE2EDuration="5.056536344s" podCreationTimestamp="2026-03-14 08:41:15 +0000 UTC" firstStartedPulling="2026-03-14 08:41:15.897756678 +0000 UTC m=+811.146208315" lastFinishedPulling="2026-03-14 08:41:19.173310095 +0000 UTC m=+814.421761732" observedRunningTime="2026-03-14 08:41:20.053412304 +0000 UTC m=+815.301863951" watchObservedRunningTime="2026-03-14 08:41:20.056536344 +0000 
UTC m=+815.304987981" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.248734 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-wzgzv" podStartSLOduration=7.039068362 podStartE2EDuration="10.248700722s" podCreationTimestamp="2026-03-14 08:41:15 +0000 UTC" firstStartedPulling="2026-03-14 08:41:16.013738702 +0000 UTC m=+811.262190339" lastFinishedPulling="2026-03-14 08:41:19.223371062 +0000 UTC m=+814.471822699" observedRunningTime="2026-03-14 08:41:20.076495461 +0000 UTC m=+815.324947098" watchObservedRunningTime="2026-03-14 08:41:25.248700722 +0000 UTC m=+820.497152399" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.250738 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ms4h7"] Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.251595 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="ovn-controller" containerID="cri-o://dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b" gracePeriod=30 Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.251625 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="nbdb" containerID="cri-o://43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025" gracePeriod=30 Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.251765 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="sbdb" containerID="cri-o://3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a" gracePeriod=30 Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.251868 4886 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="kube-rbac-proxy-node" containerID="cri-o://189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8" gracePeriod=30 Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.251933 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="northd" containerID="cri-o://9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873" gracePeriod=30 Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.251990 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa" gracePeriod=30 Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.252071 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="ovn-acl-logging" containerID="cri-o://879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42" gracePeriod=30 Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.287968 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="ovnkube-controller" containerID="cri-o://73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c" gracePeriod=30 Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.534498 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-wzgzv" Mar 14 08:41:25 crc 
kubenswrapper[4886]: I0314 08:41:25.593381 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ms4h7_f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea/ovnkube-controller/3.log" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.595335 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ms4h7_f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea/ovn-acl-logging/0.log" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.595787 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ms4h7_f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea/ovn-controller/0.log" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.596204 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.652197 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kvpns"] Mar 14 08:41:25 crc kubenswrapper[4886]: E0314 08:41:25.652402 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="ovn-controller" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.652413 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="ovn-controller" Mar 14 08:41:25 crc kubenswrapper[4886]: E0314 08:41:25.652426 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="ovnkube-controller" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.652432 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="ovnkube-controller" Mar 14 08:41:25 crc kubenswrapper[4886]: E0314 08:41:25.652441 4886 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="ovnkube-controller" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.652449 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="ovnkube-controller" Mar 14 08:41:25 crc kubenswrapper[4886]: E0314 08:41:25.652455 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="ovnkube-controller" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.652461 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="ovnkube-controller" Mar 14 08:41:25 crc kubenswrapper[4886]: E0314 08:41:25.652468 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="kubecfg-setup" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.652473 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="kubecfg-setup" Mar 14 08:41:25 crc kubenswrapper[4886]: E0314 08:41:25.652479 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="sbdb" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.652485 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="sbdb" Mar 14 08:41:25 crc kubenswrapper[4886]: E0314 08:41:25.652495 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="ovnkube-controller" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.652501 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="ovnkube-controller" Mar 14 08:41:25 crc kubenswrapper[4886]: E0314 08:41:25.652508 4886 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="northd" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.652513 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="northd" Mar 14 08:41:25 crc kubenswrapper[4886]: E0314 08:41:25.652520 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="ovnkube-controller" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.652527 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="ovnkube-controller" Mar 14 08:41:25 crc kubenswrapper[4886]: E0314 08:41:25.652535 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="kube-rbac-proxy-ovn-metrics" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.652542 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="kube-rbac-proxy-ovn-metrics" Mar 14 08:41:25 crc kubenswrapper[4886]: E0314 08:41:25.652549 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="ovn-acl-logging" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.652554 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="ovn-acl-logging" Mar 14 08:41:25 crc kubenswrapper[4886]: E0314 08:41:25.652561 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="nbdb" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.652567 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="nbdb" Mar 14 08:41:25 crc kubenswrapper[4886]: E0314 08:41:25.652577 4886 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="kube-rbac-proxy-node" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.652582 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="kube-rbac-proxy-node" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.652814 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="kube-rbac-proxy-node" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.652825 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="kube-rbac-proxy-ovn-metrics" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.652833 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="ovnkube-controller" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.652841 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="ovnkube-controller" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.652848 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="ovn-acl-logging" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.652855 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="ovnkube-controller" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.652863 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="northd" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.652869 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="nbdb" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.652881 4886 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="ovn-controller" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.652888 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="sbdb" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.653040 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="ovnkube-controller" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.653047 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerName="ovnkube-controller" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.654553 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.702632 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-etc-openvswitch\") pod \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.702685 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-log-socket\") pod \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.702728 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-ovnkube-script-lib\") pod \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " Mar 14 
08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.702751 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-run-openvswitch\") pod \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.702765 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-var-lib-openvswitch\") pod \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.702780 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-var-lib-cni-networks-ovn-kubernetes\") pod \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.702806 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-systemd-units\") pod \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.702819 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-run-ovn-kubernetes\") pod \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.702833 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-cni-bin\") pod \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.702853 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-env-overrides\") pod \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.702868 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-kubelet\") pod \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.702886 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-slash\") pod \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.702908 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-ovnkube-config\") pod \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.702940 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcw6x\" (UniqueName: \"kubernetes.io/projected/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-kube-api-access-jcw6x\") pod \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " Mar 14 08:41:25 crc 
kubenswrapper[4886]: I0314 08:41:25.702963 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-node-log\") pod \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.702985 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-run-netns\") pod \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703002 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-run-ovn\") pod \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703018 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-ovn-node-metrics-cert\") pod \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703039 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-run-systemd\") pod \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703054 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-cni-netd\") pod 
\"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\" (UID: \"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea\") " Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703155 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-host-run-netns\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703175 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4e095148-948c-4a17-81db-ccf1ec5d2a12-ovnkube-config\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703194 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-run-ovn\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703219 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-host-kubelet\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703232 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" (UID: 
"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703237 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-node-log\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703272 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-host-slash\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703293 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-etc-openvswitch\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703308 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-run-openvswitch\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703327 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-var-lib-openvswitch\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703355 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-log-socket\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703378 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-host-run-ovn-kubernetes\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703406 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703424 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvzhp\" (UniqueName: \"kubernetes.io/projected/4e095148-948c-4a17-81db-ccf1ec5d2a12-kube-api-access-vvzhp\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703444 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4e095148-948c-4a17-81db-ccf1ec5d2a12-ovn-node-metrics-cert\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703472 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-host-cni-netd\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703492 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-host-cni-bin\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703508 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4e095148-948c-4a17-81db-ccf1ec5d2a12-ovnkube-script-lib\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703527 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-systemd-units\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703554 4886 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-run-systemd\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703575 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4e095148-948c-4a17-81db-ccf1ec5d2a12-env-overrides\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703612 4886 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703315 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" (UID: "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703349 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-slash" (OuterVolumeSpecName: "host-slash") pod "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" (UID: "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703355 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" (UID: "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703402 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" (UID: "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703429 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-log-socket" (OuterVolumeSpecName: "log-socket") pod "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" (UID: "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703653 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" (UID: "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703673 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" (UID: "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703689 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" (UID: "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703708 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" (UID: "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703823 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" (UID: "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703951 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" (UID: "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703967 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-node-log" (OuterVolumeSpecName: "node-log") pod "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" (UID: "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703981 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" (UID: "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.703994 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" (UID: "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.704023 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" (UID: "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.704054 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" (UID: "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.804774 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-host-cni-netd\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.804823 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-host-cni-bin\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.804851 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4e095148-948c-4a17-81db-ccf1ec5d2a12-ovnkube-script-lib\") pod \"ovnkube-node-kvpns\" (UID: 
\"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.804874 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-systemd-units\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.804901 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-run-systemd\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.804926 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4e095148-948c-4a17-81db-ccf1ec5d2a12-env-overrides\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.804953 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-host-run-netns\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.804974 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4e095148-948c-4a17-81db-ccf1ec5d2a12-ovnkube-config\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.804997 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-run-ovn\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.805030 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-host-cni-netd\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.805036 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-host-kubelet\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.805082 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-host-kubelet\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.805106 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-node-log\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.805178 4886 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-host-slash\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.805224 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-etc-openvswitch\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.805262 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-run-openvswitch\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.805337 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-var-lib-openvswitch\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.805399 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-log-socket\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.805440 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-host-run-ovn-kubernetes\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.805525 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.805574 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvzhp\" (UniqueName: \"kubernetes.io/projected/4e095148-948c-4a17-81db-ccf1ec5d2a12-kube-api-access-vvzhp\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.805614 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4e095148-948c-4a17-81db-ccf1ec5d2a12-ovn-node-metrics-cert\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.805756 4886 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-log-socket\") on node \"crc\" DevicePath \"\"" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.805779 4886 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-ovnkube-script-lib\") on node 
\"crc\" DevicePath \"\"" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.805800 4886 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.805819 4886 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.805837 4886 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.805859 4886 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.805902 4886 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.805921 4886 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.805934 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4e095148-948c-4a17-81db-ccf1ec5d2a12-ovnkube-script-lib\") pod \"ovnkube-node-kvpns\" (UID: 
\"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.805940 4886 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.805957 4886 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.805974 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-systemd-units\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.805975 4886 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-slash\") on node \"crc\" DevicePath \"\"" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.806002 4886 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-node-log\") on node \"crc\" DevicePath \"\"" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.806011 4886 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.806023 4886 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.806032 4886 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.806042 4886 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.806064 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-run-systemd\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.806386 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4e095148-948c-4a17-81db-ccf1ec5d2a12-env-overrides\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.806416 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-host-run-netns\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.806830 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/4e095148-948c-4a17-81db-ccf1ec5d2a12-ovnkube-config\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.806877 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-run-ovn\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.804986 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-host-cni-bin\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.807034 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-var-lib-openvswitch\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.807062 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-node-log\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.807082 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-host-slash\") pod \"ovnkube-node-kvpns\" (UID: 
\"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.807149 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-etc-openvswitch\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.807181 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-run-openvswitch\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.807217 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-host-run-ovn-kubernetes\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.807241 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-log-socket\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.807391 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4e095148-948c-4a17-81db-ccf1ec5d2a12-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.811343 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4e095148-948c-4a17-81db-ccf1ec5d2a12-ovn-node-metrics-cert\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.815565 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" (UID: "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.827429 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-kube-api-access-jcw6x" (OuterVolumeSpecName: "kube-api-access-jcw6x") pod "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" (UID: "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea"). InnerVolumeSpecName "kube-api-access-jcw6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.835640 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" (UID: "f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.853881 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvzhp\" (UniqueName: \"kubernetes.io/projected/4e095148-948c-4a17-81db-ccf1ec5d2a12-kube-api-access-vvzhp\") pod \"ovnkube-node-kvpns\" (UID: \"4e095148-948c-4a17-81db-ccf1ec5d2a12\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.906861 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcw6x\" (UniqueName: \"kubernetes.io/projected/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-kube-api-access-jcw6x\") on node \"crc\" DevicePath \"\"" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.907049 4886 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.907149 4886 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 14 08:41:25 crc kubenswrapper[4886]: I0314 08:41:25.968660 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.056501 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ms4h7_f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea/ovnkube-controller/3.log" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.058479 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ms4h7_f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea/ovn-acl-logging/0.log" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.060775 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ms4h7_f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea/ovn-controller/0.log" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.061207 4886 generic.go:334] "Generic (PLEG): container finished" podID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerID="73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c" exitCode=0 Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.061598 4886 generic.go:334] "Generic (PLEG): container finished" podID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerID="3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a" exitCode=0 Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.061737 4886 generic.go:334] "Generic (PLEG): container finished" podID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerID="43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025" exitCode=0 Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.061820 4886 generic.go:334] "Generic (PLEG): container finished" podID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerID="9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873" exitCode=0 Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.061929 4886 generic.go:334] "Generic (PLEG): container finished" podID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" 
containerID="89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa" exitCode=0 Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062373 4886 generic.go:334] "Generic (PLEG): container finished" podID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerID="189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8" exitCode=0 Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062475 4886 generic.go:334] "Generic (PLEG): container finished" podID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerID="879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42" exitCode=143 Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062553 4886 generic.go:334] "Generic (PLEG): container finished" podID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" containerID="dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b" exitCode=143 Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.061321 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.061315 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" event={"ID":"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea","Type":"ContainerDied","Data":"73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062742 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" event={"ID":"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea","Type":"ContainerDied","Data":"3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062775 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" 
event={"ID":"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea","Type":"ContainerDied","Data":"43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062791 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" event={"ID":"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea","Type":"ContainerDied","Data":"9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062803 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" event={"ID":"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea","Type":"ContainerDied","Data":"89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062812 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" event={"ID":"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea","Type":"ContainerDied","Data":"189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062822 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062834 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062840 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062845 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062851 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062856 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062861 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062866 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062871 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062878 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" event={"ID":"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea","Type":"ContainerDied","Data":"879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062888 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062895 4886 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062900 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062907 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062912 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062917 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062923 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062928 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062934 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062940 4886 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062947 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" event={"ID":"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea","Type":"ContainerDied","Data":"dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062956 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062963 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062969 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062978 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062984 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062990 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa"} Mar 14 
08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062996 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.063001 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.063006 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.063011 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.063018 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ms4h7" event={"ID":"f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea","Type":"ContainerDied","Data":"6d26fa2c2409306e065a2dd8ef5a81ad7e8edd5931c0e2c62ccc06cf5088b89a"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.063026 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.063032 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.063038 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.063043 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.063048 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.063053 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.063058 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.063064 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.063069 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.063074 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.062826 4886 scope.go:117] "RemoveContainer" 
containerID="73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.067250 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.067444 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.067562 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.068332 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f6954217ba64ea552ccd815951f515123e9e534013eb3eb6220b2286262b3047"} pod="openshift-machine-config-operator/machine-config-daemon-ddctv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.068475 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" containerID="cri-o://f6954217ba64ea552ccd815951f515123e9e534013eb3eb6220b2286262b3047" gracePeriod=600 Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.070382 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-5jrmb_7ed47238-6d20-4920-9162-695e6ddcb090/kube-multus/2.log" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.071688 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5jrmb_7ed47238-6d20-4920-9162-695e6ddcb090/kube-multus/1.log" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.071846 4886 generic.go:334] "Generic (PLEG): container finished" podID="7ed47238-6d20-4920-9162-695e6ddcb090" containerID="06df76c665731f9128bc5d02002f446380ee3b3057ff32d99a3164b686de1ae1" exitCode=2 Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.071956 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5jrmb" event={"ID":"7ed47238-6d20-4920-9162-695e6ddcb090","Type":"ContainerDied","Data":"06df76c665731f9128bc5d02002f446380ee3b3057ff32d99a3164b686de1ae1"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.072032 4886 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec2e46fcb866a7dbf349bb2a83ad1fb6b6e85059612be44d8db4b0134c6f0143"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.072664 4886 scope.go:117] "RemoveContainer" containerID="06df76c665731f9128bc5d02002f446380ee3b3057ff32d99a3164b686de1ae1" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.076298 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" event={"ID":"4e095148-948c-4a17-81db-ccf1ec5d2a12","Type":"ContainerStarted","Data":"300c04a3eaef322abc41ae2be4aa63d4bc4ec191807cc9ca6365d222dcff1c45"} Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.087469 4886 scope.go:117] "RemoveContainer" containerID="2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.158818 4886 scope.go:117] "RemoveContainer" containerID="3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a" Mar 14 08:41:26 
crc kubenswrapper[4886]: I0314 08:41:26.162625 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ms4h7"] Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.166292 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ms4h7"] Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.174521 4886 scope.go:117] "RemoveContainer" containerID="43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.203353 4886 scope.go:117] "RemoveContainer" containerID="9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.237623 4886 scope.go:117] "RemoveContainer" containerID="89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.261321 4886 scope.go:117] "RemoveContainer" containerID="189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.277882 4886 scope.go:117] "RemoveContainer" containerID="879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.295111 4886 scope.go:117] "RemoveContainer" containerID="dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.309212 4886 scope.go:117] "RemoveContainer" containerID="2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.322276 4886 scope.go:117] "RemoveContainer" containerID="73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c" Mar 14 08:41:26 crc kubenswrapper[4886]: E0314 08:41:26.322681 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c\": container with ID starting with 73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c not found: ID does not exist" containerID="73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.322709 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c"} err="failed to get container status \"73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c\": rpc error: code = NotFound desc = could not find container \"73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c\": container with ID starting with 73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.322728 4886 scope.go:117] "RemoveContainer" containerID="2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c" Mar 14 08:41:26 crc kubenswrapper[4886]: E0314 08:41:26.323070 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c\": container with ID starting with 2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c not found: ID does not exist" containerID="2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.323093 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c"} err="failed to get container status \"2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c\": rpc error: code = NotFound desc = could not find container \"2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c\": container with ID 
starting with 2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.323105 4886 scope.go:117] "RemoveContainer" containerID="3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a" Mar 14 08:41:26 crc kubenswrapper[4886]: E0314 08:41:26.323351 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\": container with ID starting with 3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a not found: ID does not exist" containerID="3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.323374 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a"} err="failed to get container status \"3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\": rpc error: code = NotFound desc = could not find container \"3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\": container with ID starting with 3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.323387 4886 scope.go:117] "RemoveContainer" containerID="43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025" Mar 14 08:41:26 crc kubenswrapper[4886]: E0314 08:41:26.323715 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\": container with ID starting with 43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025 not found: ID does not exist" containerID="43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025" Mar 14 
08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.323736 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025"} err="failed to get container status \"43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\": rpc error: code = NotFound desc = could not find container \"43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\": container with ID starting with 43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025 not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.323747 4886 scope.go:117] "RemoveContainer" containerID="9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873" Mar 14 08:41:26 crc kubenswrapper[4886]: E0314 08:41:26.324003 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\": container with ID starting with 9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873 not found: ID does not exist" containerID="9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.324029 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873"} err="failed to get container status \"9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\": rpc error: code = NotFound desc = could not find container \"9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\": container with ID starting with 9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873 not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.324044 4886 scope.go:117] "RemoveContainer" 
containerID="89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa" Mar 14 08:41:26 crc kubenswrapper[4886]: E0314 08:41:26.324282 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\": container with ID starting with 89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa not found: ID does not exist" containerID="89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.324305 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa"} err="failed to get container status \"89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\": rpc error: code = NotFound desc = could not find container \"89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\": container with ID starting with 89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.324319 4886 scope.go:117] "RemoveContainer" containerID="189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8" Mar 14 08:41:26 crc kubenswrapper[4886]: E0314 08:41:26.324537 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\": container with ID starting with 189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8 not found: ID does not exist" containerID="189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.324557 4886 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8"} err="failed to get container status \"189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\": rpc error: code = NotFound desc = could not find container \"189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\": container with ID starting with 189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8 not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.324570 4886 scope.go:117] "RemoveContainer" containerID="879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42" Mar 14 08:41:26 crc kubenswrapper[4886]: E0314 08:41:26.324781 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\": container with ID starting with 879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42 not found: ID does not exist" containerID="879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.324803 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42"} err="failed to get container status \"879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\": rpc error: code = NotFound desc = could not find container \"879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\": container with ID starting with 879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42 not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.324816 4886 scope.go:117] "RemoveContainer" containerID="dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b" Mar 14 08:41:26 crc kubenswrapper[4886]: E0314 08:41:26.325036 4886 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\": container with ID starting with dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b not found: ID does not exist" containerID="dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.325063 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b"} err="failed to get container status \"dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\": rpc error: code = NotFound desc = could not find container \"dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\": container with ID starting with dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.325077 4886 scope.go:117] "RemoveContainer" containerID="2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac" Mar 14 08:41:26 crc kubenswrapper[4886]: E0314 08:41:26.325313 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\": container with ID starting with 2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac not found: ID does not exist" containerID="2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.325332 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac"} err="failed to get container status \"2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\": rpc error: code = NotFound desc = could not find container 
\"2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\": container with ID starting with 2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.325345 4886 scope.go:117] "RemoveContainer" containerID="73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.325585 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c"} err="failed to get container status \"73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c\": rpc error: code = NotFound desc = could not find container \"73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c\": container with ID starting with 73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.325604 4886 scope.go:117] "RemoveContainer" containerID="2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.325906 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c"} err="failed to get container status \"2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c\": rpc error: code = NotFound desc = could not find container \"2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c\": container with ID starting with 2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.325931 4886 scope.go:117] "RemoveContainer" containerID="3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.326242 4886 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a"} err="failed to get container status \"3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\": rpc error: code = NotFound desc = could not find container \"3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\": container with ID starting with 3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.326266 4886 scope.go:117] "RemoveContainer" containerID="43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.326583 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025"} err="failed to get container status \"43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\": rpc error: code = NotFound desc = could not find container \"43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\": container with ID starting with 43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025 not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.326609 4886 scope.go:117] "RemoveContainer" containerID="9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.326876 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873"} err="failed to get container status \"9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\": rpc error: code = NotFound desc = could not find container \"9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\": container with ID starting with 
9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873 not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.326900 4886 scope.go:117] "RemoveContainer" containerID="89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.327209 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa"} err="failed to get container status \"89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\": rpc error: code = NotFound desc = could not find container \"89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\": container with ID starting with 89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.327230 4886 scope.go:117] "RemoveContainer" containerID="189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.327438 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8"} err="failed to get container status \"189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\": rpc error: code = NotFound desc = could not find container \"189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\": container with ID starting with 189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8 not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.327461 4886 scope.go:117] "RemoveContainer" containerID="879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.327736 4886 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42"} err="failed to get container status \"879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\": rpc error: code = NotFound desc = could not find container \"879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\": container with ID starting with 879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42 not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.327754 4886 scope.go:117] "RemoveContainer" containerID="dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.328012 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b"} err="failed to get container status \"dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\": rpc error: code = NotFound desc = could not find container \"dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\": container with ID starting with dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.328031 4886 scope.go:117] "RemoveContainer" containerID="2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.328360 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac"} err="failed to get container status \"2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\": rpc error: code = NotFound desc = could not find container \"2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\": container with ID starting with 2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac not found: ID does not 
exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.328381 4886 scope.go:117] "RemoveContainer" containerID="73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.328607 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c"} err="failed to get container status \"73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c\": rpc error: code = NotFound desc = could not find container \"73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c\": container with ID starting with 73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.328634 4886 scope.go:117] "RemoveContainer" containerID="2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.328854 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c"} err="failed to get container status \"2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c\": rpc error: code = NotFound desc = could not find container \"2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c\": container with ID starting with 2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.328872 4886 scope.go:117] "RemoveContainer" containerID="3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.329066 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a"} err="failed to get container status 
\"3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\": rpc error: code = NotFound desc = could not find container \"3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\": container with ID starting with 3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.329083 4886 scope.go:117] "RemoveContainer" containerID="43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.329291 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025"} err="failed to get container status \"43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\": rpc error: code = NotFound desc = could not find container \"43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\": container with ID starting with 43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025 not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.329310 4886 scope.go:117] "RemoveContainer" containerID="9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.329538 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873"} err="failed to get container status \"9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\": rpc error: code = NotFound desc = could not find container \"9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\": container with ID starting with 9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873 not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.329555 4886 scope.go:117] "RemoveContainer" 
containerID="89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.329791 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa"} err="failed to get container status \"89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\": rpc error: code = NotFound desc = could not find container \"89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\": container with ID starting with 89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.329807 4886 scope.go:117] "RemoveContainer" containerID="189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.330013 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8"} err="failed to get container status \"189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\": rpc error: code = NotFound desc = could not find container \"189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\": container with ID starting with 189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8 not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.330028 4886 scope.go:117] "RemoveContainer" containerID="879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.330270 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42"} err="failed to get container status \"879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\": rpc error: code = NotFound desc = could 
not find container \"879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\": container with ID starting with 879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42 not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.330286 4886 scope.go:117] "RemoveContainer" containerID="dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.330540 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b"} err="failed to get container status \"dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\": rpc error: code = NotFound desc = could not find container \"dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\": container with ID starting with dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.330556 4886 scope.go:117] "RemoveContainer" containerID="2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.330778 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac"} err="failed to get container status \"2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\": rpc error: code = NotFound desc = could not find container \"2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\": container with ID starting with 2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.330806 4886 scope.go:117] "RemoveContainer" containerID="73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 
08:41:26.331648 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c"} err="failed to get container status \"73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c\": rpc error: code = NotFound desc = could not find container \"73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c\": container with ID starting with 73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.331672 4886 scope.go:117] "RemoveContainer" containerID="2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.331907 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c"} err="failed to get container status \"2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c\": rpc error: code = NotFound desc = could not find container \"2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c\": container with ID starting with 2235f6f6736991136d2d7a4b8812f1fc568ab0ddd76bfa154da2e7da344e965c not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.331931 4886 scope.go:117] "RemoveContainer" containerID="3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.332184 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a"} err="failed to get container status \"3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\": rpc error: code = NotFound desc = could not find container \"3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a\": container with ID starting with 
3daf8e4c1ee5e50ee674c2e4094e9ff8ec1366989e09ee7c7fd85fb97c83261a not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.332200 4886 scope.go:117] "RemoveContainer" containerID="43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.332398 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025"} err="failed to get container status \"43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\": rpc error: code = NotFound desc = could not find container \"43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025\": container with ID starting with 43310f68fbd9f3b1d667481987106ecf8c1806624b404df9f8aa8d071346b025 not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.332417 4886 scope.go:117] "RemoveContainer" containerID="9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.332661 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873"} err="failed to get container status \"9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\": rpc error: code = NotFound desc = could not find container \"9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873\": container with ID starting with 9e52d8e8591dad7eab776cabc9fc8dd5af1e170f666af18136d8a64b068b7873 not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.332682 4886 scope.go:117] "RemoveContainer" containerID="89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.332959 4886 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa"} err="failed to get container status \"89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\": rpc error: code = NotFound desc = could not find container \"89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa\": container with ID starting with 89dafacdbe4e77040e2b9431f38451cb63eab68aac17e8a5209e5f934676edfa not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.333004 4886 scope.go:117] "RemoveContainer" containerID="189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.333383 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8"} err="failed to get container status \"189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\": rpc error: code = NotFound desc = could not find container \"189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8\": container with ID starting with 189dd34e31fa11ab7de9c2baf359d531f6400b947891f6843422b3da750780b8 not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.333408 4886 scope.go:117] "RemoveContainer" containerID="879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.333616 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42"} err="failed to get container status \"879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\": rpc error: code = NotFound desc = could not find container \"879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42\": container with ID starting with 879f0d225c64153828b78ac319d24f9fb5310c275205ef14541248e2c53cad42 not found: ID does not 
exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.333629 4886 scope.go:117] "RemoveContainer" containerID="dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.333898 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b"} err="failed to get container status \"dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\": rpc error: code = NotFound desc = could not find container \"dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b\": container with ID starting with dba32fad19e32a1dfc79f531e39319a467f3fa951d498870954d293b8361135b not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.333916 4886 scope.go:117] "RemoveContainer" containerID="2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.334167 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac"} err="failed to get container status \"2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\": rpc error: code = NotFound desc = could not find container \"2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac\": container with ID starting with 2536cceee4f37c5c159a2c60f16b11c64e42ffc86d25b722c339d64f4f00ccac not found: ID does not exist" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.334188 4886 scope.go:117] "RemoveContainer" containerID="73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c" Mar 14 08:41:26 crc kubenswrapper[4886]: I0314 08:41:26.334471 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c"} err="failed to get container status 
\"73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c\": rpc error: code = NotFound desc = could not find container \"73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c\": container with ID starting with 73e348609a48b0febcece17d8a1400ccd134766432885b5b3b30d2260e312d8c not found: ID does not exist" Mar 14 08:41:27 crc kubenswrapper[4886]: I0314 08:41:27.089545 4886 generic.go:334] "Generic (PLEG): container finished" podID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerID="f6954217ba64ea552ccd815951f515123e9e534013eb3eb6220b2286262b3047" exitCode=0 Mar 14 08:41:27 crc kubenswrapper[4886]: I0314 08:41:27.089869 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerDied","Data":"f6954217ba64ea552ccd815951f515123e9e534013eb3eb6220b2286262b3047"} Mar 14 08:41:27 crc kubenswrapper[4886]: I0314 08:41:27.089892 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerStarted","Data":"c3f165f2b40174eab0175613b347b88b554cd9e063558e15142f42dfea385fce"} Mar 14 08:41:27 crc kubenswrapper[4886]: I0314 08:41:27.089909 4886 scope.go:117] "RemoveContainer" containerID="b6fcb3fc79e936b9bc2b8e43a35854890b02c11cd8b127b894daab5c52af2a2e" Mar 14 08:41:27 crc kubenswrapper[4886]: I0314 08:41:27.093229 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5jrmb_7ed47238-6d20-4920-9162-695e6ddcb090/kube-multus/2.log" Mar 14 08:41:27 crc kubenswrapper[4886]: I0314 08:41:27.093543 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5jrmb_7ed47238-6d20-4920-9162-695e6ddcb090/kube-multus/1.log" Mar 14 08:41:27 crc kubenswrapper[4886]: I0314 08:41:27.093635 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-5jrmb" event={"ID":"7ed47238-6d20-4920-9162-695e6ddcb090","Type":"ContainerStarted","Data":"bab87c8b37a6694c7f828428ccd6b57f686c637bbd791cbffb78e766c3bf3ff5"} Mar 14 08:41:27 crc kubenswrapper[4886]: I0314 08:41:27.096929 4886 generic.go:334] "Generic (PLEG): container finished" podID="4e095148-948c-4a17-81db-ccf1ec5d2a12" containerID="729651ffbcf9b35636fdab17c20c521045ecfb23a95b06d5fefb8ff613c0cc2a" exitCode=0 Mar 14 08:41:27 crc kubenswrapper[4886]: I0314 08:41:27.096961 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" event={"ID":"4e095148-948c-4a17-81db-ccf1ec5d2a12","Type":"ContainerDied","Data":"729651ffbcf9b35636fdab17c20c521045ecfb23a95b06d5fefb8ff613c0cc2a"} Mar 14 08:41:27 crc kubenswrapper[4886]: I0314 08:41:27.427111 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea" path="/var/lib/kubelet/pods/f1a3ba0c-1e4b-4d8f-8bf4-e37003b0bfea/volumes" Mar 14 08:41:28 crc kubenswrapper[4886]: I0314 08:41:28.106036 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" event={"ID":"4e095148-948c-4a17-81db-ccf1ec5d2a12","Type":"ContainerStarted","Data":"0f2b88e4c04f075964749800dd8ffa0e8c3d85e521888ea35c8e41895d530147"} Mar 14 08:41:28 crc kubenswrapper[4886]: I0314 08:41:28.106549 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" event={"ID":"4e095148-948c-4a17-81db-ccf1ec5d2a12","Type":"ContainerStarted","Data":"3c8702ff6eeaf9d0905b6df52d79830cd3aa352da8786f1729f4b5d335bdeaed"} Mar 14 08:41:28 crc kubenswrapper[4886]: I0314 08:41:28.106561 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" event={"ID":"4e095148-948c-4a17-81db-ccf1ec5d2a12","Type":"ContainerStarted","Data":"da186cb082440844f94c9795341948288825a7ff51fb7a86b03a8f895bea8edd"} Mar 14 08:41:28 crc 
kubenswrapper[4886]: I0314 08:41:28.106572 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" event={"ID":"4e095148-948c-4a17-81db-ccf1ec5d2a12","Type":"ContainerStarted","Data":"0254383c74b1a6e5d356817f7868c4209416ec00474e91d2e48e5f02a87f34b7"} Mar 14 08:41:28 crc kubenswrapper[4886]: I0314 08:41:28.106582 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" event={"ID":"4e095148-948c-4a17-81db-ccf1ec5d2a12","Type":"ContainerStarted","Data":"8c9075d87813d945bd970b1f3e1396e9b702064b986e47901989c70529ed475e"} Mar 14 08:41:28 crc kubenswrapper[4886]: I0314 08:41:28.106591 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" event={"ID":"4e095148-948c-4a17-81db-ccf1ec5d2a12","Type":"ContainerStarted","Data":"a8635903c7a7131ed42f549efccdbeab9a035fb6b66c676b1700bc9591de7f78"} Mar 14 08:41:30 crc kubenswrapper[4886]: I0314 08:41:30.120462 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" event={"ID":"4e095148-948c-4a17-81db-ccf1ec5d2a12","Type":"ContainerStarted","Data":"c04a49228ac7cc80c345f1b9040cd0454b8034a0495c56d9369627ac6c26d5a5"} Mar 14 08:41:33 crc kubenswrapper[4886]: I0314 08:41:33.143005 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" event={"ID":"4e095148-948c-4a17-81db-ccf1ec5d2a12","Type":"ContainerStarted","Data":"70a36c01e33ab9f47ade29192363376f67674ce47d3f227601410021f48317b4"} Mar 14 08:41:33 crc kubenswrapper[4886]: I0314 08:41:33.143569 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:33 crc kubenswrapper[4886]: I0314 08:41:33.143672 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:33 crc kubenswrapper[4886]: I0314 
08:41:33.143737 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:33 crc kubenswrapper[4886]: I0314 08:41:33.177092 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:33 crc kubenswrapper[4886]: I0314 08:41:33.177964 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" podStartSLOduration=8.177955886 podStartE2EDuration="8.177955886s" podCreationTimestamp="2026-03-14 08:41:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:41:33.176493224 +0000 UTC m=+828.424944861" watchObservedRunningTime="2026-03-14 08:41:33.177955886 +0000 UTC m=+828.426407523" Mar 14 08:41:33 crc kubenswrapper[4886]: I0314 08:41:33.193242 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:35 crc kubenswrapper[4886]: I0314 08:41:35.527168 4886 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 14 08:41:53 crc kubenswrapper[4886]: I0314 08:41:53.540498 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm"] Mar 14 08:41:53 crc kubenswrapper[4886]: I0314 08:41:53.542495 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm" Mar 14 08:41:53 crc kubenswrapper[4886]: I0314 08:41:53.545248 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 14 08:41:53 crc kubenswrapper[4886]: I0314 08:41:53.560471 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm"] Mar 14 08:41:53 crc kubenswrapper[4886]: I0314 08:41:53.568016 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t987\" (UniqueName: \"kubernetes.io/projected/00bb3cfb-439a-4613-958d-c528ed85df78-kube-api-access-4t987\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm\" (UID: \"00bb3cfb-439a-4613-958d-c528ed85df78\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm" Mar 14 08:41:53 crc kubenswrapper[4886]: I0314 08:41:53.568090 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00bb3cfb-439a-4613-958d-c528ed85df78-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm\" (UID: \"00bb3cfb-439a-4613-958d-c528ed85df78\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm" Mar 14 08:41:53 crc kubenswrapper[4886]: I0314 08:41:53.568164 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00bb3cfb-439a-4613-958d-c528ed85df78-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm\" (UID: \"00bb3cfb-439a-4613-958d-c528ed85df78\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm" Mar 14 08:41:53 crc kubenswrapper[4886]: 
I0314 08:41:53.669974 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t987\" (UniqueName: \"kubernetes.io/projected/00bb3cfb-439a-4613-958d-c528ed85df78-kube-api-access-4t987\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm\" (UID: \"00bb3cfb-439a-4613-958d-c528ed85df78\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm" Mar 14 08:41:53 crc kubenswrapper[4886]: I0314 08:41:53.670076 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00bb3cfb-439a-4613-958d-c528ed85df78-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm\" (UID: \"00bb3cfb-439a-4613-958d-c528ed85df78\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm" Mar 14 08:41:53 crc kubenswrapper[4886]: I0314 08:41:53.670162 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00bb3cfb-439a-4613-958d-c528ed85df78-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm\" (UID: \"00bb3cfb-439a-4613-958d-c528ed85df78\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm" Mar 14 08:41:53 crc kubenswrapper[4886]: I0314 08:41:53.670768 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00bb3cfb-439a-4613-958d-c528ed85df78-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm\" (UID: \"00bb3cfb-439a-4613-958d-c528ed85df78\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm" Mar 14 08:41:53 crc kubenswrapper[4886]: I0314 08:41:53.671468 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/00bb3cfb-439a-4613-958d-c528ed85df78-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm\" (UID: \"00bb3cfb-439a-4613-958d-c528ed85df78\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm" Mar 14 08:41:53 crc kubenswrapper[4886]: I0314 08:41:53.692474 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t987\" (UniqueName: \"kubernetes.io/projected/00bb3cfb-439a-4613-958d-c528ed85df78-kube-api-access-4t987\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm\" (UID: \"00bb3cfb-439a-4613-958d-c528ed85df78\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm" Mar 14 08:41:53 crc kubenswrapper[4886]: I0314 08:41:53.859163 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm" Mar 14 08:41:54 crc kubenswrapper[4886]: I0314 08:41:54.104772 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm"] Mar 14 08:41:54 crc kubenswrapper[4886]: I0314 08:41:54.258474 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm" event={"ID":"00bb3cfb-439a-4613-958d-c528ed85df78","Type":"ContainerStarted","Data":"1b9d8185965ccf5d82752436f7b2b8600812f3275d6e60f5afcfa9c92ba9d4bd"} Mar 14 08:41:54 crc kubenswrapper[4886]: I0314 08:41:54.258511 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm" event={"ID":"00bb3cfb-439a-4613-958d-c528ed85df78","Type":"ContainerStarted","Data":"2a492caa8036dad56bfe2aa2fbe5f18adce9b07ca67d81f415a3ebb566c7737e"} Mar 14 08:41:55 crc kubenswrapper[4886]: I0314 08:41:55.265277 4886 
generic.go:334] "Generic (PLEG): container finished" podID="00bb3cfb-439a-4613-958d-c528ed85df78" containerID="1b9d8185965ccf5d82752436f7b2b8600812f3275d6e60f5afcfa9c92ba9d4bd" exitCode=0 Mar 14 08:41:55 crc kubenswrapper[4886]: I0314 08:41:55.265588 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm" event={"ID":"00bb3cfb-439a-4613-958d-c528ed85df78","Type":"ContainerDied","Data":"1b9d8185965ccf5d82752436f7b2b8600812f3275d6e60f5afcfa9c92ba9d4bd"} Mar 14 08:41:55 crc kubenswrapper[4886]: I0314 08:41:55.888905 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7rl9s"] Mar 14 08:41:55 crc kubenswrapper[4886]: I0314 08:41:55.890506 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7rl9s" Mar 14 08:41:55 crc kubenswrapper[4886]: I0314 08:41:55.915162 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7rl9s"] Mar 14 08:41:55 crc kubenswrapper[4886]: I0314 08:41:55.993440 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kvpns" Mar 14 08:41:56 crc kubenswrapper[4886]: I0314 08:41:56.006053 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz96q\" (UniqueName: \"kubernetes.io/projected/e919383b-c371-4436-be22-ea712932066f-kube-api-access-nz96q\") pod \"redhat-operators-7rl9s\" (UID: \"e919383b-c371-4436-be22-ea712932066f\") " pod="openshift-marketplace/redhat-operators-7rl9s" Mar 14 08:41:56 crc kubenswrapper[4886]: I0314 08:41:56.006105 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e919383b-c371-4436-be22-ea712932066f-utilities\") pod \"redhat-operators-7rl9s\" (UID: 
\"e919383b-c371-4436-be22-ea712932066f\") " pod="openshift-marketplace/redhat-operators-7rl9s" Mar 14 08:41:56 crc kubenswrapper[4886]: I0314 08:41:56.006261 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e919383b-c371-4436-be22-ea712932066f-catalog-content\") pod \"redhat-operators-7rl9s\" (UID: \"e919383b-c371-4436-be22-ea712932066f\") " pod="openshift-marketplace/redhat-operators-7rl9s" Mar 14 08:41:56 crc kubenswrapper[4886]: I0314 08:41:56.107672 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e919383b-c371-4436-be22-ea712932066f-catalog-content\") pod \"redhat-operators-7rl9s\" (UID: \"e919383b-c371-4436-be22-ea712932066f\") " pod="openshift-marketplace/redhat-operators-7rl9s" Mar 14 08:41:56 crc kubenswrapper[4886]: I0314 08:41:56.107777 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz96q\" (UniqueName: \"kubernetes.io/projected/e919383b-c371-4436-be22-ea712932066f-kube-api-access-nz96q\") pod \"redhat-operators-7rl9s\" (UID: \"e919383b-c371-4436-be22-ea712932066f\") " pod="openshift-marketplace/redhat-operators-7rl9s" Mar 14 08:41:56 crc kubenswrapper[4886]: I0314 08:41:56.107812 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e919383b-c371-4436-be22-ea712932066f-utilities\") pod \"redhat-operators-7rl9s\" (UID: \"e919383b-c371-4436-be22-ea712932066f\") " pod="openshift-marketplace/redhat-operators-7rl9s" Mar 14 08:41:56 crc kubenswrapper[4886]: I0314 08:41:56.108263 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e919383b-c371-4436-be22-ea712932066f-catalog-content\") pod \"redhat-operators-7rl9s\" (UID: 
\"e919383b-c371-4436-be22-ea712932066f\") " pod="openshift-marketplace/redhat-operators-7rl9s" Mar 14 08:41:56 crc kubenswrapper[4886]: I0314 08:41:56.108518 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e919383b-c371-4436-be22-ea712932066f-utilities\") pod \"redhat-operators-7rl9s\" (UID: \"e919383b-c371-4436-be22-ea712932066f\") " pod="openshift-marketplace/redhat-operators-7rl9s" Mar 14 08:41:56 crc kubenswrapper[4886]: I0314 08:41:56.124977 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz96q\" (UniqueName: \"kubernetes.io/projected/e919383b-c371-4436-be22-ea712932066f-kube-api-access-nz96q\") pod \"redhat-operators-7rl9s\" (UID: \"e919383b-c371-4436-be22-ea712932066f\") " pod="openshift-marketplace/redhat-operators-7rl9s" Mar 14 08:41:56 crc kubenswrapper[4886]: I0314 08:41:56.216300 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7rl9s" Mar 14 08:41:56 crc kubenswrapper[4886]: I0314 08:41:56.399706 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7rl9s"] Mar 14 08:41:57 crc kubenswrapper[4886]: I0314 08:41:57.276655 4886 generic.go:334] "Generic (PLEG): container finished" podID="00bb3cfb-439a-4613-958d-c528ed85df78" containerID="555cbb444523187bce91f37844a9fa39d13cfd208d756f9cc8cbaa300e1b9ee1" exitCode=0 Mar 14 08:41:57 crc kubenswrapper[4886]: I0314 08:41:57.276760 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm" event={"ID":"00bb3cfb-439a-4613-958d-c528ed85df78","Type":"ContainerDied","Data":"555cbb444523187bce91f37844a9fa39d13cfd208d756f9cc8cbaa300e1b9ee1"} Mar 14 08:41:57 crc kubenswrapper[4886]: I0314 08:41:57.279564 4886 generic.go:334] "Generic (PLEG): container finished" 
podID="e919383b-c371-4436-be22-ea712932066f" containerID="b15994410ca1f17f90e5a0bbf1d249b1668797e0a23a8bb37d2355b5eeddfd5c" exitCode=0 Mar 14 08:41:57 crc kubenswrapper[4886]: I0314 08:41:57.279627 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rl9s" event={"ID":"e919383b-c371-4436-be22-ea712932066f","Type":"ContainerDied","Data":"b15994410ca1f17f90e5a0bbf1d249b1668797e0a23a8bb37d2355b5eeddfd5c"} Mar 14 08:41:57 crc kubenswrapper[4886]: I0314 08:41:57.279681 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rl9s" event={"ID":"e919383b-c371-4436-be22-ea712932066f","Type":"ContainerStarted","Data":"63c2934bc6b9e211ab18d399fcc5ee7672655530ff577a4ff40b2dcbec8d0349"} Mar 14 08:41:58 crc kubenswrapper[4886]: I0314 08:41:58.287810 4886 generic.go:334] "Generic (PLEG): container finished" podID="00bb3cfb-439a-4613-958d-c528ed85df78" containerID="af15441ddd8271387e2d8912bd4ccf15631c77ddec98a1bda0126679d3d65587" exitCode=0 Mar 14 08:41:58 crc kubenswrapper[4886]: I0314 08:41:58.287906 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm" event={"ID":"00bb3cfb-439a-4613-958d-c528ed85df78","Type":"ContainerDied","Data":"af15441ddd8271387e2d8912bd4ccf15631c77ddec98a1bda0126679d3d65587"} Mar 14 08:41:58 crc kubenswrapper[4886]: I0314 08:41:58.290671 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rl9s" event={"ID":"e919383b-c371-4436-be22-ea712932066f","Type":"ContainerStarted","Data":"e87b961f99f9c8ee985c3fb1d30eac905a5f22a9e9a9adbe948fff4127c6c533"} Mar 14 08:41:59 crc kubenswrapper[4886]: I0314 08:41:59.297939 4886 generic.go:334] "Generic (PLEG): container finished" podID="e919383b-c371-4436-be22-ea712932066f" containerID="e87b961f99f9c8ee985c3fb1d30eac905a5f22a9e9a9adbe948fff4127c6c533" exitCode=0 Mar 14 08:41:59 crc 
kubenswrapper[4886]: I0314 08:41:59.298059 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rl9s" event={"ID":"e919383b-c371-4436-be22-ea712932066f","Type":"ContainerDied","Data":"e87b961f99f9c8ee985c3fb1d30eac905a5f22a9e9a9adbe948fff4127c6c533"} Mar 14 08:41:59 crc kubenswrapper[4886]: I0314 08:41:59.530183 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm" Mar 14 08:41:59 crc kubenswrapper[4886]: I0314 08:41:59.654169 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t987\" (UniqueName: \"kubernetes.io/projected/00bb3cfb-439a-4613-958d-c528ed85df78-kube-api-access-4t987\") pod \"00bb3cfb-439a-4613-958d-c528ed85df78\" (UID: \"00bb3cfb-439a-4613-958d-c528ed85df78\") " Mar 14 08:41:59 crc kubenswrapper[4886]: I0314 08:41:59.654252 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00bb3cfb-439a-4613-958d-c528ed85df78-util\") pod \"00bb3cfb-439a-4613-958d-c528ed85df78\" (UID: \"00bb3cfb-439a-4613-958d-c528ed85df78\") " Mar 14 08:41:59 crc kubenswrapper[4886]: I0314 08:41:59.654334 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00bb3cfb-439a-4613-958d-c528ed85df78-bundle\") pod \"00bb3cfb-439a-4613-958d-c528ed85df78\" (UID: \"00bb3cfb-439a-4613-958d-c528ed85df78\") " Mar 14 08:41:59 crc kubenswrapper[4886]: I0314 08:41:59.657162 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00bb3cfb-439a-4613-958d-c528ed85df78-bundle" (OuterVolumeSpecName: "bundle") pod "00bb3cfb-439a-4613-958d-c528ed85df78" (UID: "00bb3cfb-439a-4613-958d-c528ed85df78"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:41:59 crc kubenswrapper[4886]: I0314 08:41:59.661895 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00bb3cfb-439a-4613-958d-c528ed85df78-kube-api-access-4t987" (OuterVolumeSpecName: "kube-api-access-4t987") pod "00bb3cfb-439a-4613-958d-c528ed85df78" (UID: "00bb3cfb-439a-4613-958d-c528ed85df78"). InnerVolumeSpecName "kube-api-access-4t987". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:41:59 crc kubenswrapper[4886]: I0314 08:41:59.756007 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t987\" (UniqueName: \"kubernetes.io/projected/00bb3cfb-439a-4613-958d-c528ed85df78-kube-api-access-4t987\") on node \"crc\" DevicePath \"\"" Mar 14 08:41:59 crc kubenswrapper[4886]: I0314 08:41:59.756054 4886 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00bb3cfb-439a-4613-958d-c528ed85df78-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:41:59 crc kubenswrapper[4886]: I0314 08:41:59.757614 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00bb3cfb-439a-4613-958d-c528ed85df78-util" (OuterVolumeSpecName: "util") pod "00bb3cfb-439a-4613-958d-c528ed85df78" (UID: "00bb3cfb-439a-4613-958d-c528ed85df78"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:41:59 crc kubenswrapper[4886]: I0314 08:41:59.857342 4886 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00bb3cfb-439a-4613-958d-c528ed85df78-util\") on node \"crc\" DevicePath \"\"" Mar 14 08:42:00 crc kubenswrapper[4886]: I0314 08:42:00.145769 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557962-8wx4j"] Mar 14 08:42:00 crc kubenswrapper[4886]: E0314 08:42:00.146488 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00bb3cfb-439a-4613-958d-c528ed85df78" containerName="pull" Mar 14 08:42:00 crc kubenswrapper[4886]: I0314 08:42:00.146516 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="00bb3cfb-439a-4613-958d-c528ed85df78" containerName="pull" Mar 14 08:42:00 crc kubenswrapper[4886]: E0314 08:42:00.146545 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00bb3cfb-439a-4613-958d-c528ed85df78" containerName="util" Mar 14 08:42:00 crc kubenswrapper[4886]: I0314 08:42:00.146562 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="00bb3cfb-439a-4613-958d-c528ed85df78" containerName="util" Mar 14 08:42:00 crc kubenswrapper[4886]: E0314 08:42:00.146598 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00bb3cfb-439a-4613-958d-c528ed85df78" containerName="extract" Mar 14 08:42:00 crc kubenswrapper[4886]: I0314 08:42:00.146615 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="00bb3cfb-439a-4613-958d-c528ed85df78" containerName="extract" Mar 14 08:42:00 crc kubenswrapper[4886]: I0314 08:42:00.146854 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="00bb3cfb-439a-4613-958d-c528ed85df78" containerName="extract" Mar 14 08:42:00 crc kubenswrapper[4886]: I0314 08:42:00.147590 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557962-8wx4j" Mar 14 08:42:00 crc kubenswrapper[4886]: I0314 08:42:00.150473 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:42:00 crc kubenswrapper[4886]: I0314 08:42:00.150496 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 08:42:00 crc kubenswrapper[4886]: I0314 08:42:00.151039 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:42:00 crc kubenswrapper[4886]: I0314 08:42:00.162155 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557962-8wx4j"] Mar 14 08:42:00 crc kubenswrapper[4886]: I0314 08:42:00.261836 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkwgv\" (UniqueName: \"kubernetes.io/projected/4099d30b-7d06-4895-bd7b-8851e9ac38f4-kube-api-access-zkwgv\") pod \"auto-csr-approver-29557962-8wx4j\" (UID: \"4099d30b-7d06-4895-bd7b-8851e9ac38f4\") " pod="openshift-infra/auto-csr-approver-29557962-8wx4j" Mar 14 08:42:00 crc kubenswrapper[4886]: I0314 08:42:00.308325 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm" Mar 14 08:42:00 crc kubenswrapper[4886]: I0314 08:42:00.308336 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm" event={"ID":"00bb3cfb-439a-4613-958d-c528ed85df78","Type":"ContainerDied","Data":"2a492caa8036dad56bfe2aa2fbe5f18adce9b07ca67d81f415a3ebb566c7737e"} Mar 14 08:42:00 crc kubenswrapper[4886]: I0314 08:42:00.309044 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a492caa8036dad56bfe2aa2fbe5f18adce9b07ca67d81f415a3ebb566c7737e" Mar 14 08:42:00 crc kubenswrapper[4886]: I0314 08:42:00.311224 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rl9s" event={"ID":"e919383b-c371-4436-be22-ea712932066f","Type":"ContainerStarted","Data":"20f83b56c959a65251c18392d0af42f372d128fc8d8c1b46f5b77fc370327889"} Mar 14 08:42:00 crc kubenswrapper[4886]: I0314 08:42:00.333279 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7rl9s" podStartSLOduration=2.802245617 podStartE2EDuration="5.333247387s" podCreationTimestamp="2026-03-14 08:41:55 +0000 UTC" firstStartedPulling="2026-03-14 08:41:57.280600309 +0000 UTC m=+852.529051956" lastFinishedPulling="2026-03-14 08:41:59.811602089 +0000 UTC m=+855.060053726" observedRunningTime="2026-03-14 08:42:00.326851551 +0000 UTC m=+855.575303238" watchObservedRunningTime="2026-03-14 08:42:00.333247387 +0000 UTC m=+855.581699064" Mar 14 08:42:00 crc kubenswrapper[4886]: I0314 08:42:00.363459 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkwgv\" (UniqueName: \"kubernetes.io/projected/4099d30b-7d06-4895-bd7b-8851e9ac38f4-kube-api-access-zkwgv\") pod \"auto-csr-approver-29557962-8wx4j\" (UID: 
\"4099d30b-7d06-4895-bd7b-8851e9ac38f4\") " pod="openshift-infra/auto-csr-approver-29557962-8wx4j" Mar 14 08:42:00 crc kubenswrapper[4886]: I0314 08:42:00.383395 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkwgv\" (UniqueName: \"kubernetes.io/projected/4099d30b-7d06-4895-bd7b-8851e9ac38f4-kube-api-access-zkwgv\") pod \"auto-csr-approver-29557962-8wx4j\" (UID: \"4099d30b-7d06-4895-bd7b-8851e9ac38f4\") " pod="openshift-infra/auto-csr-approver-29557962-8wx4j" Mar 14 08:42:00 crc kubenswrapper[4886]: I0314 08:42:00.461939 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557962-8wx4j" Mar 14 08:42:00 crc kubenswrapper[4886]: I0314 08:42:00.670688 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557962-8wx4j"] Mar 14 08:42:00 crc kubenswrapper[4886]: W0314 08:42:00.674336 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4099d30b_7d06_4895_bd7b_8851e9ac38f4.slice/crio-701ec9c21aafd27c4738bb8c525750c6ed6e407ea0bda6fd785db702c3b410ca WatchSource:0}: Error finding container 701ec9c21aafd27c4738bb8c525750c6ed6e407ea0bda6fd785db702c3b410ca: Status 404 returned error can't find the container with id 701ec9c21aafd27c4738bb8c525750c6ed6e407ea0bda6fd785db702c3b410ca Mar 14 08:42:01 crc kubenswrapper[4886]: I0314 08:42:01.321871 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557962-8wx4j" event={"ID":"4099d30b-7d06-4895-bd7b-8851e9ac38f4","Type":"ContainerStarted","Data":"701ec9c21aafd27c4738bb8c525750c6ed6e407ea0bda6fd785db702c3b410ca"} Mar 14 08:42:02 crc kubenswrapper[4886]: I0314 08:42:02.330395 4886 generic.go:334] "Generic (PLEG): container finished" podID="4099d30b-7d06-4895-bd7b-8851e9ac38f4" containerID="214f7f9e287f12c439f337966b7749b18b9e2881205eb7dfcab7487ad391bde6" exitCode=0 
Mar 14 08:42:02 crc kubenswrapper[4886]: I0314 08:42:02.330441 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557962-8wx4j" event={"ID":"4099d30b-7d06-4895-bd7b-8851e9ac38f4","Type":"ContainerDied","Data":"214f7f9e287f12c439f337966b7749b18b9e2881205eb7dfcab7487ad391bde6"} Mar 14 08:42:03 crc kubenswrapper[4886]: I0314 08:42:03.574206 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557962-8wx4j" Mar 14 08:42:03 crc kubenswrapper[4886]: I0314 08:42:03.610654 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkwgv\" (UniqueName: \"kubernetes.io/projected/4099d30b-7d06-4895-bd7b-8851e9ac38f4-kube-api-access-zkwgv\") pod \"4099d30b-7d06-4895-bd7b-8851e9ac38f4\" (UID: \"4099d30b-7d06-4895-bd7b-8851e9ac38f4\") " Mar 14 08:42:03 crc kubenswrapper[4886]: I0314 08:42:03.619531 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4099d30b-7d06-4895-bd7b-8851e9ac38f4-kube-api-access-zkwgv" (OuterVolumeSpecName: "kube-api-access-zkwgv") pod "4099d30b-7d06-4895-bd7b-8851e9ac38f4" (UID: "4099d30b-7d06-4895-bd7b-8851e9ac38f4"). InnerVolumeSpecName "kube-api-access-zkwgv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:42:03 crc kubenswrapper[4886]: I0314 08:42:03.712543 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkwgv\" (UniqueName: \"kubernetes.io/projected/4099d30b-7d06-4895-bd7b-8851e9ac38f4-kube-api-access-zkwgv\") on node \"crc\" DevicePath \"\"" Mar 14 08:42:04 crc kubenswrapper[4886]: I0314 08:42:04.342165 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557962-8wx4j" event={"ID":"4099d30b-7d06-4895-bd7b-8851e9ac38f4","Type":"ContainerDied","Data":"701ec9c21aafd27c4738bb8c525750c6ed6e407ea0bda6fd785db702c3b410ca"} Mar 14 08:42:04 crc kubenswrapper[4886]: I0314 08:42:04.342204 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="701ec9c21aafd27c4738bb8c525750c6ed6e407ea0bda6fd785db702c3b410ca" Mar 14 08:42:04 crc kubenswrapper[4886]: I0314 08:42:04.342262 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557962-8wx4j" Mar 14 08:42:04 crc kubenswrapper[4886]: I0314 08:42:04.640893 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557956-rckf9"] Mar 14 08:42:04 crc kubenswrapper[4886]: I0314 08:42:04.644050 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557956-rckf9"] Mar 14 08:42:05 crc kubenswrapper[4886]: I0314 08:42:05.426476 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee3db8c9-7bdc-438d-b2cc-a5b5c43624ef" path="/var/lib/kubelet/pods/ee3db8c9-7bdc-438d-b2cc-a5b5c43624ef/volumes" Mar 14 08:42:06 crc kubenswrapper[4886]: I0314 08:42:06.216907 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7rl9s" Mar 14 08:42:06 crc kubenswrapper[4886]: I0314 08:42:06.216957 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-7rl9s" Mar 14 08:42:07 crc kubenswrapper[4886]: I0314 08:42:07.262436 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7rl9s" podUID="e919383b-c371-4436-be22-ea712932066f" containerName="registry-server" probeResult="failure" output=< Mar 14 08:42:07 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s Mar 14 08:42:07 crc kubenswrapper[4886]: > Mar 14 08:42:08 crc kubenswrapper[4886]: I0314 08:42:08.573407 4886 scope.go:117] "RemoveContainer" containerID="21611403d80763832725c4e747af0ad04200c7a1b805a1c99549755ac18830c4" Mar 14 08:42:08 crc kubenswrapper[4886]: I0314 08:42:08.607320 4886 scope.go:117] "RemoveContainer" containerID="ec2e46fcb866a7dbf349bb2a83ad1fb6b6e85059612be44d8db4b0134c6f0143" Mar 14 08:42:09 crc kubenswrapper[4886]: I0314 08:42:09.375027 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5jrmb_7ed47238-6d20-4920-9162-695e6ddcb090/kube-multus/2.log" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.450487 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-xv587"] Mar 14 08:42:10 crc kubenswrapper[4886]: E0314 08:42:10.450736 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4099d30b-7d06-4895-bd7b-8851e9ac38f4" containerName="oc" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.450752 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="4099d30b-7d06-4895-bd7b-8851e9ac38f4" containerName="oc" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.450881 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="4099d30b-7d06-4895-bd7b-8851e9ac38f4" containerName="oc" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.451343 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xv587" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.453285 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-rwz8d" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.453387 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.453739 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.466204 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-xv587"] Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.493687 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5lsb\" (UniqueName: \"kubernetes.io/projected/c5e9d625-22c2-434a-ba28-8c7d774dc4fb-kube-api-access-g5lsb\") pod \"obo-prometheus-operator-68bc856cb9-xv587\" (UID: \"c5e9d625-22c2-434a-ba28-8c7d774dc4fb\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xv587" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.504805 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-74f4f68674-5lkjj"] Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.505492 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-74f4f68674-5lkjj" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.507982 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-fjd9l" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.508188 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.523653 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-74f4f68674-5lkjj"] Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.528525 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-74f4f68674-mr59p"] Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.529624 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-74f4f68674-mr59p" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.537303 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-74f4f68674-mr59p"] Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.594991 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bb0232bb-45bd-4f84-8b22-1f51604204f7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-74f4f68674-5lkjj\" (UID: \"bb0232bb-45bd-4f84-8b22-1f51604204f7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-74f4f68674-5lkjj" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.595071 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/db69019c-bb09-4732-98ee-c5bb11ab7827-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-74f4f68674-mr59p\" (UID: \"db69019c-bb09-4732-98ee-c5bb11ab7827\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-74f4f68674-mr59p" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.595101 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/db69019c-bb09-4732-98ee-c5bb11ab7827-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-74f4f68674-mr59p\" (UID: \"db69019c-bb09-4732-98ee-c5bb11ab7827\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-74f4f68674-mr59p" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.595347 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5lsb\" (UniqueName: 
\"kubernetes.io/projected/c5e9d625-22c2-434a-ba28-8c7d774dc4fb-kube-api-access-g5lsb\") pod \"obo-prometheus-operator-68bc856cb9-xv587\" (UID: \"c5e9d625-22c2-434a-ba28-8c7d774dc4fb\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xv587" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.595493 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bb0232bb-45bd-4f84-8b22-1f51604204f7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-74f4f68674-5lkjj\" (UID: \"bb0232bb-45bd-4f84-8b22-1f51604204f7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-74f4f68674-5lkjj" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.634070 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5lsb\" (UniqueName: \"kubernetes.io/projected/c5e9d625-22c2-434a-ba28-8c7d774dc4fb-kube-api-access-g5lsb\") pod \"obo-prometheus-operator-68bc856cb9-xv587\" (UID: \"c5e9d625-22c2-434a-ba28-8c7d774dc4fb\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xv587" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.696922 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/db69019c-bb09-4732-98ee-c5bb11ab7827-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-74f4f68674-mr59p\" (UID: \"db69019c-bb09-4732-98ee-c5bb11ab7827\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-74f4f68674-mr59p" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.696990 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/db69019c-bb09-4732-98ee-c5bb11ab7827-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-74f4f68674-mr59p\" (UID: \"db69019c-bb09-4732-98ee-c5bb11ab7827\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-74f4f68674-mr59p" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.697056 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bb0232bb-45bd-4f84-8b22-1f51604204f7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-74f4f68674-5lkjj\" (UID: \"bb0232bb-45bd-4f84-8b22-1f51604204f7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-74f4f68674-5lkjj" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.697086 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bb0232bb-45bd-4f84-8b22-1f51604204f7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-74f4f68674-5lkjj\" (UID: \"bb0232bb-45bd-4f84-8b22-1f51604204f7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-74f4f68674-5lkjj" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.699864 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bb0232bb-45bd-4f84-8b22-1f51604204f7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-74f4f68674-5lkjj\" (UID: \"bb0232bb-45bd-4f84-8b22-1f51604204f7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-74f4f68674-5lkjj" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.700161 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/db69019c-bb09-4732-98ee-c5bb11ab7827-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-74f4f68674-mr59p\" (UID: \"db69019c-bb09-4732-98ee-c5bb11ab7827\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-74f4f68674-mr59p" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.713807 4886 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bb0232bb-45bd-4f84-8b22-1f51604204f7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-74f4f68674-5lkjj\" (UID: \"bb0232bb-45bd-4f84-8b22-1f51604204f7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-74f4f68674-5lkjj" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.713860 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/db69019c-bb09-4732-98ee-c5bb11ab7827-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-74f4f68674-mr59p\" (UID: \"db69019c-bb09-4732-98ee-c5bb11ab7827\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-74f4f68674-mr59p" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.717908 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-bxpv7"] Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.718573 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-bxpv7" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.720320 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-vkk4w" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.720338 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.733754 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-bxpv7"] Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.766955 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xv587" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.798229 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbnkt\" (UniqueName: \"kubernetes.io/projected/a5156b1d-96a6-46a7-8142-adc240ccd902-kube-api-access-rbnkt\") pod \"observability-operator-59bdc8b94-bxpv7\" (UID: \"a5156b1d-96a6-46a7-8142-adc240ccd902\") " pod="openshift-operators/observability-operator-59bdc8b94-bxpv7" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.798315 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a5156b1d-96a6-46a7-8142-adc240ccd902-observability-operator-tls\") pod \"observability-operator-59bdc8b94-bxpv7\" (UID: \"a5156b1d-96a6-46a7-8142-adc240ccd902\") " pod="openshift-operators/observability-operator-59bdc8b94-bxpv7" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.818688 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-74f4f68674-5lkjj" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.845196 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-74f4f68674-mr59p" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.887405 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-2k7dn"] Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.888158 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-2k7dn" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.899270 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-g4bz2" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.899783 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbnkt\" (UniqueName: \"kubernetes.io/projected/a5156b1d-96a6-46a7-8142-adc240ccd902-kube-api-access-rbnkt\") pod \"observability-operator-59bdc8b94-bxpv7\" (UID: \"a5156b1d-96a6-46a7-8142-adc240ccd902\") " pod="openshift-operators/observability-operator-59bdc8b94-bxpv7" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.899859 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a5156b1d-96a6-46a7-8142-adc240ccd902-observability-operator-tls\") pod \"observability-operator-59bdc8b94-bxpv7\" (UID: \"a5156b1d-96a6-46a7-8142-adc240ccd902\") " pod="openshift-operators/observability-operator-59bdc8b94-bxpv7" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.900289 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-2k7dn"] Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.905991 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a5156b1d-96a6-46a7-8142-adc240ccd902-observability-operator-tls\") pod \"observability-operator-59bdc8b94-bxpv7\" (UID: \"a5156b1d-96a6-46a7-8142-adc240ccd902\") " pod="openshift-operators/observability-operator-59bdc8b94-bxpv7" Mar 14 08:42:10 crc kubenswrapper[4886]: I0314 08:42:10.916403 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbnkt\" (UniqueName: 
\"kubernetes.io/projected/a5156b1d-96a6-46a7-8142-adc240ccd902-kube-api-access-rbnkt\") pod \"observability-operator-59bdc8b94-bxpv7\" (UID: \"a5156b1d-96a6-46a7-8142-adc240ccd902\") " pod="openshift-operators/observability-operator-59bdc8b94-bxpv7" Mar 14 08:42:11 crc kubenswrapper[4886]: I0314 08:42:11.003761 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbvd2\" (UniqueName: \"kubernetes.io/projected/ee03958c-108f-48e8-b3ca-c3bd13bfda4a-kube-api-access-nbvd2\") pod \"perses-operator-5bf474d74f-2k7dn\" (UID: \"ee03958c-108f-48e8-b3ca-c3bd13bfda4a\") " pod="openshift-operators/perses-operator-5bf474d74f-2k7dn" Mar 14 08:42:11 crc kubenswrapper[4886]: I0314 08:42:11.003859 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ee03958c-108f-48e8-b3ca-c3bd13bfda4a-openshift-service-ca\") pod \"perses-operator-5bf474d74f-2k7dn\" (UID: \"ee03958c-108f-48e8-b3ca-c3bd13bfda4a\") " pod="openshift-operators/perses-operator-5bf474d74f-2k7dn" Mar 14 08:42:11 crc kubenswrapper[4886]: I0314 08:42:11.063445 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-bxpv7" Mar 14 08:42:11 crc kubenswrapper[4886]: I0314 08:42:11.105283 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbvd2\" (UniqueName: \"kubernetes.io/projected/ee03958c-108f-48e8-b3ca-c3bd13bfda4a-kube-api-access-nbvd2\") pod \"perses-operator-5bf474d74f-2k7dn\" (UID: \"ee03958c-108f-48e8-b3ca-c3bd13bfda4a\") " pod="openshift-operators/perses-operator-5bf474d74f-2k7dn" Mar 14 08:42:11 crc kubenswrapper[4886]: I0314 08:42:11.105436 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ee03958c-108f-48e8-b3ca-c3bd13bfda4a-openshift-service-ca\") pod \"perses-operator-5bf474d74f-2k7dn\" (UID: \"ee03958c-108f-48e8-b3ca-c3bd13bfda4a\") " pod="openshift-operators/perses-operator-5bf474d74f-2k7dn" Mar 14 08:42:11 crc kubenswrapper[4886]: I0314 08:42:11.106576 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ee03958c-108f-48e8-b3ca-c3bd13bfda4a-openshift-service-ca\") pod \"perses-operator-5bf474d74f-2k7dn\" (UID: \"ee03958c-108f-48e8-b3ca-c3bd13bfda4a\") " pod="openshift-operators/perses-operator-5bf474d74f-2k7dn" Mar 14 08:42:11 crc kubenswrapper[4886]: I0314 08:42:11.131205 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbvd2\" (UniqueName: \"kubernetes.io/projected/ee03958c-108f-48e8-b3ca-c3bd13bfda4a-kube-api-access-nbvd2\") pod \"perses-operator-5bf474d74f-2k7dn\" (UID: \"ee03958c-108f-48e8-b3ca-c3bd13bfda4a\") " pod="openshift-operators/perses-operator-5bf474d74f-2k7dn" Mar 14 08:42:11 crc kubenswrapper[4886]: I0314 08:42:11.131418 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-74f4f68674-mr59p"] Mar 14 08:42:11 crc kubenswrapper[4886]: 
W0314 08:42:11.155232 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb69019c_bb09_4732_98ee_c5bb11ab7827.slice/crio-85369be0dc0d222d5f46014493cb34a49b6a2ddf70a47691b5387d27f0859f89 WatchSource:0}: Error finding container 85369be0dc0d222d5f46014493cb34a49b6a2ddf70a47691b5387d27f0859f89: Status 404 returned error can't find the container with id 85369be0dc0d222d5f46014493cb34a49b6a2ddf70a47691b5387d27f0859f89 Mar 14 08:42:11 crc kubenswrapper[4886]: I0314 08:42:11.233084 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-xv587"] Mar 14 08:42:11 crc kubenswrapper[4886]: I0314 08:42:11.237666 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-2k7dn" Mar 14 08:42:11 crc kubenswrapper[4886]: I0314 08:42:11.248224 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-74f4f68674-5lkjj"] Mar 14 08:42:11 crc kubenswrapper[4886]: W0314 08:42:11.259073 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb0232bb_45bd_4f84_8b22_1f51604204f7.slice/crio-59aa16abe96b106cb85507f6edc2a60b34994d96c0cf92ed4aa6fb3878d2d5b3 WatchSource:0}: Error finding container 59aa16abe96b106cb85507f6edc2a60b34994d96c0cf92ed4aa6fb3878d2d5b3: Status 404 returned error can't find the container with id 59aa16abe96b106cb85507f6edc2a60b34994d96c0cf92ed4aa6fb3878d2d5b3 Mar 14 08:42:11 crc kubenswrapper[4886]: I0314 08:42:11.390785 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-74f4f68674-5lkjj" event={"ID":"bb0232bb-45bd-4f84-8b22-1f51604204f7","Type":"ContainerStarted","Data":"59aa16abe96b106cb85507f6edc2a60b34994d96c0cf92ed4aa6fb3878d2d5b3"} Mar 14 08:42:11 crc 
kubenswrapper[4886]: I0314 08:42:11.392796 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-74f4f68674-mr59p" event={"ID":"db69019c-bb09-4732-98ee-c5bb11ab7827","Type":"ContainerStarted","Data":"85369be0dc0d222d5f46014493cb34a49b6a2ddf70a47691b5387d27f0859f89"} Mar 14 08:42:11 crc kubenswrapper[4886]: I0314 08:42:11.393688 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xv587" event={"ID":"c5e9d625-22c2-434a-ba28-8c7d774dc4fb","Type":"ContainerStarted","Data":"484fccc2b1e6ecc95fd14edd1b422786dd64a895b6783df3566320564076b5d1"} Mar 14 08:42:11 crc kubenswrapper[4886]: I0314 08:42:11.540801 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-2k7dn"] Mar 14 08:42:11 crc kubenswrapper[4886]: I0314 08:42:11.544080 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-bxpv7"] Mar 14 08:42:11 crc kubenswrapper[4886]: W0314 08:42:11.551280 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5156b1d_96a6_46a7_8142_adc240ccd902.slice/crio-f44bd01946eb57d381415d9176c02c31253ecc79333f35413e46d1516196b57b WatchSource:0}: Error finding container f44bd01946eb57d381415d9176c02c31253ecc79333f35413e46d1516196b57b: Status 404 returned error can't find the container with id f44bd01946eb57d381415d9176c02c31253ecc79333f35413e46d1516196b57b Mar 14 08:42:11 crc kubenswrapper[4886]: W0314 08:42:11.557900 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee03958c_108f_48e8_b3ca_c3bd13bfda4a.slice/crio-1b5cf1b5df7c82163c28e7bf53153c17147ece214ffa45e9352368fef45e2aca WatchSource:0}: Error finding container 1b5cf1b5df7c82163c28e7bf53153c17147ece214ffa45e9352368fef45e2aca: Status 404 
returned error can't find the container with id 1b5cf1b5df7c82163c28e7bf53153c17147ece214ffa45e9352368fef45e2aca Mar 14 08:42:12 crc kubenswrapper[4886]: I0314 08:42:12.404452 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-2k7dn" event={"ID":"ee03958c-108f-48e8-b3ca-c3bd13bfda4a","Type":"ContainerStarted","Data":"1b5cf1b5df7c82163c28e7bf53153c17147ece214ffa45e9352368fef45e2aca"} Mar 14 08:42:12 crc kubenswrapper[4886]: I0314 08:42:12.406986 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-bxpv7" event={"ID":"a5156b1d-96a6-46a7-8142-adc240ccd902","Type":"ContainerStarted","Data":"f44bd01946eb57d381415d9176c02c31253ecc79333f35413e46d1516196b57b"} Mar 14 08:42:16 crc kubenswrapper[4886]: I0314 08:42:16.263677 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7rl9s" Mar 14 08:42:16 crc kubenswrapper[4886]: I0314 08:42:16.312872 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7rl9s" Mar 14 08:42:16 crc kubenswrapper[4886]: I0314 08:42:16.877329 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7rl9s"] Mar 14 08:42:17 crc kubenswrapper[4886]: I0314 08:42:17.447903 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7rl9s" podUID="e919383b-c371-4436-be22-ea712932066f" containerName="registry-server" containerID="cri-o://20f83b56c959a65251c18392d0af42f372d128fc8d8c1b46f5b77fc370327889" gracePeriod=2 Mar 14 08:42:18 crc kubenswrapper[4886]: I0314 08:42:18.456793 4886 generic.go:334] "Generic (PLEG): container finished" podID="e919383b-c371-4436-be22-ea712932066f" containerID="20f83b56c959a65251c18392d0af42f372d128fc8d8c1b46f5b77fc370327889" exitCode=0 Mar 14 08:42:18 crc kubenswrapper[4886]: I0314 
08:42:18.456869 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rl9s" event={"ID":"e919383b-c371-4436-be22-ea712932066f","Type":"ContainerDied","Data":"20f83b56c959a65251c18392d0af42f372d128fc8d8c1b46f5b77fc370327889"} Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.018438 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7rl9s" Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.148544 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e919383b-c371-4436-be22-ea712932066f-catalog-content\") pod \"e919383b-c371-4436-be22-ea712932066f\" (UID: \"e919383b-c371-4436-be22-ea712932066f\") " Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.148584 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e919383b-c371-4436-be22-ea712932066f-utilities\") pod \"e919383b-c371-4436-be22-ea712932066f\" (UID: \"e919383b-c371-4436-be22-ea712932066f\") " Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.148625 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nz96q\" (UniqueName: \"kubernetes.io/projected/e919383b-c371-4436-be22-ea712932066f-kube-api-access-nz96q\") pod \"e919383b-c371-4436-be22-ea712932066f\" (UID: \"e919383b-c371-4436-be22-ea712932066f\") " Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.150566 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e919383b-c371-4436-be22-ea712932066f-utilities" (OuterVolumeSpecName: "utilities") pod "e919383b-c371-4436-be22-ea712932066f" (UID: "e919383b-c371-4436-be22-ea712932066f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.154638 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e919383b-c371-4436-be22-ea712932066f-kube-api-access-nz96q" (OuterVolumeSpecName: "kube-api-access-nz96q") pod "e919383b-c371-4436-be22-ea712932066f" (UID: "e919383b-c371-4436-be22-ea712932066f"). InnerVolumeSpecName "kube-api-access-nz96q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.250229 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e919383b-c371-4436-be22-ea712932066f-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.250548 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nz96q\" (UniqueName: \"kubernetes.io/projected/e919383b-c371-4436-be22-ea712932066f-kube-api-access-nz96q\") on node \"crc\" DevicePath \"\"" Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.272464 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e919383b-c371-4436-be22-ea712932066f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e919383b-c371-4436-be22-ea712932066f" (UID: "e919383b-c371-4436-be22-ea712932066f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.351802 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e919383b-c371-4436-be22-ea712932066f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.489925 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-74f4f68674-mr59p" event={"ID":"db69019c-bb09-4732-98ee-c5bb11ab7827","Type":"ContainerStarted","Data":"58a4cc20cd23f804beae199366353490606491066985b3a32047a0b440c3e8e3"} Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.491684 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-2k7dn" event={"ID":"ee03958c-108f-48e8-b3ca-c3bd13bfda4a","Type":"ContainerStarted","Data":"a24bc0d11fe1868a0edb4fb1af1ff11a7f23f8d6d8846c2a99e39f96b3c57b36"} Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.491811 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-2k7dn" Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.493205 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-bxpv7" event={"ID":"a5156b1d-96a6-46a7-8142-adc240ccd902","Type":"ContainerStarted","Data":"8f2a31580d5a3ceccb932e9ac85b7213f0790ae4f507d611c2e9a3ff2c313553"} Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.493383 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-bxpv7" Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.494564 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xv587" 
event={"ID":"c5e9d625-22c2-434a-ba28-8c7d774dc4fb","Type":"ContainerStarted","Data":"1098702de408564f95599d2ac2d8f7ad6099c7386293986220404d9ba2237261"} Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.496747 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7rl9s" Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.496745 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rl9s" event={"ID":"e919383b-c371-4436-be22-ea712932066f","Type":"ContainerDied","Data":"63c2934bc6b9e211ab18d399fcc5ee7672655530ff577a4ff40b2dcbec8d0349"} Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.496868 4886 scope.go:117] "RemoveContainer" containerID="20f83b56c959a65251c18392d0af42f372d128fc8d8c1b46f5b77fc370327889" Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.498847 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-74f4f68674-5lkjj" event={"ID":"bb0232bb-45bd-4f84-8b22-1f51604204f7","Type":"ContainerStarted","Data":"05ab6d7ea9d539ba5855ad8cc58c2e81719c56ca020d2911803ec917f7f78705"} Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.526748 4886 scope.go:117] "RemoveContainer" containerID="e87b961f99f9c8ee985c3fb1d30eac905a5f22a9e9a9adbe948fff4127c6c533" Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.555632 4886 scope.go:117] "RemoveContainer" containerID="b15994410ca1f17f90e5a0bbf1d249b1668797e0a23a8bb37d2355b5eeddfd5c" Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.566002 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-74f4f68674-mr59p" podStartSLOduration=1.941256601 podStartE2EDuration="14.56598579s" podCreationTimestamp="2026-03-14 08:42:10 +0000 UTC" firstStartedPulling="2026-03-14 08:42:11.160977358 +0000 UTC m=+866.409428995" 
lastFinishedPulling="2026-03-14 08:42:23.785706557 +0000 UTC m=+879.034158184" observedRunningTime="2026-03-14 08:42:24.526028313 +0000 UTC m=+879.774480030" watchObservedRunningTime="2026-03-14 08:42:24.56598579 +0000 UTC m=+879.814437427" Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.568720 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-bxpv7" podStartSLOduration=2.299929671 podStartE2EDuration="14.568710498s" podCreationTimestamp="2026-03-14 08:42:10 +0000 UTC" firstStartedPulling="2026-03-14 08:42:11.55354789 +0000 UTC m=+866.801999527" lastFinishedPulling="2026-03-14 08:42:23.822328717 +0000 UTC m=+879.070780354" observedRunningTime="2026-03-14 08:42:24.564827006 +0000 UTC m=+879.813278643" watchObservedRunningTime="2026-03-14 08:42:24.568710498 +0000 UTC m=+879.817162135" Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.579541 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-bxpv7" Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.586420 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xv587" podStartSLOduration=2.056103475 podStartE2EDuration="14.586406861s" podCreationTimestamp="2026-03-14 08:42:10 +0000 UTC" firstStartedPulling="2026-03-14 08:42:11.2501863 +0000 UTC m=+866.498637937" lastFinishedPulling="2026-03-14 08:42:23.780489686 +0000 UTC m=+879.028941323" observedRunningTime="2026-03-14 08:42:24.58639994 +0000 UTC m=+879.834851577" watchObservedRunningTime="2026-03-14 08:42:24.586406861 +0000 UTC m=+879.834858498" Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.635181 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-74f4f68674-5lkjj" podStartSLOduration=2.093391454 
podStartE2EDuration="14.635158472s" podCreationTimestamp="2026-03-14 08:42:10 +0000 UTC" firstStartedPulling="2026-03-14 08:42:11.264842974 +0000 UTC m=+866.513294611" lastFinishedPulling="2026-03-14 08:42:23.806609992 +0000 UTC m=+879.055061629" observedRunningTime="2026-03-14 08:42:24.630900038 +0000 UTC m=+879.879351675" watchObservedRunningTime="2026-03-14 08:42:24.635158472 +0000 UTC m=+879.883610109" Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.635549 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-2k7dn" podStartSLOduration=2.415066214 podStartE2EDuration="14.635540663s" podCreationTimestamp="2026-03-14 08:42:10 +0000 UTC" firstStartedPulling="2026-03-14 08:42:11.559991766 +0000 UTC m=+866.808443403" lastFinishedPulling="2026-03-14 08:42:23.780466215 +0000 UTC m=+879.028917852" observedRunningTime="2026-03-14 08:42:24.613854505 +0000 UTC m=+879.862306142" watchObservedRunningTime="2026-03-14 08:42:24.635540663 +0000 UTC m=+879.883992300" Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.645255 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7rl9s"] Mar 14 08:42:24 crc kubenswrapper[4886]: I0314 08:42:24.651084 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7rl9s"] Mar 14 08:42:25 crc kubenswrapper[4886]: I0314 08:42:25.428024 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e919383b-c371-4436-be22-ea712932066f" path="/var/lib/kubelet/pods/e919383b-c371-4436-be22-ea712932066f/volumes" Mar 14 08:42:31 crc kubenswrapper[4886]: I0314 08:42:31.241776 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-2k7dn" Mar 14 08:42:47 crc kubenswrapper[4886]: I0314 08:42:47.624252 4886 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm"] Mar 14 08:42:47 crc kubenswrapper[4886]: E0314 08:42:47.626202 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e919383b-c371-4436-be22-ea712932066f" containerName="extract-utilities" Mar 14 08:42:47 crc kubenswrapper[4886]: I0314 08:42:47.626285 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e919383b-c371-4436-be22-ea712932066f" containerName="extract-utilities" Mar 14 08:42:47 crc kubenswrapper[4886]: E0314 08:42:47.626352 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e919383b-c371-4436-be22-ea712932066f" containerName="extract-content" Mar 14 08:42:47 crc kubenswrapper[4886]: I0314 08:42:47.626408 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e919383b-c371-4436-be22-ea712932066f" containerName="extract-content" Mar 14 08:42:47 crc kubenswrapper[4886]: E0314 08:42:47.626486 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e919383b-c371-4436-be22-ea712932066f" containerName="registry-server" Mar 14 08:42:47 crc kubenswrapper[4886]: I0314 08:42:47.626548 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e919383b-c371-4436-be22-ea712932066f" containerName="registry-server" Mar 14 08:42:47 crc kubenswrapper[4886]: I0314 08:42:47.626703 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="e919383b-c371-4436-be22-ea712932066f" containerName="registry-server" Mar 14 08:42:47 crc kubenswrapper[4886]: I0314 08:42:47.627508 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm" Mar 14 08:42:47 crc kubenswrapper[4886]: I0314 08:42:47.629370 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 14 08:42:47 crc kubenswrapper[4886]: I0314 08:42:47.634925 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm"] Mar 14 08:42:47 crc kubenswrapper[4886]: I0314 08:42:47.738644 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bpmg\" (UniqueName: \"kubernetes.io/projected/db8bb544-5cf5-442c-adda-a2bf39bb77ee-kube-api-access-5bpmg\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm\" (UID: \"db8bb544-5cf5-442c-adda-a2bf39bb77ee\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm" Mar 14 08:42:47 crc kubenswrapper[4886]: I0314 08:42:47.739016 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/db8bb544-5cf5-442c-adda-a2bf39bb77ee-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm\" (UID: \"db8bb544-5cf5-442c-adda-a2bf39bb77ee\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm" Mar 14 08:42:47 crc kubenswrapper[4886]: I0314 08:42:47.739148 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/db8bb544-5cf5-442c-adda-a2bf39bb77ee-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm\" (UID: \"db8bb544-5cf5-442c-adda-a2bf39bb77ee\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm" Mar 14 08:42:47 crc kubenswrapper[4886]: 
I0314 08:42:47.840390 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/db8bb544-5cf5-442c-adda-a2bf39bb77ee-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm\" (UID: \"db8bb544-5cf5-442c-adda-a2bf39bb77ee\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm" Mar 14 08:42:47 crc kubenswrapper[4886]: I0314 08:42:47.840423 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/db8bb544-5cf5-442c-adda-a2bf39bb77ee-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm\" (UID: \"db8bb544-5cf5-442c-adda-a2bf39bb77ee\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm" Mar 14 08:42:47 crc kubenswrapper[4886]: I0314 08:42:47.840468 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bpmg\" (UniqueName: \"kubernetes.io/projected/db8bb544-5cf5-442c-adda-a2bf39bb77ee-kube-api-access-5bpmg\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm\" (UID: \"db8bb544-5cf5-442c-adda-a2bf39bb77ee\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm" Mar 14 08:42:47 crc kubenswrapper[4886]: I0314 08:42:47.841186 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/db8bb544-5cf5-442c-adda-a2bf39bb77ee-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm\" (UID: \"db8bb544-5cf5-442c-adda-a2bf39bb77ee\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm" Mar 14 08:42:47 crc kubenswrapper[4886]: I0314 08:42:47.842321 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/db8bb544-5cf5-442c-adda-a2bf39bb77ee-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm\" (UID: \"db8bb544-5cf5-442c-adda-a2bf39bb77ee\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm" Mar 14 08:42:47 crc kubenswrapper[4886]: I0314 08:42:47.858890 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bpmg\" (UniqueName: \"kubernetes.io/projected/db8bb544-5cf5-442c-adda-a2bf39bb77ee-kube-api-access-5bpmg\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm\" (UID: \"db8bb544-5cf5-442c-adda-a2bf39bb77ee\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm" Mar 14 08:42:47 crc kubenswrapper[4886]: I0314 08:42:47.947751 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm" Mar 14 08:42:48 crc kubenswrapper[4886]: I0314 08:42:48.386621 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm"] Mar 14 08:42:48 crc kubenswrapper[4886]: W0314 08:42:48.389875 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb8bb544_5cf5_442c_adda_a2bf39bb77ee.slice/crio-3084714c974ac0d16166857f39af8e9460c84acb84b91dbf3b8e348fbe996dbd WatchSource:0}: Error finding container 3084714c974ac0d16166857f39af8e9460c84acb84b91dbf3b8e348fbe996dbd: Status 404 returned error can't find the container with id 3084714c974ac0d16166857f39af8e9460c84acb84b91dbf3b8e348fbe996dbd Mar 14 08:42:48 crc kubenswrapper[4886]: I0314 08:42:48.666963 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm" 
event={"ID":"db8bb544-5cf5-442c-adda-a2bf39bb77ee","Type":"ContainerStarted","Data":"5e0dfbac5c51cd70cfec041d1950db769dc32825e83327420c21541794fcad23"} Mar 14 08:42:48 crc kubenswrapper[4886]: I0314 08:42:48.667322 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm" event={"ID":"db8bb544-5cf5-442c-adda-a2bf39bb77ee","Type":"ContainerStarted","Data":"3084714c974ac0d16166857f39af8e9460c84acb84b91dbf3b8e348fbe996dbd"} Mar 14 08:42:49 crc kubenswrapper[4886]: I0314 08:42:49.673695 4886 generic.go:334] "Generic (PLEG): container finished" podID="db8bb544-5cf5-442c-adda-a2bf39bb77ee" containerID="5e0dfbac5c51cd70cfec041d1950db769dc32825e83327420c21541794fcad23" exitCode=0 Mar 14 08:42:49 crc kubenswrapper[4886]: I0314 08:42:49.673783 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm" event={"ID":"db8bb544-5cf5-442c-adda-a2bf39bb77ee","Type":"ContainerDied","Data":"5e0dfbac5c51cd70cfec041d1950db769dc32825e83327420c21541794fcad23"} Mar 14 08:42:52 crc kubenswrapper[4886]: I0314 08:42:52.701222 4886 generic.go:334] "Generic (PLEG): container finished" podID="db8bb544-5cf5-442c-adda-a2bf39bb77ee" containerID="d8e87eeffd0f81a547d3b8b48478de43fa56688cf5dadfbfb8d05e8f07b20e7f" exitCode=0 Mar 14 08:42:52 crc kubenswrapper[4886]: I0314 08:42:52.701302 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm" event={"ID":"db8bb544-5cf5-442c-adda-a2bf39bb77ee","Type":"ContainerDied","Data":"d8e87eeffd0f81a547d3b8b48478de43fa56688cf5dadfbfb8d05e8f07b20e7f"} Mar 14 08:42:53 crc kubenswrapper[4886]: I0314 08:42:53.710450 4886 generic.go:334] "Generic (PLEG): container finished" podID="db8bb544-5cf5-442c-adda-a2bf39bb77ee" containerID="740a9141c0619c8ee9a1e14c99ee148715140c98e7caecfb506cb4661b1934af" exitCode=0 
Mar 14 08:42:53 crc kubenswrapper[4886]: I0314 08:42:53.710551 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm" event={"ID":"db8bb544-5cf5-442c-adda-a2bf39bb77ee","Type":"ContainerDied","Data":"740a9141c0619c8ee9a1e14c99ee148715140c98e7caecfb506cb4661b1934af"} Mar 14 08:42:54 crc kubenswrapper[4886]: I0314 08:42:54.979873 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm" Mar 14 08:42:55 crc kubenswrapper[4886]: I0314 08:42:55.065434 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/db8bb544-5cf5-442c-adda-a2bf39bb77ee-util\") pod \"db8bb544-5cf5-442c-adda-a2bf39bb77ee\" (UID: \"db8bb544-5cf5-442c-adda-a2bf39bb77ee\") " Mar 14 08:42:55 crc kubenswrapper[4886]: I0314 08:42:55.075243 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db8bb544-5cf5-442c-adda-a2bf39bb77ee-util" (OuterVolumeSpecName: "util") pod "db8bb544-5cf5-442c-adda-a2bf39bb77ee" (UID: "db8bb544-5cf5-442c-adda-a2bf39bb77ee"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:42:55 crc kubenswrapper[4886]: I0314 08:42:55.166570 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bpmg\" (UniqueName: \"kubernetes.io/projected/db8bb544-5cf5-442c-adda-a2bf39bb77ee-kube-api-access-5bpmg\") pod \"db8bb544-5cf5-442c-adda-a2bf39bb77ee\" (UID: \"db8bb544-5cf5-442c-adda-a2bf39bb77ee\") " Mar 14 08:42:55 crc kubenswrapper[4886]: I0314 08:42:55.166736 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/db8bb544-5cf5-442c-adda-a2bf39bb77ee-bundle\") pod \"db8bb544-5cf5-442c-adda-a2bf39bb77ee\" (UID: \"db8bb544-5cf5-442c-adda-a2bf39bb77ee\") " Mar 14 08:42:55 crc kubenswrapper[4886]: I0314 08:42:55.166979 4886 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/db8bb544-5cf5-442c-adda-a2bf39bb77ee-util\") on node \"crc\" DevicePath \"\"" Mar 14 08:42:55 crc kubenswrapper[4886]: I0314 08:42:55.167346 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db8bb544-5cf5-442c-adda-a2bf39bb77ee-bundle" (OuterVolumeSpecName: "bundle") pod "db8bb544-5cf5-442c-adda-a2bf39bb77ee" (UID: "db8bb544-5cf5-442c-adda-a2bf39bb77ee"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:42:55 crc kubenswrapper[4886]: I0314 08:42:55.173216 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db8bb544-5cf5-442c-adda-a2bf39bb77ee-kube-api-access-5bpmg" (OuterVolumeSpecName: "kube-api-access-5bpmg") pod "db8bb544-5cf5-442c-adda-a2bf39bb77ee" (UID: "db8bb544-5cf5-442c-adda-a2bf39bb77ee"). InnerVolumeSpecName "kube-api-access-5bpmg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:42:55 crc kubenswrapper[4886]: I0314 08:42:55.267861 4886 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/db8bb544-5cf5-442c-adda-a2bf39bb77ee-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:42:55 crc kubenswrapper[4886]: I0314 08:42:55.268151 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bpmg\" (UniqueName: \"kubernetes.io/projected/db8bb544-5cf5-442c-adda-a2bf39bb77ee-kube-api-access-5bpmg\") on node \"crc\" DevicePath \"\"" Mar 14 08:42:55 crc kubenswrapper[4886]: I0314 08:42:55.733963 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm" event={"ID":"db8bb544-5cf5-442c-adda-a2bf39bb77ee","Type":"ContainerDied","Data":"3084714c974ac0d16166857f39af8e9460c84acb84b91dbf3b8e348fbe996dbd"} Mar 14 08:42:55 crc kubenswrapper[4886]: I0314 08:42:55.734018 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3084714c974ac0d16166857f39af8e9460c84acb84b91dbf3b8e348fbe996dbd" Mar 14 08:42:55 crc kubenswrapper[4886]: I0314 08:42:55.734134 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm" Mar 14 08:42:59 crc kubenswrapper[4886]: I0314 08:42:59.270611 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-gjn86"] Mar 14 08:42:59 crc kubenswrapper[4886]: E0314 08:42:59.271184 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db8bb544-5cf5-442c-adda-a2bf39bb77ee" containerName="extract" Mar 14 08:42:59 crc kubenswrapper[4886]: I0314 08:42:59.271205 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="db8bb544-5cf5-442c-adda-a2bf39bb77ee" containerName="extract" Mar 14 08:42:59 crc kubenswrapper[4886]: E0314 08:42:59.271221 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db8bb544-5cf5-442c-adda-a2bf39bb77ee" containerName="pull" Mar 14 08:42:59 crc kubenswrapper[4886]: I0314 08:42:59.271230 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="db8bb544-5cf5-442c-adda-a2bf39bb77ee" containerName="pull" Mar 14 08:42:59 crc kubenswrapper[4886]: E0314 08:42:59.271245 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db8bb544-5cf5-442c-adda-a2bf39bb77ee" containerName="util" Mar 14 08:42:59 crc kubenswrapper[4886]: I0314 08:42:59.271253 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="db8bb544-5cf5-442c-adda-a2bf39bb77ee" containerName="util" Mar 14 08:42:59 crc kubenswrapper[4886]: I0314 08:42:59.271387 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="db8bb544-5cf5-442c-adda-a2bf39bb77ee" containerName="extract" Mar 14 08:42:59 crc kubenswrapper[4886]: I0314 08:42:59.271865 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-gjn86" Mar 14 08:42:59 crc kubenswrapper[4886]: I0314 08:42:59.274014 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-594zt" Mar 14 08:42:59 crc kubenswrapper[4886]: I0314 08:42:59.275591 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 14 08:42:59 crc kubenswrapper[4886]: I0314 08:42:59.281029 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-gjn86"] Mar 14 08:42:59 crc kubenswrapper[4886]: I0314 08:42:59.285262 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 14 08:42:59 crc kubenswrapper[4886]: I0314 08:42:59.419899 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slg9l\" (UniqueName: \"kubernetes.io/projected/d45c2fd4-e4cb-4e0e-ab34-4593293d6829-kube-api-access-slg9l\") pod \"nmstate-operator-796d4cfff4-gjn86\" (UID: \"d45c2fd4-e4cb-4e0e-ab34-4593293d6829\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-gjn86" Mar 14 08:42:59 crc kubenswrapper[4886]: I0314 08:42:59.521825 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slg9l\" (UniqueName: \"kubernetes.io/projected/d45c2fd4-e4cb-4e0e-ab34-4593293d6829-kube-api-access-slg9l\") pod \"nmstate-operator-796d4cfff4-gjn86\" (UID: \"d45c2fd4-e4cb-4e0e-ab34-4593293d6829\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-gjn86" Mar 14 08:42:59 crc kubenswrapper[4886]: I0314 08:42:59.542888 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slg9l\" (UniqueName: \"kubernetes.io/projected/d45c2fd4-e4cb-4e0e-ab34-4593293d6829-kube-api-access-slg9l\") pod \"nmstate-operator-796d4cfff4-gjn86\" (UID: 
\"d45c2fd4-e4cb-4e0e-ab34-4593293d6829\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-gjn86" Mar 14 08:42:59 crc kubenswrapper[4886]: I0314 08:42:59.589405 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-gjn86" Mar 14 08:43:00 crc kubenswrapper[4886]: I0314 08:43:00.032680 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-gjn86"] Mar 14 08:43:00 crc kubenswrapper[4886]: W0314 08:43:00.043637 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd45c2fd4_e4cb_4e0e_ab34_4593293d6829.slice/crio-94f662f88777f52816526069f741b4cabf5762af9d16489035d4fb8f7f96f409 WatchSource:0}: Error finding container 94f662f88777f52816526069f741b4cabf5762af9d16489035d4fb8f7f96f409: Status 404 returned error can't find the container with id 94f662f88777f52816526069f741b4cabf5762af9d16489035d4fb8f7f96f409 Mar 14 08:43:00 crc kubenswrapper[4886]: I0314 08:43:00.767779 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-gjn86" event={"ID":"d45c2fd4-e4cb-4e0e-ab34-4593293d6829","Type":"ContainerStarted","Data":"94f662f88777f52816526069f741b4cabf5762af9d16489035d4fb8f7f96f409"} Mar 14 08:43:02 crc kubenswrapper[4886]: I0314 08:43:02.779708 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-gjn86" event={"ID":"d45c2fd4-e4cb-4e0e-ab34-4593293d6829","Type":"ContainerStarted","Data":"bd9d09a688937e6fddd967b2aee53ca61d929518aefdf73e84f7709435efbd6b"} Mar 14 08:43:02 crc kubenswrapper[4886]: I0314 08:43:02.800279 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-gjn86" podStartSLOduration=1.543951487 podStartE2EDuration="3.800240527s" podCreationTimestamp="2026-03-14 08:42:59 +0000 UTC" 
firstStartedPulling="2026-03-14 08:43:00.045539882 +0000 UTC m=+915.293991519" lastFinishedPulling="2026-03-14 08:43:02.301828922 +0000 UTC m=+917.550280559" observedRunningTime="2026-03-14 08:43:02.795668065 +0000 UTC m=+918.044119722" watchObservedRunningTime="2026-03-14 08:43:02.800240527 +0000 UTC m=+918.048692204" Mar 14 08:43:04 crc kubenswrapper[4886]: I0314 08:43:04.324047 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2xvzm"] Mar 14 08:43:04 crc kubenswrapper[4886]: I0314 08:43:04.325236 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2xvzm" Mar 14 08:43:04 crc kubenswrapper[4886]: I0314 08:43:04.344499 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2xvzm"] Mar 14 08:43:04 crc kubenswrapper[4886]: I0314 08:43:04.483635 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e22677e7-de29-404f-89ac-e8b6bb4ad633-catalog-content\") pod \"community-operators-2xvzm\" (UID: \"e22677e7-de29-404f-89ac-e8b6bb4ad633\") " pod="openshift-marketplace/community-operators-2xvzm" Mar 14 08:43:04 crc kubenswrapper[4886]: I0314 08:43:04.483714 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f6gd\" (UniqueName: \"kubernetes.io/projected/e22677e7-de29-404f-89ac-e8b6bb4ad633-kube-api-access-5f6gd\") pod \"community-operators-2xvzm\" (UID: \"e22677e7-de29-404f-89ac-e8b6bb4ad633\") " pod="openshift-marketplace/community-operators-2xvzm" Mar 14 08:43:04 crc kubenswrapper[4886]: I0314 08:43:04.483784 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e22677e7-de29-404f-89ac-e8b6bb4ad633-utilities\") pod 
\"community-operators-2xvzm\" (UID: \"e22677e7-de29-404f-89ac-e8b6bb4ad633\") " pod="openshift-marketplace/community-operators-2xvzm" Mar 14 08:43:04 crc kubenswrapper[4886]: I0314 08:43:04.584701 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e22677e7-de29-404f-89ac-e8b6bb4ad633-catalog-content\") pod \"community-operators-2xvzm\" (UID: \"e22677e7-de29-404f-89ac-e8b6bb4ad633\") " pod="openshift-marketplace/community-operators-2xvzm" Mar 14 08:43:04 crc kubenswrapper[4886]: I0314 08:43:04.584787 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f6gd\" (UniqueName: \"kubernetes.io/projected/e22677e7-de29-404f-89ac-e8b6bb4ad633-kube-api-access-5f6gd\") pod \"community-operators-2xvzm\" (UID: \"e22677e7-de29-404f-89ac-e8b6bb4ad633\") " pod="openshift-marketplace/community-operators-2xvzm" Mar 14 08:43:04 crc kubenswrapper[4886]: I0314 08:43:04.584817 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e22677e7-de29-404f-89ac-e8b6bb4ad633-utilities\") pod \"community-operators-2xvzm\" (UID: \"e22677e7-de29-404f-89ac-e8b6bb4ad633\") " pod="openshift-marketplace/community-operators-2xvzm" Mar 14 08:43:04 crc kubenswrapper[4886]: I0314 08:43:04.585256 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e22677e7-de29-404f-89ac-e8b6bb4ad633-catalog-content\") pod \"community-operators-2xvzm\" (UID: \"e22677e7-de29-404f-89ac-e8b6bb4ad633\") " pod="openshift-marketplace/community-operators-2xvzm" Mar 14 08:43:04 crc kubenswrapper[4886]: I0314 08:43:04.585336 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e22677e7-de29-404f-89ac-e8b6bb4ad633-utilities\") pod \"community-operators-2xvzm\" (UID: 
\"e22677e7-de29-404f-89ac-e8b6bb4ad633\") " pod="openshift-marketplace/community-operators-2xvzm" Mar 14 08:43:04 crc kubenswrapper[4886]: I0314 08:43:04.614941 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f6gd\" (UniqueName: \"kubernetes.io/projected/e22677e7-de29-404f-89ac-e8b6bb4ad633-kube-api-access-5f6gd\") pod \"community-operators-2xvzm\" (UID: \"e22677e7-de29-404f-89ac-e8b6bb4ad633\") " pod="openshift-marketplace/community-operators-2xvzm" Mar 14 08:43:04 crc kubenswrapper[4886]: I0314 08:43:04.640662 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2xvzm" Mar 14 08:43:04 crc kubenswrapper[4886]: I0314 08:43:04.951980 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2xvzm"] Mar 14 08:43:05 crc kubenswrapper[4886]: I0314 08:43:05.804541 4886 generic.go:334] "Generic (PLEG): container finished" podID="e22677e7-de29-404f-89ac-e8b6bb4ad633" containerID="7a5082987be1e6bfe64530cc74eb88f6df62e90f8c915164fc08aa4512d37296" exitCode=0 Mar 14 08:43:05 crc kubenswrapper[4886]: I0314 08:43:05.804590 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xvzm" event={"ID":"e22677e7-de29-404f-89ac-e8b6bb4ad633","Type":"ContainerDied","Data":"7a5082987be1e6bfe64530cc74eb88f6df62e90f8c915164fc08aa4512d37296"} Mar 14 08:43:05 crc kubenswrapper[4886]: I0314 08:43:05.804618 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xvzm" event={"ID":"e22677e7-de29-404f-89ac-e8b6bb4ad633","Type":"ContainerStarted","Data":"518411a9a6332496735e80c481bc582496dd5c035b74224a929aee398e08b440"} Mar 14 08:43:05 crc kubenswrapper[4886]: I0314 08:43:05.807064 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 08:43:06 crc kubenswrapper[4886]: I0314 08:43:06.811650 4886 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xvzm" event={"ID":"e22677e7-de29-404f-89ac-e8b6bb4ad633","Type":"ContainerStarted","Data":"c722cb2d205de7451795d64bf241bda2bb2d161149f9844ed40c31008e58e14b"} Mar 14 08:43:07 crc kubenswrapper[4886]: I0314 08:43:07.819883 4886 generic.go:334] "Generic (PLEG): container finished" podID="e22677e7-de29-404f-89ac-e8b6bb4ad633" containerID="c722cb2d205de7451795d64bf241bda2bb2d161149f9844ed40c31008e58e14b" exitCode=0 Mar 14 08:43:07 crc kubenswrapper[4886]: I0314 08:43:07.822389 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xvzm" event={"ID":"e22677e7-de29-404f-89ac-e8b6bb4ad633","Type":"ContainerDied","Data":"c722cb2d205de7451795d64bf241bda2bb2d161149f9844ed40c31008e58e14b"} Mar 14 08:43:08 crc kubenswrapper[4886]: I0314 08:43:08.814051 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-78jgt"] Mar 14 08:43:08 crc kubenswrapper[4886]: I0314 08:43:08.814948 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-78jgt" Mar 14 08:43:08 crc kubenswrapper[4886]: I0314 08:43:08.818454 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-pn5q5" Mar 14 08:43:08 crc kubenswrapper[4886]: I0314 08:43:08.824460 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-78jgt"] Mar 14 08:43:08 crc kubenswrapper[4886]: I0314 08:43:08.828895 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xvzm" event={"ID":"e22677e7-de29-404f-89ac-e8b6bb4ad633","Type":"ContainerStarted","Data":"6260f82ef5dc9ceb2fb83775817081652b05583ad275ab2864a6c9cfa2913806"} Mar 14 08:43:08 crc kubenswrapper[4886]: I0314 08:43:08.841780 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-zxg99"] Mar 14 08:43:08 crc kubenswrapper[4886]: I0314 08:43:08.842798 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-zxg99" Mar 14 08:43:08 crc kubenswrapper[4886]: I0314 08:43:08.847159 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-hjgxb"] Mar 14 08:43:08 crc kubenswrapper[4886]: I0314 08:43:08.847893 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-hjgxb" Mar 14 08:43:08 crc kubenswrapper[4886]: I0314 08:43:08.849454 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 14 08:43:08 crc kubenswrapper[4886]: I0314 08:43:08.863035 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-hjgxb"] Mar 14 08:43:08 crc kubenswrapper[4886]: I0314 08:43:08.876686 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2xvzm" podStartSLOduration=2.454644461 podStartE2EDuration="4.876669979s" podCreationTimestamp="2026-03-14 08:43:04 +0000 UTC" firstStartedPulling="2026-03-14 08:43:05.806863443 +0000 UTC m=+921.055315080" lastFinishedPulling="2026-03-14 08:43:08.228888961 +0000 UTC m=+923.477340598" observedRunningTime="2026-03-14 08:43:08.87497286 +0000 UTC m=+924.123424497" watchObservedRunningTime="2026-03-14 08:43:08.876669979 +0000 UTC m=+924.125121616" Mar 14 08:43:08 crc kubenswrapper[4886]: I0314 08:43:08.946650 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ac12ecfd-89d4-41da-a48d-b4b8758afe14-ovs-socket\") pod \"nmstate-handler-zxg99\" (UID: \"ac12ecfd-89d4-41da-a48d-b4b8758afe14\") " pod="openshift-nmstate/nmstate-handler-zxg99" Mar 14 08:43:08 crc kubenswrapper[4886]: I0314 08:43:08.946940 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ac12ecfd-89d4-41da-a48d-b4b8758afe14-dbus-socket\") pod \"nmstate-handler-zxg99\" (UID: \"ac12ecfd-89d4-41da-a48d-b4b8758afe14\") " pod="openshift-nmstate/nmstate-handler-zxg99" Mar 14 08:43:08 crc kubenswrapper[4886]: I0314 08:43:08.947088 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ac12ecfd-89d4-41da-a48d-b4b8758afe14-nmstate-lock\") pod \"nmstate-handler-zxg99\" (UID: \"ac12ecfd-89d4-41da-a48d-b4b8758afe14\") " pod="openshift-nmstate/nmstate-handler-zxg99" Mar 14 08:43:08 crc kubenswrapper[4886]: I0314 08:43:08.947325 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f52p4\" (UniqueName: \"kubernetes.io/projected/6ee1a88c-f687-4369-ac5a-271fccaa1374-kube-api-access-f52p4\") pod \"nmstate-metrics-9b8c8685d-78jgt\" (UID: \"6ee1a88c-f687-4369-ac5a-271fccaa1374\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-78jgt" Mar 14 08:43:08 crc kubenswrapper[4886]: I0314 08:43:08.947438 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kcwj\" (UniqueName: \"kubernetes.io/projected/ac12ecfd-89d4-41da-a48d-b4b8758afe14-kube-api-access-7kcwj\") pod \"nmstate-handler-zxg99\" (UID: \"ac12ecfd-89d4-41da-a48d-b4b8758afe14\") " pod="openshift-nmstate/nmstate-handler-zxg99" Mar 14 08:43:08 crc kubenswrapper[4886]: I0314 08:43:08.963866 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-stbth"] Mar 14 08:43:08 crc kubenswrapper[4886]: I0314 08:43:08.964603 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-stbth" Mar 14 08:43:08 crc kubenswrapper[4886]: I0314 08:43:08.966143 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-qfvs4" Mar 14 08:43:08 crc kubenswrapper[4886]: I0314 08:43:08.966379 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 14 08:43:08 crc kubenswrapper[4886]: I0314 08:43:08.967057 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 14 08:43:08 crc kubenswrapper[4886]: I0314 08:43:08.971616 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-stbth"] Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.048226 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c42d4bb8-a937-4aaa-a074-378bc2f47190-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-stbth\" (UID: \"c42d4bb8-a937-4aaa-a074-378bc2f47190\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-stbth" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.048428 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8f191943-ca71-4273-8a6d-153c6871ab56-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-hjgxb\" (UID: \"8f191943-ca71-4273-8a6d-153c6871ab56\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-hjgxb" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.048556 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f52p4\" (UniqueName: \"kubernetes.io/projected/6ee1a88c-f687-4369-ac5a-271fccaa1374-kube-api-access-f52p4\") pod \"nmstate-metrics-9b8c8685d-78jgt\" (UID: 
\"6ee1a88c-f687-4369-ac5a-271fccaa1374\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-78jgt" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.048609 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kcwj\" (UniqueName: \"kubernetes.io/projected/ac12ecfd-89d4-41da-a48d-b4b8758afe14-kube-api-access-7kcwj\") pod \"nmstate-handler-zxg99\" (UID: \"ac12ecfd-89d4-41da-a48d-b4b8758afe14\") " pod="openshift-nmstate/nmstate-handler-zxg99" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.048657 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sffk2\" (UniqueName: \"kubernetes.io/projected/c42d4bb8-a937-4aaa-a074-378bc2f47190-kube-api-access-sffk2\") pod \"nmstate-console-plugin-86f58fcf4-stbth\" (UID: \"c42d4bb8-a937-4aaa-a074-378bc2f47190\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-stbth" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.048702 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzrtg\" (UniqueName: \"kubernetes.io/projected/8f191943-ca71-4273-8a6d-153c6871ab56-kube-api-access-jzrtg\") pod \"nmstate-webhook-5f558f5558-hjgxb\" (UID: \"8f191943-ca71-4273-8a6d-153c6871ab56\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-hjgxb" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.048748 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c42d4bb8-a937-4aaa-a074-378bc2f47190-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-stbth\" (UID: \"c42d4bb8-a937-4aaa-a074-378bc2f47190\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-stbth" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.048787 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: 
\"kubernetes.io/host-path/ac12ecfd-89d4-41da-a48d-b4b8758afe14-ovs-socket\") pod \"nmstate-handler-zxg99\" (UID: \"ac12ecfd-89d4-41da-a48d-b4b8758afe14\") " pod="openshift-nmstate/nmstate-handler-zxg99" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.048937 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ac12ecfd-89d4-41da-a48d-b4b8758afe14-ovs-socket\") pod \"nmstate-handler-zxg99\" (UID: \"ac12ecfd-89d4-41da-a48d-b4b8758afe14\") " pod="openshift-nmstate/nmstate-handler-zxg99" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.048991 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ac12ecfd-89d4-41da-a48d-b4b8758afe14-dbus-socket\") pod \"nmstate-handler-zxg99\" (UID: \"ac12ecfd-89d4-41da-a48d-b4b8758afe14\") " pod="openshift-nmstate/nmstate-handler-zxg99" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.049331 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ac12ecfd-89d4-41da-a48d-b4b8758afe14-dbus-socket\") pod \"nmstate-handler-zxg99\" (UID: \"ac12ecfd-89d4-41da-a48d-b4b8758afe14\") " pod="openshift-nmstate/nmstate-handler-zxg99" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.049408 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ac12ecfd-89d4-41da-a48d-b4b8758afe14-nmstate-lock\") pod \"nmstate-handler-zxg99\" (UID: \"ac12ecfd-89d4-41da-a48d-b4b8758afe14\") " pod="openshift-nmstate/nmstate-handler-zxg99" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.049463 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ac12ecfd-89d4-41da-a48d-b4b8758afe14-nmstate-lock\") pod \"nmstate-handler-zxg99\" (UID: 
\"ac12ecfd-89d4-41da-a48d-b4b8758afe14\") " pod="openshift-nmstate/nmstate-handler-zxg99" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.070410 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f52p4\" (UniqueName: \"kubernetes.io/projected/6ee1a88c-f687-4369-ac5a-271fccaa1374-kube-api-access-f52p4\") pod \"nmstate-metrics-9b8c8685d-78jgt\" (UID: \"6ee1a88c-f687-4369-ac5a-271fccaa1374\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-78jgt" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.074715 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kcwj\" (UniqueName: \"kubernetes.io/projected/ac12ecfd-89d4-41da-a48d-b4b8758afe14-kube-api-access-7kcwj\") pod \"nmstate-handler-zxg99\" (UID: \"ac12ecfd-89d4-41da-a48d-b4b8758afe14\") " pod="openshift-nmstate/nmstate-handler-zxg99" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.137740 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-78jgt" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.150584 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c42d4bb8-a937-4aaa-a074-378bc2f47190-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-stbth\" (UID: \"c42d4bb8-a937-4aaa-a074-378bc2f47190\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-stbth" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.150622 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8f191943-ca71-4273-8a6d-153c6871ab56-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-hjgxb\" (UID: \"8f191943-ca71-4273-8a6d-153c6871ab56\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-hjgxb" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.150651 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sffk2\" (UniqueName: \"kubernetes.io/projected/c42d4bb8-a937-4aaa-a074-378bc2f47190-kube-api-access-sffk2\") pod \"nmstate-console-plugin-86f58fcf4-stbth\" (UID: \"c42d4bb8-a937-4aaa-a074-378bc2f47190\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-stbth" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.150672 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzrtg\" (UniqueName: \"kubernetes.io/projected/8f191943-ca71-4273-8a6d-153c6871ab56-kube-api-access-jzrtg\") pod \"nmstate-webhook-5f558f5558-hjgxb\" (UID: \"8f191943-ca71-4273-8a6d-153c6871ab56\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-hjgxb" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.150694 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/c42d4bb8-a937-4aaa-a074-378bc2f47190-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-stbth\" (UID: \"c42d4bb8-a937-4aaa-a074-378bc2f47190\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-stbth" Mar 14 08:43:09 crc kubenswrapper[4886]: E0314 08:43:09.150774 4886 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 14 08:43:09 crc kubenswrapper[4886]: E0314 08:43:09.150853 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c42d4bb8-a937-4aaa-a074-378bc2f47190-plugin-serving-cert podName:c42d4bb8-a937-4aaa-a074-378bc2f47190 nodeName:}" failed. No retries permitted until 2026-03-14 08:43:09.650833293 +0000 UTC m=+924.899284930 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/c42d4bb8-a937-4aaa-a074-378bc2f47190-plugin-serving-cert") pod "nmstate-console-plugin-86f58fcf4-stbth" (UID: "c42d4bb8-a937-4aaa-a074-378bc2f47190") : secret "plugin-serving-cert" not found Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.151493 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c42d4bb8-a937-4aaa-a074-378bc2f47190-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-stbth\" (UID: \"c42d4bb8-a937-4aaa-a074-378bc2f47190\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-stbth" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.155929 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8f191943-ca71-4273-8a6d-153c6871ab56-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-hjgxb\" (UID: \"8f191943-ca71-4273-8a6d-153c6871ab56\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-hjgxb" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.160215 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-zxg99" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.165270 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-74c4dcc74d-cltfm"] Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.175691 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74c4dcc74d-cltfm" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.190990 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzrtg\" (UniqueName: \"kubernetes.io/projected/8f191943-ca71-4273-8a6d-153c6871ab56-kube-api-access-jzrtg\") pod \"nmstate-webhook-5f558f5558-hjgxb\" (UID: \"8f191943-ca71-4273-8a6d-153c6871ab56\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-hjgxb" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.193366 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74c4dcc74d-cltfm"] Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.198250 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sffk2\" (UniqueName: \"kubernetes.io/projected/c42d4bb8-a937-4aaa-a074-378bc2f47190-kube-api-access-sffk2\") pod \"nmstate-console-plugin-86f58fcf4-stbth\" (UID: \"c42d4bb8-a937-4aaa-a074-378bc2f47190\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-stbth" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.255943 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dcf46ad7-c706-402f-af06-98ea6b6a1082-console-oauth-config\") pod \"console-74c4dcc74d-cltfm\" (UID: \"dcf46ad7-c706-402f-af06-98ea6b6a1082\") " pod="openshift-console/console-74c4dcc74d-cltfm" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.256547 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dcf46ad7-c706-402f-af06-98ea6b6a1082-oauth-serving-cert\") pod \"console-74c4dcc74d-cltfm\" (UID: \"dcf46ad7-c706-402f-af06-98ea6b6a1082\") " pod="openshift-console/console-74c4dcc74d-cltfm" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.257091 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dcf46ad7-c706-402f-af06-98ea6b6a1082-console-serving-cert\") pod \"console-74c4dcc74d-cltfm\" (UID: \"dcf46ad7-c706-402f-af06-98ea6b6a1082\") " pod="openshift-console/console-74c4dcc74d-cltfm" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.257138 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dcf46ad7-c706-402f-af06-98ea6b6a1082-console-config\") pod \"console-74c4dcc74d-cltfm\" (UID: \"dcf46ad7-c706-402f-af06-98ea6b6a1082\") " pod="openshift-console/console-74c4dcc74d-cltfm" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.257213 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dcf46ad7-c706-402f-af06-98ea6b6a1082-service-ca\") pod \"console-74c4dcc74d-cltfm\" (UID: \"dcf46ad7-c706-402f-af06-98ea6b6a1082\") " pod="openshift-console/console-74c4dcc74d-cltfm" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.257257 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcf46ad7-c706-402f-af06-98ea6b6a1082-trusted-ca-bundle\") pod \"console-74c4dcc74d-cltfm\" (UID: \"dcf46ad7-c706-402f-af06-98ea6b6a1082\") " pod="openshift-console/console-74c4dcc74d-cltfm" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.257277 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5n5f\" (UniqueName: \"kubernetes.io/projected/dcf46ad7-c706-402f-af06-98ea6b6a1082-kube-api-access-w5n5f\") pod \"console-74c4dcc74d-cltfm\" (UID: \"dcf46ad7-c706-402f-af06-98ea6b6a1082\") " pod="openshift-console/console-74c4dcc74d-cltfm" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.357974 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dcf46ad7-c706-402f-af06-98ea6b6a1082-service-ca\") pod \"console-74c4dcc74d-cltfm\" (UID: \"dcf46ad7-c706-402f-af06-98ea6b6a1082\") " pod="openshift-console/console-74c4dcc74d-cltfm" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.358031 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcf46ad7-c706-402f-af06-98ea6b6a1082-trusted-ca-bundle\") pod \"console-74c4dcc74d-cltfm\" (UID: \"dcf46ad7-c706-402f-af06-98ea6b6a1082\") " pod="openshift-console/console-74c4dcc74d-cltfm" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.358053 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5n5f\" (UniqueName: \"kubernetes.io/projected/dcf46ad7-c706-402f-af06-98ea6b6a1082-kube-api-access-w5n5f\") pod \"console-74c4dcc74d-cltfm\" (UID: \"dcf46ad7-c706-402f-af06-98ea6b6a1082\") " pod="openshift-console/console-74c4dcc74d-cltfm" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.358075 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dcf46ad7-c706-402f-af06-98ea6b6a1082-console-oauth-config\") pod \"console-74c4dcc74d-cltfm\" (UID: \"dcf46ad7-c706-402f-af06-98ea6b6a1082\") " pod="openshift-console/console-74c4dcc74d-cltfm" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.358097 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dcf46ad7-c706-402f-af06-98ea6b6a1082-oauth-serving-cert\") pod \"console-74c4dcc74d-cltfm\" (UID: \"dcf46ad7-c706-402f-af06-98ea6b6a1082\") " pod="openshift-console/console-74c4dcc74d-cltfm" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.358130 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dcf46ad7-c706-402f-af06-98ea6b6a1082-console-serving-cert\") pod \"console-74c4dcc74d-cltfm\" (UID: \"dcf46ad7-c706-402f-af06-98ea6b6a1082\") " pod="openshift-console/console-74c4dcc74d-cltfm" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.358156 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dcf46ad7-c706-402f-af06-98ea6b6a1082-console-config\") pod \"console-74c4dcc74d-cltfm\" (UID: \"dcf46ad7-c706-402f-af06-98ea6b6a1082\") " pod="openshift-console/console-74c4dcc74d-cltfm" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.359050 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dcf46ad7-c706-402f-af06-98ea6b6a1082-console-config\") pod \"console-74c4dcc74d-cltfm\" (UID: \"dcf46ad7-c706-402f-af06-98ea6b6a1082\") " pod="openshift-console/console-74c4dcc74d-cltfm" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.359066 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dcf46ad7-c706-402f-af06-98ea6b6a1082-service-ca\") pod \"console-74c4dcc74d-cltfm\" (UID: \"dcf46ad7-c706-402f-af06-98ea6b6a1082\") " pod="openshift-console/console-74c4dcc74d-cltfm" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.359330 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcf46ad7-c706-402f-af06-98ea6b6a1082-trusted-ca-bundle\") pod \"console-74c4dcc74d-cltfm\" (UID: \"dcf46ad7-c706-402f-af06-98ea6b6a1082\") " pod="openshift-console/console-74c4dcc74d-cltfm" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.359425 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dcf46ad7-c706-402f-af06-98ea6b6a1082-oauth-serving-cert\") pod \"console-74c4dcc74d-cltfm\" (UID: \"dcf46ad7-c706-402f-af06-98ea6b6a1082\") " pod="openshift-console/console-74c4dcc74d-cltfm" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.363863 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dcf46ad7-c706-402f-af06-98ea6b6a1082-console-oauth-config\") pod \"console-74c4dcc74d-cltfm\" (UID: \"dcf46ad7-c706-402f-af06-98ea6b6a1082\") " pod="openshift-console/console-74c4dcc74d-cltfm" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.367610 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dcf46ad7-c706-402f-af06-98ea6b6a1082-console-serving-cert\") pod \"console-74c4dcc74d-cltfm\" (UID: \"dcf46ad7-c706-402f-af06-98ea6b6a1082\") " pod="openshift-console/console-74c4dcc74d-cltfm" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.378939 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5n5f\" (UniqueName: \"kubernetes.io/projected/dcf46ad7-c706-402f-af06-98ea6b6a1082-kube-api-access-w5n5f\") pod \"console-74c4dcc74d-cltfm\" (UID: \"dcf46ad7-c706-402f-af06-98ea6b6a1082\") " pod="openshift-console/console-74c4dcc74d-cltfm" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.469402 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-hjgxb" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.513183 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74c4dcc74d-cltfm" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.595709 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-78jgt"] Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.664639 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c42d4bb8-a937-4aaa-a074-378bc2f47190-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-stbth\" (UID: \"c42d4bb8-a937-4aaa-a074-378bc2f47190\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-stbth" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.667884 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c42d4bb8-a937-4aaa-a074-378bc2f47190-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-stbth\" (UID: \"c42d4bb8-a937-4aaa-a074-378bc2f47190\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-stbth" Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.674061 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-hjgxb"] Mar 14 08:43:09 crc kubenswrapper[4886]: W0314 08:43:09.683705 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f191943_ca71_4273_8a6d_153c6871ab56.slice/crio-b77e9665a8ca095e12f9adb7256f0cac6d09311ec038d785ce8e6c342928b746 WatchSource:0}: Error finding container b77e9665a8ca095e12f9adb7256f0cac6d09311ec038d785ce8e6c342928b746: Status 404 returned error can't find the container with id b77e9665a8ca095e12f9adb7256f0cac6d09311ec038d785ce8e6c342928b746 
Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.728833 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74c4dcc74d-cltfm"] Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.835865 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74c4dcc74d-cltfm" event={"ID":"dcf46ad7-c706-402f-af06-98ea6b6a1082","Type":"ContainerStarted","Data":"76399f012d99efaaea1ad04d640f25151095028dcf72ab3a0bce404c69645cc1"} Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.837437 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-zxg99" event={"ID":"ac12ecfd-89d4-41da-a48d-b4b8758afe14","Type":"ContainerStarted","Data":"c3a7576f21e09e8586b78b82911f1ffc1cfef1f4ee2fed8e38c0b22879810626"} Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.838743 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-hjgxb" event={"ID":"8f191943-ca71-4273-8a6d-153c6871ab56","Type":"ContainerStarted","Data":"b77e9665a8ca095e12f9adb7256f0cac6d09311ec038d785ce8e6c342928b746"} Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.839569 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-78jgt" event={"ID":"6ee1a88c-f687-4369-ac5a-271fccaa1374","Type":"ContainerStarted","Data":"ff26bc3250aa37f2e2cd0fc02c591d4523c58018f1a2ed4adf8b96fc5e0fefb9"} Mar 14 08:43:09 crc kubenswrapper[4886]: I0314 08:43:09.879782 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-stbth" Mar 14 08:43:10 crc kubenswrapper[4886]: I0314 08:43:10.222808 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-stbth"] Mar 14 08:43:10 crc kubenswrapper[4886]: W0314 08:43:10.226720 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc42d4bb8_a937_4aaa_a074_378bc2f47190.slice/crio-e96bc4d83421b0c55a75e95019f82344a63c717482a304aec919f3f45a81caf7 WatchSource:0}: Error finding container e96bc4d83421b0c55a75e95019f82344a63c717482a304aec919f3f45a81caf7: Status 404 returned error can't find the container with id e96bc4d83421b0c55a75e95019f82344a63c717482a304aec919f3f45a81caf7 Mar 14 08:43:10 crc kubenswrapper[4886]: I0314 08:43:10.847369 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-stbth" event={"ID":"c42d4bb8-a937-4aaa-a074-378bc2f47190","Type":"ContainerStarted","Data":"e96bc4d83421b0c55a75e95019f82344a63c717482a304aec919f3f45a81caf7"} Mar 14 08:43:10 crc kubenswrapper[4886]: I0314 08:43:10.848595 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74c4dcc74d-cltfm" event={"ID":"dcf46ad7-c706-402f-af06-98ea6b6a1082","Type":"ContainerStarted","Data":"189a84d517e9da30885b552bda059ba5521d8d826e3e8876755304562e6a5606"} Mar 14 08:43:10 crc kubenswrapper[4886]: I0314 08:43:10.871847 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-74c4dcc74d-cltfm" podStartSLOduration=1.8718174410000001 podStartE2EDuration="1.871817441s" podCreationTimestamp="2026-03-14 08:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:43:10.864610883 +0000 UTC m=+926.113062520" watchObservedRunningTime="2026-03-14 
08:43:10.871817441 +0000 UTC m=+926.120269118" Mar 14 08:43:13 crc kubenswrapper[4886]: I0314 08:43:13.869048 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-hjgxb" event={"ID":"8f191943-ca71-4273-8a6d-153c6871ab56","Type":"ContainerStarted","Data":"25cf579af7fd65d13cb0e392d259f21f56c45adb17cb40f6bd6f41d19e2a00fa"} Mar 14 08:43:13 crc kubenswrapper[4886]: I0314 08:43:13.869574 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-hjgxb" Mar 14 08:43:13 crc kubenswrapper[4886]: I0314 08:43:13.870936 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-stbth" event={"ID":"c42d4bb8-a937-4aaa-a074-378bc2f47190","Type":"ContainerStarted","Data":"9576de9357976fef2f7626a90330bd9b49102cbc01a076c40b323fffc98ad7d3"} Mar 14 08:43:13 crc kubenswrapper[4886]: I0314 08:43:13.873352 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-78jgt" event={"ID":"6ee1a88c-f687-4369-ac5a-271fccaa1374","Type":"ContainerStarted","Data":"92a84f7617666530e435062248f7b48a9a658f062f9c4c7b260a92bd17b6677a"} Mar 14 08:43:13 crc kubenswrapper[4886]: I0314 08:43:13.874618 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-zxg99" event={"ID":"ac12ecfd-89d4-41da-a48d-b4b8758afe14","Type":"ContainerStarted","Data":"6b07c6e16a2f56cd9e47324612a89bec5358c94e77dd8f538c9849b45d8f262d"} Mar 14 08:43:13 crc kubenswrapper[4886]: I0314 08:43:13.875045 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-zxg99" Mar 14 08:43:13 crc kubenswrapper[4886]: I0314 08:43:13.884884 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-hjgxb" podStartSLOduration=2.550905774 podStartE2EDuration="5.884866644s" 
podCreationTimestamp="2026-03-14 08:43:08 +0000 UTC" firstStartedPulling="2026-03-14 08:43:09.688400321 +0000 UTC m=+924.936851958" lastFinishedPulling="2026-03-14 08:43:13.022361191 +0000 UTC m=+928.270812828" observedRunningTime="2026-03-14 08:43:13.882048252 +0000 UTC m=+929.130499889" watchObservedRunningTime="2026-03-14 08:43:13.884866644 +0000 UTC m=+929.133318271" Mar 14 08:43:13 crc kubenswrapper[4886]: I0314 08:43:13.906830 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-zxg99" podStartSLOduration=2.110502798 podStartE2EDuration="5.906811779s" podCreationTimestamp="2026-03-14 08:43:08 +0000 UTC" firstStartedPulling="2026-03-14 08:43:09.215314019 +0000 UTC m=+924.463765656" lastFinishedPulling="2026-03-14 08:43:13.011623 +0000 UTC m=+928.260074637" observedRunningTime="2026-03-14 08:43:13.902914566 +0000 UTC m=+929.151366203" watchObservedRunningTime="2026-03-14 08:43:13.906811779 +0000 UTC m=+929.155263416" Mar 14 08:43:13 crc kubenswrapper[4886]: I0314 08:43:13.922276 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-stbth" podStartSLOduration=3.140805687 podStartE2EDuration="5.922255536s" podCreationTimestamp="2026-03-14 08:43:08 +0000 UTC" firstStartedPulling="2026-03-14 08:43:10.229156312 +0000 UTC m=+925.477607949" lastFinishedPulling="2026-03-14 08:43:13.010606121 +0000 UTC m=+928.259057798" observedRunningTime="2026-03-14 08:43:13.919038713 +0000 UTC m=+929.167490360" watchObservedRunningTime="2026-03-14 08:43:13.922255536 +0000 UTC m=+929.170707173" Mar 14 08:43:14 crc kubenswrapper[4886]: I0314 08:43:14.641822 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2xvzm" Mar 14 08:43:14 crc kubenswrapper[4886]: I0314 08:43:14.641910 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-2xvzm" Mar 14 08:43:14 crc kubenswrapper[4886]: I0314 08:43:14.686549 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2xvzm" Mar 14 08:43:14 crc kubenswrapper[4886]: I0314 08:43:14.914153 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2xvzm" Mar 14 08:43:15 crc kubenswrapper[4886]: I0314 08:43:15.888336 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-78jgt" event={"ID":"6ee1a88c-f687-4369-ac5a-271fccaa1374","Type":"ContainerStarted","Data":"fe161b81c1582c2e419c051a0acca21a30db3f5581ec9c796bbe86c4bb54edf9"} Mar 14 08:43:15 crc kubenswrapper[4886]: I0314 08:43:15.905831 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-78jgt" podStartSLOduration=1.9298550890000001 podStartE2EDuration="7.905810203s" podCreationTimestamp="2026-03-14 08:43:08 +0000 UTC" firstStartedPulling="2026-03-14 08:43:09.620058263 +0000 UTC m=+924.868509900" lastFinishedPulling="2026-03-14 08:43:15.596013377 +0000 UTC m=+930.844465014" observedRunningTime="2026-03-14 08:43:15.901007174 +0000 UTC m=+931.149458831" watchObservedRunningTime="2026-03-14 08:43:15.905810203 +0000 UTC m=+931.154261840" Mar 14 08:43:16 crc kubenswrapper[4886]: I0314 08:43:16.916005 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2xvzm"] Mar 14 08:43:16 crc kubenswrapper[4886]: I0314 08:43:16.916605 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2xvzm" podUID="e22677e7-de29-404f-89ac-e8b6bb4ad633" containerName="registry-server" containerID="cri-o://6260f82ef5dc9ceb2fb83775817081652b05583ad275ab2864a6c9cfa2913806" gracePeriod=2 Mar 14 08:43:17 crc kubenswrapper[4886]: I0314 
08:43:17.829790 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2xvzm" Mar 14 08:43:17 crc kubenswrapper[4886]: I0314 08:43:17.885529 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e22677e7-de29-404f-89ac-e8b6bb4ad633-utilities\") pod \"e22677e7-de29-404f-89ac-e8b6bb4ad633\" (UID: \"e22677e7-de29-404f-89ac-e8b6bb4ad633\") " Mar 14 08:43:17 crc kubenswrapper[4886]: I0314 08:43:17.885629 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e22677e7-de29-404f-89ac-e8b6bb4ad633-catalog-content\") pod \"e22677e7-de29-404f-89ac-e8b6bb4ad633\" (UID: \"e22677e7-de29-404f-89ac-e8b6bb4ad633\") " Mar 14 08:43:17 crc kubenswrapper[4886]: I0314 08:43:17.885696 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5f6gd\" (UniqueName: \"kubernetes.io/projected/e22677e7-de29-404f-89ac-e8b6bb4ad633-kube-api-access-5f6gd\") pod \"e22677e7-de29-404f-89ac-e8b6bb4ad633\" (UID: \"e22677e7-de29-404f-89ac-e8b6bb4ad633\") " Mar 14 08:43:17 crc kubenswrapper[4886]: I0314 08:43:17.886484 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e22677e7-de29-404f-89ac-e8b6bb4ad633-utilities" (OuterVolumeSpecName: "utilities") pod "e22677e7-de29-404f-89ac-e8b6bb4ad633" (UID: "e22677e7-de29-404f-89ac-e8b6bb4ad633"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:43:17 crc kubenswrapper[4886]: I0314 08:43:17.891644 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e22677e7-de29-404f-89ac-e8b6bb4ad633-kube-api-access-5f6gd" (OuterVolumeSpecName: "kube-api-access-5f6gd") pod "e22677e7-de29-404f-89ac-e8b6bb4ad633" (UID: "e22677e7-de29-404f-89ac-e8b6bb4ad633"). InnerVolumeSpecName "kube-api-access-5f6gd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:43:17 crc kubenswrapper[4886]: I0314 08:43:17.912130 4886 generic.go:334] "Generic (PLEG): container finished" podID="e22677e7-de29-404f-89ac-e8b6bb4ad633" containerID="6260f82ef5dc9ceb2fb83775817081652b05583ad275ab2864a6c9cfa2913806" exitCode=0 Mar 14 08:43:17 crc kubenswrapper[4886]: I0314 08:43:17.912174 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xvzm" event={"ID":"e22677e7-de29-404f-89ac-e8b6bb4ad633","Type":"ContainerDied","Data":"6260f82ef5dc9ceb2fb83775817081652b05583ad275ab2864a6c9cfa2913806"} Mar 14 08:43:17 crc kubenswrapper[4886]: I0314 08:43:17.912198 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2xvzm" Mar 14 08:43:17 crc kubenswrapper[4886]: I0314 08:43:17.912212 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xvzm" event={"ID":"e22677e7-de29-404f-89ac-e8b6bb4ad633","Type":"ContainerDied","Data":"518411a9a6332496735e80c481bc582496dd5c035b74224a929aee398e08b440"} Mar 14 08:43:17 crc kubenswrapper[4886]: I0314 08:43:17.912228 4886 scope.go:117] "RemoveContainer" containerID="6260f82ef5dc9ceb2fb83775817081652b05583ad275ab2864a6c9cfa2913806" Mar 14 08:43:17 crc kubenswrapper[4886]: I0314 08:43:17.930668 4886 scope.go:117] "RemoveContainer" containerID="c722cb2d205de7451795d64bf241bda2bb2d161149f9844ed40c31008e58e14b" Mar 14 08:43:17 crc kubenswrapper[4886]: I0314 08:43:17.945871 4886 scope.go:117] "RemoveContainer" containerID="7a5082987be1e6bfe64530cc74eb88f6df62e90f8c915164fc08aa4512d37296" Mar 14 08:43:17 crc kubenswrapper[4886]: I0314 08:43:17.964505 4886 scope.go:117] "RemoveContainer" containerID="6260f82ef5dc9ceb2fb83775817081652b05583ad275ab2864a6c9cfa2913806" Mar 14 08:43:17 crc kubenswrapper[4886]: E0314 08:43:17.964963 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6260f82ef5dc9ceb2fb83775817081652b05583ad275ab2864a6c9cfa2913806\": container with ID starting with 6260f82ef5dc9ceb2fb83775817081652b05583ad275ab2864a6c9cfa2913806 not found: ID does not exist" containerID="6260f82ef5dc9ceb2fb83775817081652b05583ad275ab2864a6c9cfa2913806" Mar 14 08:43:17 crc kubenswrapper[4886]: I0314 08:43:17.965011 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6260f82ef5dc9ceb2fb83775817081652b05583ad275ab2864a6c9cfa2913806"} err="failed to get container status \"6260f82ef5dc9ceb2fb83775817081652b05583ad275ab2864a6c9cfa2913806\": rpc error: code = NotFound desc = could not find container 
\"6260f82ef5dc9ceb2fb83775817081652b05583ad275ab2864a6c9cfa2913806\": container with ID starting with 6260f82ef5dc9ceb2fb83775817081652b05583ad275ab2864a6c9cfa2913806 not found: ID does not exist" Mar 14 08:43:17 crc kubenswrapper[4886]: I0314 08:43:17.965033 4886 scope.go:117] "RemoveContainer" containerID="c722cb2d205de7451795d64bf241bda2bb2d161149f9844ed40c31008e58e14b" Mar 14 08:43:17 crc kubenswrapper[4886]: E0314 08:43:17.965592 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c722cb2d205de7451795d64bf241bda2bb2d161149f9844ed40c31008e58e14b\": container with ID starting with c722cb2d205de7451795d64bf241bda2bb2d161149f9844ed40c31008e58e14b not found: ID does not exist" containerID="c722cb2d205de7451795d64bf241bda2bb2d161149f9844ed40c31008e58e14b" Mar 14 08:43:17 crc kubenswrapper[4886]: I0314 08:43:17.965643 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c722cb2d205de7451795d64bf241bda2bb2d161149f9844ed40c31008e58e14b"} err="failed to get container status \"c722cb2d205de7451795d64bf241bda2bb2d161149f9844ed40c31008e58e14b\": rpc error: code = NotFound desc = could not find container \"c722cb2d205de7451795d64bf241bda2bb2d161149f9844ed40c31008e58e14b\": container with ID starting with c722cb2d205de7451795d64bf241bda2bb2d161149f9844ed40c31008e58e14b not found: ID does not exist" Mar 14 08:43:17 crc kubenswrapper[4886]: I0314 08:43:17.965675 4886 scope.go:117] "RemoveContainer" containerID="7a5082987be1e6bfe64530cc74eb88f6df62e90f8c915164fc08aa4512d37296" Mar 14 08:43:17 crc kubenswrapper[4886]: E0314 08:43:17.965993 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a5082987be1e6bfe64530cc74eb88f6df62e90f8c915164fc08aa4512d37296\": container with ID starting with 7a5082987be1e6bfe64530cc74eb88f6df62e90f8c915164fc08aa4512d37296 not found: ID does not exist" 
containerID="7a5082987be1e6bfe64530cc74eb88f6df62e90f8c915164fc08aa4512d37296" Mar 14 08:43:17 crc kubenswrapper[4886]: I0314 08:43:17.966054 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a5082987be1e6bfe64530cc74eb88f6df62e90f8c915164fc08aa4512d37296"} err="failed to get container status \"7a5082987be1e6bfe64530cc74eb88f6df62e90f8c915164fc08aa4512d37296\": rpc error: code = NotFound desc = could not find container \"7a5082987be1e6bfe64530cc74eb88f6df62e90f8c915164fc08aa4512d37296\": container with ID starting with 7a5082987be1e6bfe64530cc74eb88f6df62e90f8c915164fc08aa4512d37296 not found: ID does not exist" Mar 14 08:43:17 crc kubenswrapper[4886]: I0314 08:43:17.986752 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5f6gd\" (UniqueName: \"kubernetes.io/projected/e22677e7-de29-404f-89ac-e8b6bb4ad633-kube-api-access-5f6gd\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:17 crc kubenswrapper[4886]: I0314 08:43:17.986776 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e22677e7-de29-404f-89ac-e8b6bb4ad633-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:18 crc kubenswrapper[4886]: I0314 08:43:18.177736 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e22677e7-de29-404f-89ac-e8b6bb4ad633-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e22677e7-de29-404f-89ac-e8b6bb4ad633" (UID: "e22677e7-de29-404f-89ac-e8b6bb4ad633"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:43:18 crc kubenswrapper[4886]: I0314 08:43:18.189774 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e22677e7-de29-404f-89ac-e8b6bb4ad633-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:18 crc kubenswrapper[4886]: I0314 08:43:18.245878 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2xvzm"] Mar 14 08:43:18 crc kubenswrapper[4886]: I0314 08:43:18.253583 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2xvzm"] Mar 14 08:43:19 crc kubenswrapper[4886]: I0314 08:43:19.182997 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-zxg99" Mar 14 08:43:19 crc kubenswrapper[4886]: I0314 08:43:19.427779 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e22677e7-de29-404f-89ac-e8b6bb4ad633" path="/var/lib/kubelet/pods/e22677e7-de29-404f-89ac-e8b6bb4ad633/volumes" Mar 14 08:43:19 crc kubenswrapper[4886]: I0314 08:43:19.513423 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-74c4dcc74d-cltfm" Mar 14 08:43:19 crc kubenswrapper[4886]: I0314 08:43:19.513468 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-74c4dcc74d-cltfm" Mar 14 08:43:19 crc kubenswrapper[4886]: I0314 08:43:19.518492 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-74c4dcc74d-cltfm" Mar 14 08:43:19 crc kubenswrapper[4886]: I0314 08:43:19.938980 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-74c4dcc74d-cltfm" Mar 14 08:43:19 crc kubenswrapper[4886]: I0314 08:43:19.997309 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-f9d7485db-wmcc2"] Mar 14 08:43:26 crc kubenswrapper[4886]: I0314 08:43:26.065658 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:43:26 crc kubenswrapper[4886]: I0314 08:43:26.066273 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:43:29 crc kubenswrapper[4886]: I0314 08:43:29.475813 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-hjgxb" Mar 14 08:43:31 crc kubenswrapper[4886]: I0314 08:43:31.325668 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2595s"] Mar 14 08:43:31 crc kubenswrapper[4886]: E0314 08:43:31.327597 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e22677e7-de29-404f-89ac-e8b6bb4ad633" containerName="extract-content" Mar 14 08:43:31 crc kubenswrapper[4886]: I0314 08:43:31.327613 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e22677e7-de29-404f-89ac-e8b6bb4ad633" containerName="extract-content" Mar 14 08:43:31 crc kubenswrapper[4886]: E0314 08:43:31.327637 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e22677e7-de29-404f-89ac-e8b6bb4ad633" containerName="registry-server" Mar 14 08:43:31 crc kubenswrapper[4886]: I0314 08:43:31.327644 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e22677e7-de29-404f-89ac-e8b6bb4ad633" containerName="registry-server" Mar 14 08:43:31 crc kubenswrapper[4886]: 
E0314 08:43:31.327659 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e22677e7-de29-404f-89ac-e8b6bb4ad633" containerName="extract-utilities" Mar 14 08:43:31 crc kubenswrapper[4886]: I0314 08:43:31.327665 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e22677e7-de29-404f-89ac-e8b6bb4ad633" containerName="extract-utilities" Mar 14 08:43:31 crc kubenswrapper[4886]: I0314 08:43:31.327767 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="e22677e7-de29-404f-89ac-e8b6bb4ad633" containerName="registry-server" Mar 14 08:43:31 crc kubenswrapper[4886]: I0314 08:43:31.328700 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2595s" Mar 14 08:43:31 crc kubenswrapper[4886]: I0314 08:43:31.340711 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2595s"] Mar 14 08:43:31 crc kubenswrapper[4886]: I0314 08:43:31.477831 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/558100ce-8904-4a04-bc8e-f8c906261876-utilities\") pod \"certified-operators-2595s\" (UID: \"558100ce-8904-4a04-bc8e-f8c906261876\") " pod="openshift-marketplace/certified-operators-2595s" Mar 14 08:43:31 crc kubenswrapper[4886]: I0314 08:43:31.477883 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/558100ce-8904-4a04-bc8e-f8c906261876-catalog-content\") pod \"certified-operators-2595s\" (UID: \"558100ce-8904-4a04-bc8e-f8c906261876\") " pod="openshift-marketplace/certified-operators-2595s" Mar 14 08:43:31 crc kubenswrapper[4886]: I0314 08:43:31.477945 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-569w4\" (UniqueName: 
\"kubernetes.io/projected/558100ce-8904-4a04-bc8e-f8c906261876-kube-api-access-569w4\") pod \"certified-operators-2595s\" (UID: \"558100ce-8904-4a04-bc8e-f8c906261876\") " pod="openshift-marketplace/certified-operators-2595s" Mar 14 08:43:31 crc kubenswrapper[4886]: I0314 08:43:31.579436 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-569w4\" (UniqueName: \"kubernetes.io/projected/558100ce-8904-4a04-bc8e-f8c906261876-kube-api-access-569w4\") pod \"certified-operators-2595s\" (UID: \"558100ce-8904-4a04-bc8e-f8c906261876\") " pod="openshift-marketplace/certified-operators-2595s" Mar 14 08:43:31 crc kubenswrapper[4886]: I0314 08:43:31.580392 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/558100ce-8904-4a04-bc8e-f8c906261876-utilities\") pod \"certified-operators-2595s\" (UID: \"558100ce-8904-4a04-bc8e-f8c906261876\") " pod="openshift-marketplace/certified-operators-2595s" Mar 14 08:43:31 crc kubenswrapper[4886]: I0314 08:43:31.580525 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/558100ce-8904-4a04-bc8e-f8c906261876-catalog-content\") pod \"certified-operators-2595s\" (UID: \"558100ce-8904-4a04-bc8e-f8c906261876\") " pod="openshift-marketplace/certified-operators-2595s" Mar 14 08:43:31 crc kubenswrapper[4886]: I0314 08:43:31.581144 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/558100ce-8904-4a04-bc8e-f8c906261876-utilities\") pod \"certified-operators-2595s\" (UID: \"558100ce-8904-4a04-bc8e-f8c906261876\") " pod="openshift-marketplace/certified-operators-2595s" Mar 14 08:43:31 crc kubenswrapper[4886]: I0314 08:43:31.581393 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/558100ce-8904-4a04-bc8e-f8c906261876-catalog-content\") pod \"certified-operators-2595s\" (UID: \"558100ce-8904-4a04-bc8e-f8c906261876\") " pod="openshift-marketplace/certified-operators-2595s" Mar 14 08:43:31 crc kubenswrapper[4886]: I0314 08:43:31.619077 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-569w4\" (UniqueName: \"kubernetes.io/projected/558100ce-8904-4a04-bc8e-f8c906261876-kube-api-access-569w4\") pod \"certified-operators-2595s\" (UID: \"558100ce-8904-4a04-bc8e-f8c906261876\") " pod="openshift-marketplace/certified-operators-2595s" Mar 14 08:43:31 crc kubenswrapper[4886]: I0314 08:43:31.655357 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2595s" Mar 14 08:43:32 crc kubenswrapper[4886]: I0314 08:43:32.157052 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2595s"] Mar 14 08:43:33 crc kubenswrapper[4886]: I0314 08:43:33.027610 4886 generic.go:334] "Generic (PLEG): container finished" podID="558100ce-8904-4a04-bc8e-f8c906261876" containerID="09e392e215a61516d6fdfb5ca3a8581840c80b19ffe338b7379666c25b89b643" exitCode=0 Mar 14 08:43:33 crc kubenswrapper[4886]: I0314 08:43:33.027699 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2595s" event={"ID":"558100ce-8904-4a04-bc8e-f8c906261876","Type":"ContainerDied","Data":"09e392e215a61516d6fdfb5ca3a8581840c80b19ffe338b7379666c25b89b643"} Mar 14 08:43:33 crc kubenswrapper[4886]: I0314 08:43:33.027888 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2595s" event={"ID":"558100ce-8904-4a04-bc8e-f8c906261876","Type":"ContainerStarted","Data":"5ed06821adc5f9e82a2e3f394eaa536284683c56bfefa804ba588491a911839a"} Mar 14 08:43:34 crc kubenswrapper[4886]: I0314 08:43:34.034648 4886 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-2595s" event={"ID":"558100ce-8904-4a04-bc8e-f8c906261876","Type":"ContainerStarted","Data":"35822b51e27f31b87f4d2f0cbf873c672a9b38167bcc734531044494d9bf2a4c"} Mar 14 08:43:35 crc kubenswrapper[4886]: I0314 08:43:35.051375 4886 generic.go:334] "Generic (PLEG): container finished" podID="558100ce-8904-4a04-bc8e-f8c906261876" containerID="35822b51e27f31b87f4d2f0cbf873c672a9b38167bcc734531044494d9bf2a4c" exitCode=0 Mar 14 08:43:35 crc kubenswrapper[4886]: I0314 08:43:35.051438 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2595s" event={"ID":"558100ce-8904-4a04-bc8e-f8c906261876","Type":"ContainerDied","Data":"35822b51e27f31b87f4d2f0cbf873c672a9b38167bcc734531044494d9bf2a4c"} Mar 14 08:43:37 crc kubenswrapper[4886]: I0314 08:43:37.379648 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2595s" event={"ID":"558100ce-8904-4a04-bc8e-f8c906261876","Type":"ContainerStarted","Data":"5d7484a56c7fb81ed09cd1faefdf0561efc44b3e35b9df2d4f7b0a04e1d5cd21"} Mar 14 08:43:37 crc kubenswrapper[4886]: I0314 08:43:37.402797 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2595s" podStartSLOduration=2.504862458 podStartE2EDuration="6.402773279s" podCreationTimestamp="2026-03-14 08:43:31 +0000 UTC" firstStartedPulling="2026-03-14 08:43:33.029582052 +0000 UTC m=+948.278033729" lastFinishedPulling="2026-03-14 08:43:36.927492913 +0000 UTC m=+952.175944550" observedRunningTime="2026-03-14 08:43:37.397022712 +0000 UTC m=+952.645474349" watchObservedRunningTime="2026-03-14 08:43:37.402773279 +0000 UTC m=+952.651224926" Mar 14 08:43:41 crc kubenswrapper[4886]: I0314 08:43:41.656146 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2595s" Mar 14 08:43:41 crc kubenswrapper[4886]: I0314 08:43:41.656640 
4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2595s" Mar 14 08:43:41 crc kubenswrapper[4886]: I0314 08:43:41.704433 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2595s" Mar 14 08:43:42 crc kubenswrapper[4886]: I0314 08:43:42.454743 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2595s" Mar 14 08:43:42 crc kubenswrapper[4886]: I0314 08:43:42.497046 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2595s"] Mar 14 08:43:44 crc kubenswrapper[4886]: I0314 08:43:44.372384 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl"] Mar 14 08:43:44 crc kubenswrapper[4886]: I0314 08:43:44.374710 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl" Mar 14 08:43:44 crc kubenswrapper[4886]: I0314 08:43:44.376759 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 14 08:43:44 crc kubenswrapper[4886]: I0314 08:43:44.382193 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl"] Mar 14 08:43:44 crc kubenswrapper[4886]: I0314 08:43:44.427849 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2595s" podUID="558100ce-8904-4a04-bc8e-f8c906261876" containerName="registry-server" containerID="cri-o://5d7484a56c7fb81ed09cd1faefdf0561efc44b3e35b9df2d4f7b0a04e1d5cd21" gracePeriod=2 Mar 14 08:43:44 crc kubenswrapper[4886]: I0314 08:43:44.522209 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2346929a-43ff-4471-a6cd-ff439f1e69f0-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl\" (UID: \"2346929a-43ff-4471-a6cd-ff439f1e69f0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl" Mar 14 08:43:44 crc kubenswrapper[4886]: I0314 08:43:44.522531 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2346929a-43ff-4471-a6cd-ff439f1e69f0-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl\" (UID: \"2346929a-43ff-4471-a6cd-ff439f1e69f0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl" Mar 14 08:43:44 crc kubenswrapper[4886]: I0314 08:43:44.522570 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6mpp6\" (UniqueName: \"kubernetes.io/projected/2346929a-43ff-4471-a6cd-ff439f1e69f0-kube-api-access-6mpp6\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl\" (UID: \"2346929a-43ff-4471-a6cd-ff439f1e69f0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl" Mar 14 08:43:44 crc kubenswrapper[4886]: I0314 08:43:44.623426 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mpp6\" (UniqueName: \"kubernetes.io/projected/2346929a-43ff-4471-a6cd-ff439f1e69f0-kube-api-access-6mpp6\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl\" (UID: \"2346929a-43ff-4471-a6cd-ff439f1e69f0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl" Mar 14 08:43:44 crc kubenswrapper[4886]: I0314 08:43:44.623832 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2346929a-43ff-4471-a6cd-ff439f1e69f0-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl\" (UID: \"2346929a-43ff-4471-a6cd-ff439f1e69f0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl" Mar 14 08:43:44 crc kubenswrapper[4886]: I0314 08:43:44.624302 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2346929a-43ff-4471-a6cd-ff439f1e69f0-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl\" (UID: \"2346929a-43ff-4471-a6cd-ff439f1e69f0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl" Mar 14 08:43:44 crc kubenswrapper[4886]: I0314 08:43:44.624344 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2346929a-43ff-4471-a6cd-ff439f1e69f0-util\") pod 
\"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl\" (UID: \"2346929a-43ff-4471-a6cd-ff439f1e69f0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl" Mar 14 08:43:44 crc kubenswrapper[4886]: I0314 08:43:44.624369 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2346929a-43ff-4471-a6cd-ff439f1e69f0-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl\" (UID: \"2346929a-43ff-4471-a6cd-ff439f1e69f0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl" Mar 14 08:43:44 crc kubenswrapper[4886]: I0314 08:43:44.642113 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mpp6\" (UniqueName: \"kubernetes.io/projected/2346929a-43ff-4471-a6cd-ff439f1e69f0-kube-api-access-6mpp6\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl\" (UID: \"2346929a-43ff-4471-a6cd-ff439f1e69f0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl" Mar 14 08:43:44 crc kubenswrapper[4886]: I0314 08:43:44.729760 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl" Mar 14 08:43:44 crc kubenswrapper[4886]: I0314 08:43:44.911543 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl"] Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.048168 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-wmcc2" podUID="a312fb44-823b-44ec-8312-0d83b990e9cd" containerName="console" containerID="cri-o://c3c9da905ea8b8278089229c0e3b9c593a699e890d0586374d16f880e5e0bcbf" gracePeriod=15 Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.233719 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2595s" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.343292 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z9td2"] Mar 14 08:43:45 crc kubenswrapper[4886]: E0314 08:43:45.343527 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="558100ce-8904-4a04-bc8e-f8c906261876" containerName="extract-utilities" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.343543 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="558100ce-8904-4a04-bc8e-f8c906261876" containerName="extract-utilities" Mar 14 08:43:45 crc kubenswrapper[4886]: E0314 08:43:45.343559 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="558100ce-8904-4a04-bc8e-f8c906261876" containerName="registry-server" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.343565 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="558100ce-8904-4a04-bc8e-f8c906261876" containerName="registry-server" Mar 14 08:43:45 crc kubenswrapper[4886]: E0314 08:43:45.343576 4886 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="558100ce-8904-4a04-bc8e-f8c906261876" containerName="extract-content" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.343582 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="558100ce-8904-4a04-bc8e-f8c906261876" containerName="extract-content" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.343681 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="558100ce-8904-4a04-bc8e-f8c906261876" containerName="registry-server" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.344452 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z9td2" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.356206 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z9td2"] Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.389702 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-wmcc2_a312fb44-823b-44ec-8312-0d83b990e9cd/console/0.log" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.389764 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-wmcc2" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.433151 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-569w4\" (UniqueName: \"kubernetes.io/projected/558100ce-8904-4a04-bc8e-f8c906261876-kube-api-access-569w4\") pod \"558100ce-8904-4a04-bc8e-f8c906261876\" (UID: \"558100ce-8904-4a04-bc8e-f8c906261876\") " Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.433322 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/558100ce-8904-4a04-bc8e-f8c906261876-catalog-content\") pod \"558100ce-8904-4a04-bc8e-f8c906261876\" (UID: \"558100ce-8904-4a04-bc8e-f8c906261876\") " Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.434318 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/558100ce-8904-4a04-bc8e-f8c906261876-utilities\") pod \"558100ce-8904-4a04-bc8e-f8c906261876\" (UID: \"558100ce-8904-4a04-bc8e-f8c906261876\") " Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.435597 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/558100ce-8904-4a04-bc8e-f8c906261876-utilities" (OuterVolumeSpecName: "utilities") pod "558100ce-8904-4a04-bc8e-f8c906261876" (UID: "558100ce-8904-4a04-bc8e-f8c906261876"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.445655 4886 generic.go:334] "Generic (PLEG): container finished" podID="2346929a-43ff-4471-a6cd-ff439f1e69f0" containerID="d240cdcea56a51ae92c764a0d0932979d6ce11c7224b025c7670246ef147f8da" exitCode=0 Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.448737 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-wmcc2_a312fb44-823b-44ec-8312-0d83b990e9cd/console/0.log" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.448775 4886 generic.go:334] "Generic (PLEG): container finished" podID="a312fb44-823b-44ec-8312-0d83b990e9cd" containerID="c3c9da905ea8b8278089229c0e3b9c593a699e890d0586374d16f880e5e0bcbf" exitCode=2 Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.448846 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-wmcc2" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.451491 4886 generic.go:334] "Generic (PLEG): container finished" podID="558100ce-8904-4a04-bc8e-f8c906261876" containerID="5d7484a56c7fb81ed09cd1faefdf0561efc44b3e35b9df2d4f7b0a04e1d5cd21" exitCode=0 Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.451566 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2595s" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.464406 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/558100ce-8904-4a04-bc8e-f8c906261876-kube-api-access-569w4" (OuterVolumeSpecName: "kube-api-access-569w4") pod "558100ce-8904-4a04-bc8e-f8c906261876" (UID: "558100ce-8904-4a04-bc8e-f8c906261876"). InnerVolumeSpecName "kube-api-access-569w4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.468162 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl" event={"ID":"2346929a-43ff-4471-a6cd-ff439f1e69f0","Type":"ContainerDied","Data":"d240cdcea56a51ae92c764a0d0932979d6ce11c7224b025c7670246ef147f8da"} Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.468225 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl" event={"ID":"2346929a-43ff-4471-a6cd-ff439f1e69f0","Type":"ContainerStarted","Data":"191f8ad08953ddf10ab5cf6098745eb5983b20c4cc2ba8102f75a4d0014997a2"} Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.468241 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wmcc2" event={"ID":"a312fb44-823b-44ec-8312-0d83b990e9cd","Type":"ContainerDied","Data":"c3c9da905ea8b8278089229c0e3b9c593a699e890d0586374d16f880e5e0bcbf"} Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.468261 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wmcc2" event={"ID":"a312fb44-823b-44ec-8312-0d83b990e9cd","Type":"ContainerDied","Data":"204c2720ac702bd4600711fcadaaaed3db50ef7659ecebbad48ab4ca3b27e444"} Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.468274 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2595s" event={"ID":"558100ce-8904-4a04-bc8e-f8c906261876","Type":"ContainerDied","Data":"5d7484a56c7fb81ed09cd1faefdf0561efc44b3e35b9df2d4f7b0a04e1d5cd21"} Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.468290 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2595s" 
event={"ID":"558100ce-8904-4a04-bc8e-f8c906261876","Type":"ContainerDied","Data":"5ed06821adc5f9e82a2e3f394eaa536284683c56bfefa804ba588491a911839a"} Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.468312 4886 scope.go:117] "RemoveContainer" containerID="c3c9da905ea8b8278089229c0e3b9c593a699e890d0586374d16f880e5e0bcbf" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.503916 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/558100ce-8904-4a04-bc8e-f8c906261876-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "558100ce-8904-4a04-bc8e-f8c906261876" (UID: "558100ce-8904-4a04-bc8e-f8c906261876"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.504340 4886 scope.go:117] "RemoveContainer" containerID="c3c9da905ea8b8278089229c0e3b9c593a699e890d0586374d16f880e5e0bcbf" Mar 14 08:43:45 crc kubenswrapper[4886]: E0314 08:43:45.504908 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3c9da905ea8b8278089229c0e3b9c593a699e890d0586374d16f880e5e0bcbf\": container with ID starting with c3c9da905ea8b8278089229c0e3b9c593a699e890d0586374d16f880e5e0bcbf not found: ID does not exist" containerID="c3c9da905ea8b8278089229c0e3b9c593a699e890d0586374d16f880e5e0bcbf" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.504951 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3c9da905ea8b8278089229c0e3b9c593a699e890d0586374d16f880e5e0bcbf"} err="failed to get container status \"c3c9da905ea8b8278089229c0e3b9c593a699e890d0586374d16f880e5e0bcbf\": rpc error: code = NotFound desc = could not find container \"c3c9da905ea8b8278089229c0e3b9c593a699e890d0586374d16f880e5e0bcbf\": container with ID starting with c3c9da905ea8b8278089229c0e3b9c593a699e890d0586374d16f880e5e0bcbf not found: ID does 
not exist" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.504979 4886 scope.go:117] "RemoveContainer" containerID="5d7484a56c7fb81ed09cd1faefdf0561efc44b3e35b9df2d4f7b0a04e1d5cd21" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.535364 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a312fb44-823b-44ec-8312-0d83b990e9cd-console-oauth-config\") pod \"a312fb44-823b-44ec-8312-0d83b990e9cd\" (UID: \"a312fb44-823b-44ec-8312-0d83b990e9cd\") " Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.535405 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a312fb44-823b-44ec-8312-0d83b990e9cd-trusted-ca-bundle\") pod \"a312fb44-823b-44ec-8312-0d83b990e9cd\" (UID: \"a312fb44-823b-44ec-8312-0d83b990e9cd\") " Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.535436 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a312fb44-823b-44ec-8312-0d83b990e9cd-console-config\") pod \"a312fb44-823b-44ec-8312-0d83b990e9cd\" (UID: \"a312fb44-823b-44ec-8312-0d83b990e9cd\") " Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.535510 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a312fb44-823b-44ec-8312-0d83b990e9cd-console-serving-cert\") pod \"a312fb44-823b-44ec-8312-0d83b990e9cd\" (UID: \"a312fb44-823b-44ec-8312-0d83b990e9cd\") " Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.535563 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a312fb44-823b-44ec-8312-0d83b990e9cd-oauth-serving-cert\") pod \"a312fb44-823b-44ec-8312-0d83b990e9cd\" (UID: \"a312fb44-823b-44ec-8312-0d83b990e9cd\") " Mar 14 
08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.535588 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a312fb44-823b-44ec-8312-0d83b990e9cd-service-ca\") pod \"a312fb44-823b-44ec-8312-0d83b990e9cd\" (UID: \"a312fb44-823b-44ec-8312-0d83b990e9cd\") " Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.535607 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4jmr\" (UniqueName: \"kubernetes.io/projected/a312fb44-823b-44ec-8312-0d83b990e9cd-kube-api-access-t4jmr\") pod \"a312fb44-823b-44ec-8312-0d83b990e9cd\" (UID: \"a312fb44-823b-44ec-8312-0d83b990e9cd\") " Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.535750 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj4lc\" (UniqueName: \"kubernetes.io/projected/8bbac7c5-fad7-49b1-9819-a61f4a551bf8-kube-api-access-lj4lc\") pod \"redhat-marketplace-z9td2\" (UID: \"8bbac7c5-fad7-49b1-9819-a61f4a551bf8\") " pod="openshift-marketplace/redhat-marketplace-z9td2" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.535774 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bbac7c5-fad7-49b1-9819-a61f4a551bf8-utilities\") pod \"redhat-marketplace-z9td2\" (UID: \"8bbac7c5-fad7-49b1-9819-a61f4a551bf8\") " pod="openshift-marketplace/redhat-marketplace-z9td2" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.535798 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bbac7c5-fad7-49b1-9819-a61f4a551bf8-catalog-content\") pod \"redhat-marketplace-z9td2\" (UID: \"8bbac7c5-fad7-49b1-9819-a61f4a551bf8\") " pod="openshift-marketplace/redhat-marketplace-z9td2" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 
08:43:45.535880 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-569w4\" (UniqueName: \"kubernetes.io/projected/558100ce-8904-4a04-bc8e-f8c906261876-kube-api-access-569w4\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.535890 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/558100ce-8904-4a04-bc8e-f8c906261876-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.535900 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/558100ce-8904-4a04-bc8e-f8c906261876-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.536057 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a312fb44-823b-44ec-8312-0d83b990e9cd-console-config" (OuterVolumeSpecName: "console-config") pod "a312fb44-823b-44ec-8312-0d83b990e9cd" (UID: "a312fb44-823b-44ec-8312-0d83b990e9cd"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.536096 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a312fb44-823b-44ec-8312-0d83b990e9cd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a312fb44-823b-44ec-8312-0d83b990e9cd" (UID: "a312fb44-823b-44ec-8312-0d83b990e9cd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.536426 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a312fb44-823b-44ec-8312-0d83b990e9cd-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a312fb44-823b-44ec-8312-0d83b990e9cd" (UID: "a312fb44-823b-44ec-8312-0d83b990e9cd"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.536673 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a312fb44-823b-44ec-8312-0d83b990e9cd-service-ca" (OuterVolumeSpecName: "service-ca") pod "a312fb44-823b-44ec-8312-0d83b990e9cd" (UID: "a312fb44-823b-44ec-8312-0d83b990e9cd"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.537948 4886 scope.go:117] "RemoveContainer" containerID="35822b51e27f31b87f4d2f0cbf873c672a9b38167bcc734531044494d9bf2a4c" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.538460 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a312fb44-823b-44ec-8312-0d83b990e9cd-kube-api-access-t4jmr" (OuterVolumeSpecName: "kube-api-access-t4jmr") pod "a312fb44-823b-44ec-8312-0d83b990e9cd" (UID: "a312fb44-823b-44ec-8312-0d83b990e9cd"). InnerVolumeSpecName "kube-api-access-t4jmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.538649 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a312fb44-823b-44ec-8312-0d83b990e9cd-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a312fb44-823b-44ec-8312-0d83b990e9cd" (UID: "a312fb44-823b-44ec-8312-0d83b990e9cd"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.539790 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a312fb44-823b-44ec-8312-0d83b990e9cd-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a312fb44-823b-44ec-8312-0d83b990e9cd" (UID: "a312fb44-823b-44ec-8312-0d83b990e9cd"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.612295 4886 scope.go:117] "RemoveContainer" containerID="09e392e215a61516d6fdfb5ca3a8581840c80b19ffe338b7379666c25b89b643" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.631135 4886 scope.go:117] "RemoveContainer" containerID="5d7484a56c7fb81ed09cd1faefdf0561efc44b3e35b9df2d4f7b0a04e1d5cd21" Mar 14 08:43:45 crc kubenswrapper[4886]: E0314 08:43:45.631704 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d7484a56c7fb81ed09cd1faefdf0561efc44b3e35b9df2d4f7b0a04e1d5cd21\": container with ID starting with 5d7484a56c7fb81ed09cd1faefdf0561efc44b3e35b9df2d4f7b0a04e1d5cd21 not found: ID does not exist" containerID="5d7484a56c7fb81ed09cd1faefdf0561efc44b3e35b9df2d4f7b0a04e1d5cd21" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.631770 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d7484a56c7fb81ed09cd1faefdf0561efc44b3e35b9df2d4f7b0a04e1d5cd21"} err="failed to get container status \"5d7484a56c7fb81ed09cd1faefdf0561efc44b3e35b9df2d4f7b0a04e1d5cd21\": rpc error: code = NotFound desc = could not find container \"5d7484a56c7fb81ed09cd1faefdf0561efc44b3e35b9df2d4f7b0a04e1d5cd21\": container with ID starting with 5d7484a56c7fb81ed09cd1faefdf0561efc44b3e35b9df2d4f7b0a04e1d5cd21 not found: ID does not exist" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.631803 4886 
scope.go:117] "RemoveContainer" containerID="35822b51e27f31b87f4d2f0cbf873c672a9b38167bcc734531044494d9bf2a4c" Mar 14 08:43:45 crc kubenswrapper[4886]: E0314 08:43:45.632288 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35822b51e27f31b87f4d2f0cbf873c672a9b38167bcc734531044494d9bf2a4c\": container with ID starting with 35822b51e27f31b87f4d2f0cbf873c672a9b38167bcc734531044494d9bf2a4c not found: ID does not exist" containerID="35822b51e27f31b87f4d2f0cbf873c672a9b38167bcc734531044494d9bf2a4c" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.632327 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35822b51e27f31b87f4d2f0cbf873c672a9b38167bcc734531044494d9bf2a4c"} err="failed to get container status \"35822b51e27f31b87f4d2f0cbf873c672a9b38167bcc734531044494d9bf2a4c\": rpc error: code = NotFound desc = could not find container \"35822b51e27f31b87f4d2f0cbf873c672a9b38167bcc734531044494d9bf2a4c\": container with ID starting with 35822b51e27f31b87f4d2f0cbf873c672a9b38167bcc734531044494d9bf2a4c not found: ID does not exist" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.632353 4886 scope.go:117] "RemoveContainer" containerID="09e392e215a61516d6fdfb5ca3a8581840c80b19ffe338b7379666c25b89b643" Mar 14 08:43:45 crc kubenswrapper[4886]: E0314 08:43:45.632800 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09e392e215a61516d6fdfb5ca3a8581840c80b19ffe338b7379666c25b89b643\": container with ID starting with 09e392e215a61516d6fdfb5ca3a8581840c80b19ffe338b7379666c25b89b643 not found: ID does not exist" containerID="09e392e215a61516d6fdfb5ca3a8581840c80b19ffe338b7379666c25b89b643" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.632833 4886 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"09e392e215a61516d6fdfb5ca3a8581840c80b19ffe338b7379666c25b89b643"} err="failed to get container status \"09e392e215a61516d6fdfb5ca3a8581840c80b19ffe338b7379666c25b89b643\": rpc error: code = NotFound desc = could not find container \"09e392e215a61516d6fdfb5ca3a8581840c80b19ffe338b7379666c25b89b643\": container with ID starting with 09e392e215a61516d6fdfb5ca3a8581840c80b19ffe338b7379666c25b89b643 not found: ID does not exist" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.637696 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj4lc\" (UniqueName: \"kubernetes.io/projected/8bbac7c5-fad7-49b1-9819-a61f4a551bf8-kube-api-access-lj4lc\") pod \"redhat-marketplace-z9td2\" (UID: \"8bbac7c5-fad7-49b1-9819-a61f4a551bf8\") " pod="openshift-marketplace/redhat-marketplace-z9td2" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.637750 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bbac7c5-fad7-49b1-9819-a61f4a551bf8-utilities\") pod \"redhat-marketplace-z9td2\" (UID: \"8bbac7c5-fad7-49b1-9819-a61f4a551bf8\") " pod="openshift-marketplace/redhat-marketplace-z9td2" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.637776 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bbac7c5-fad7-49b1-9819-a61f4a551bf8-catalog-content\") pod \"redhat-marketplace-z9td2\" (UID: \"8bbac7c5-fad7-49b1-9819-a61f4a551bf8\") " pod="openshift-marketplace/redhat-marketplace-z9td2" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.637833 4886 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a312fb44-823b-44ec-8312-0d83b990e9cd-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.637849 4886 
reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a312fb44-823b-44ec-8312-0d83b990e9cd-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.637860 4886 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a312fb44-823b-44ec-8312-0d83b990e9cd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.638003 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4jmr\" (UniqueName: \"kubernetes.io/projected/a312fb44-823b-44ec-8312-0d83b990e9cd-kube-api-access-t4jmr\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.638669 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bbac7c5-fad7-49b1-9819-a61f4a551bf8-utilities\") pod \"redhat-marketplace-z9td2\" (UID: \"8bbac7c5-fad7-49b1-9819-a61f4a551bf8\") " pod="openshift-marketplace/redhat-marketplace-z9td2" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.638714 4886 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a312fb44-823b-44ec-8312-0d83b990e9cd-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.638730 4886 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a312fb44-823b-44ec-8312-0d83b990e9cd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.638741 4886 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a312fb44-823b-44ec-8312-0d83b990e9cd-console-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 
08:43:45.638915 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bbac7c5-fad7-49b1-9819-a61f4a551bf8-catalog-content\") pod \"redhat-marketplace-z9td2\" (UID: \"8bbac7c5-fad7-49b1-9819-a61f4a551bf8\") " pod="openshift-marketplace/redhat-marketplace-z9td2" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.676281 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj4lc\" (UniqueName: \"kubernetes.io/projected/8bbac7c5-fad7-49b1-9819-a61f4a551bf8-kube-api-access-lj4lc\") pod \"redhat-marketplace-z9td2\" (UID: \"8bbac7c5-fad7-49b1-9819-a61f4a551bf8\") " pod="openshift-marketplace/redhat-marketplace-z9td2" Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.779088 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2595s"] Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.785027 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2595s"] Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.794064 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-wmcc2"] Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.798474 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-wmcc2"] Mar 14 08:43:45 crc kubenswrapper[4886]: I0314 08:43:45.962633 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z9td2" Mar 14 08:43:46 crc kubenswrapper[4886]: I0314 08:43:46.157325 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z9td2"] Mar 14 08:43:46 crc kubenswrapper[4886]: W0314 08:43:46.163872 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bbac7c5_fad7_49b1_9819_a61f4a551bf8.slice/crio-6d84d9cf57e9fa638f62d556b5432b88c389fdf60323994e621bae5444b62463 WatchSource:0}: Error finding container 6d84d9cf57e9fa638f62d556b5432b88c389fdf60323994e621bae5444b62463: Status 404 returned error can't find the container with id 6d84d9cf57e9fa638f62d556b5432b88c389fdf60323994e621bae5444b62463 Mar 14 08:43:46 crc kubenswrapper[4886]: I0314 08:43:46.459949 4886 generic.go:334] "Generic (PLEG): container finished" podID="8bbac7c5-fad7-49b1-9819-a61f4a551bf8" containerID="08bff3466e12ec8e806f917e656abe7ac683111bb44040c1ce9a4ab444ed0c18" exitCode=0 Mar 14 08:43:46 crc kubenswrapper[4886]: I0314 08:43:46.460031 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z9td2" event={"ID":"8bbac7c5-fad7-49b1-9819-a61f4a551bf8","Type":"ContainerDied","Data":"08bff3466e12ec8e806f917e656abe7ac683111bb44040c1ce9a4ab444ed0c18"} Mar 14 08:43:46 crc kubenswrapper[4886]: I0314 08:43:46.460063 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z9td2" event={"ID":"8bbac7c5-fad7-49b1-9819-a61f4a551bf8","Type":"ContainerStarted","Data":"6d84d9cf57e9fa638f62d556b5432b88c389fdf60323994e621bae5444b62463"} Mar 14 08:43:47 crc kubenswrapper[4886]: I0314 08:43:47.437731 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="558100ce-8904-4a04-bc8e-f8c906261876" path="/var/lib/kubelet/pods/558100ce-8904-4a04-bc8e-f8c906261876/volumes" Mar 14 08:43:47 crc kubenswrapper[4886]: I0314 
08:43:47.439083 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a312fb44-823b-44ec-8312-0d83b990e9cd" path="/var/lib/kubelet/pods/a312fb44-823b-44ec-8312-0d83b990e9cd/volumes" Mar 14 08:43:47 crc kubenswrapper[4886]: I0314 08:43:47.471278 4886 generic.go:334] "Generic (PLEG): container finished" podID="8bbac7c5-fad7-49b1-9819-a61f4a551bf8" containerID="c5f7e7db4fd2bf751dd9067ce2f42611448f5f137046262c299b66fe01044f97" exitCode=0 Mar 14 08:43:47 crc kubenswrapper[4886]: I0314 08:43:47.471377 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z9td2" event={"ID":"8bbac7c5-fad7-49b1-9819-a61f4a551bf8","Type":"ContainerDied","Data":"c5f7e7db4fd2bf751dd9067ce2f42611448f5f137046262c299b66fe01044f97"} Mar 14 08:43:49 crc kubenswrapper[4886]: I0314 08:43:49.484667 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z9td2" event={"ID":"8bbac7c5-fad7-49b1-9819-a61f4a551bf8","Type":"ContainerStarted","Data":"0c43ddd04b983123ffa9186beb2c7d98cb08dfd8fd8836f22789ef3e64a4964c"} Mar 14 08:43:49 crc kubenswrapper[4886]: I0314 08:43:49.505314 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z9td2" podStartSLOduration=1.945004065 podStartE2EDuration="4.505294774s" podCreationTimestamp="2026-03-14 08:43:45 +0000 UTC" firstStartedPulling="2026-03-14 08:43:46.461599675 +0000 UTC m=+961.710051312" lastFinishedPulling="2026-03-14 08:43:49.021890384 +0000 UTC m=+964.270342021" observedRunningTime="2026-03-14 08:43:49.50169226 +0000 UTC m=+964.750143897" watchObservedRunningTime="2026-03-14 08:43:49.505294774 +0000 UTC m=+964.753746411" Mar 14 08:43:51 crc kubenswrapper[4886]: I0314 08:43:51.495298 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl" 
event={"ID":"2346929a-43ff-4471-a6cd-ff439f1e69f0","Type":"ContainerStarted","Data":"ffae98bf89ba0158c3ac33d9c1113049815e36a05136776a518c3ebc44aa50d4"} Mar 14 08:43:52 crc kubenswrapper[4886]: I0314 08:43:52.503042 4886 generic.go:334] "Generic (PLEG): container finished" podID="2346929a-43ff-4471-a6cd-ff439f1e69f0" containerID="ffae98bf89ba0158c3ac33d9c1113049815e36a05136776a518c3ebc44aa50d4" exitCode=0 Mar 14 08:43:52 crc kubenswrapper[4886]: I0314 08:43:52.503096 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl" event={"ID":"2346929a-43ff-4471-a6cd-ff439f1e69f0","Type":"ContainerDied","Data":"ffae98bf89ba0158c3ac33d9c1113049815e36a05136776a518c3ebc44aa50d4"} Mar 14 08:43:53 crc kubenswrapper[4886]: I0314 08:43:53.511225 4886 generic.go:334] "Generic (PLEG): container finished" podID="2346929a-43ff-4471-a6cd-ff439f1e69f0" containerID="bfd97833fbfbe57c48557d5c63f705b01134c0971f4b9e98da5fd91efecdbbdb" exitCode=0 Mar 14 08:43:53 crc kubenswrapper[4886]: I0314 08:43:53.511280 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl" event={"ID":"2346929a-43ff-4471-a6cd-ff439f1e69f0","Type":"ContainerDied","Data":"bfd97833fbfbe57c48557d5c63f705b01134c0971f4b9e98da5fd91efecdbbdb"} Mar 14 08:43:54 crc kubenswrapper[4886]: I0314 08:43:54.720552 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl" Mar 14 08:43:54 crc kubenswrapper[4886]: I0314 08:43:54.916909 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2346929a-43ff-4471-a6cd-ff439f1e69f0-util\") pod \"2346929a-43ff-4471-a6cd-ff439f1e69f0\" (UID: \"2346929a-43ff-4471-a6cd-ff439f1e69f0\") " Mar 14 08:43:54 crc kubenswrapper[4886]: I0314 08:43:54.916960 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mpp6\" (UniqueName: \"kubernetes.io/projected/2346929a-43ff-4471-a6cd-ff439f1e69f0-kube-api-access-6mpp6\") pod \"2346929a-43ff-4471-a6cd-ff439f1e69f0\" (UID: \"2346929a-43ff-4471-a6cd-ff439f1e69f0\") " Mar 14 08:43:54 crc kubenswrapper[4886]: I0314 08:43:54.916999 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2346929a-43ff-4471-a6cd-ff439f1e69f0-bundle\") pod \"2346929a-43ff-4471-a6cd-ff439f1e69f0\" (UID: \"2346929a-43ff-4471-a6cd-ff439f1e69f0\") " Mar 14 08:43:54 crc kubenswrapper[4886]: I0314 08:43:54.918305 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2346929a-43ff-4471-a6cd-ff439f1e69f0-bundle" (OuterVolumeSpecName: "bundle") pod "2346929a-43ff-4471-a6cd-ff439f1e69f0" (UID: "2346929a-43ff-4471-a6cd-ff439f1e69f0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:43:54 crc kubenswrapper[4886]: I0314 08:43:54.923307 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2346929a-43ff-4471-a6cd-ff439f1e69f0-kube-api-access-6mpp6" (OuterVolumeSpecName: "kube-api-access-6mpp6") pod "2346929a-43ff-4471-a6cd-ff439f1e69f0" (UID: "2346929a-43ff-4471-a6cd-ff439f1e69f0"). InnerVolumeSpecName "kube-api-access-6mpp6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:43:54 crc kubenswrapper[4886]: I0314 08:43:54.927712 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2346929a-43ff-4471-a6cd-ff439f1e69f0-util" (OuterVolumeSpecName: "util") pod "2346929a-43ff-4471-a6cd-ff439f1e69f0" (UID: "2346929a-43ff-4471-a6cd-ff439f1e69f0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:43:55 crc kubenswrapper[4886]: I0314 08:43:55.018873 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mpp6\" (UniqueName: \"kubernetes.io/projected/2346929a-43ff-4471-a6cd-ff439f1e69f0-kube-api-access-6mpp6\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:55 crc kubenswrapper[4886]: I0314 08:43:55.018904 4886 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2346929a-43ff-4471-a6cd-ff439f1e69f0-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:55 crc kubenswrapper[4886]: I0314 08:43:55.018916 4886 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2346929a-43ff-4471-a6cd-ff439f1e69f0-util\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:55 crc kubenswrapper[4886]: I0314 08:43:55.524335 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl" event={"ID":"2346929a-43ff-4471-a6cd-ff439f1e69f0","Type":"ContainerDied","Data":"191f8ad08953ddf10ab5cf6098745eb5983b20c4cc2ba8102f75a4d0014997a2"} Mar 14 08:43:55 crc kubenswrapper[4886]: I0314 08:43:55.524382 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="191f8ad08953ddf10ab5cf6098745eb5983b20c4cc2ba8102f75a4d0014997a2" Mar 14 08:43:55 crc kubenswrapper[4886]: I0314 08:43:55.524398 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl" Mar 14 08:43:55 crc kubenswrapper[4886]: I0314 08:43:55.963606 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z9td2" Mar 14 08:43:55 crc kubenswrapper[4886]: I0314 08:43:55.963683 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z9td2" Mar 14 08:43:56 crc kubenswrapper[4886]: I0314 08:43:56.006151 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z9td2" Mar 14 08:43:56 crc kubenswrapper[4886]: I0314 08:43:56.065997 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:43:56 crc kubenswrapper[4886]: I0314 08:43:56.066105 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:43:56 crc kubenswrapper[4886]: I0314 08:43:56.569655 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z9td2" Mar 14 08:43:58 crc kubenswrapper[4886]: I0314 08:43:58.330802 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z9td2"] Mar 14 08:43:58 crc kubenswrapper[4886]: I0314 08:43:58.542464 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z9td2" 
podUID="8bbac7c5-fad7-49b1-9819-a61f4a551bf8" containerName="registry-server" containerID="cri-o://0c43ddd04b983123ffa9186beb2c7d98cb08dfd8fd8836f22789ef3e64a4964c" gracePeriod=2 Mar 14 08:43:58 crc kubenswrapper[4886]: I0314 08:43:58.996360 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z9td2" Mar 14 08:43:59 crc kubenswrapper[4886]: I0314 08:43:59.175919 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bbac7c5-fad7-49b1-9819-a61f4a551bf8-utilities\") pod \"8bbac7c5-fad7-49b1-9819-a61f4a551bf8\" (UID: \"8bbac7c5-fad7-49b1-9819-a61f4a551bf8\") " Mar 14 08:43:59 crc kubenswrapper[4886]: I0314 08:43:59.175987 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bbac7c5-fad7-49b1-9819-a61f4a551bf8-catalog-content\") pod \"8bbac7c5-fad7-49b1-9819-a61f4a551bf8\" (UID: \"8bbac7c5-fad7-49b1-9819-a61f4a551bf8\") " Mar 14 08:43:59 crc kubenswrapper[4886]: I0314 08:43:59.176033 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lj4lc\" (UniqueName: \"kubernetes.io/projected/8bbac7c5-fad7-49b1-9819-a61f4a551bf8-kube-api-access-lj4lc\") pod \"8bbac7c5-fad7-49b1-9819-a61f4a551bf8\" (UID: \"8bbac7c5-fad7-49b1-9819-a61f4a551bf8\") " Mar 14 08:43:59 crc kubenswrapper[4886]: I0314 08:43:59.177066 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bbac7c5-fad7-49b1-9819-a61f4a551bf8-utilities" (OuterVolumeSpecName: "utilities") pod "8bbac7c5-fad7-49b1-9819-a61f4a551bf8" (UID: "8bbac7c5-fad7-49b1-9819-a61f4a551bf8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:43:59 crc kubenswrapper[4886]: I0314 08:43:59.182416 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bbac7c5-fad7-49b1-9819-a61f4a551bf8-kube-api-access-lj4lc" (OuterVolumeSpecName: "kube-api-access-lj4lc") pod "8bbac7c5-fad7-49b1-9819-a61f4a551bf8" (UID: "8bbac7c5-fad7-49b1-9819-a61f4a551bf8"). InnerVolumeSpecName "kube-api-access-lj4lc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:43:59 crc kubenswrapper[4886]: I0314 08:43:59.206315 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bbac7c5-fad7-49b1-9819-a61f4a551bf8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8bbac7c5-fad7-49b1-9819-a61f4a551bf8" (UID: "8bbac7c5-fad7-49b1-9819-a61f4a551bf8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:43:59 crc kubenswrapper[4886]: I0314 08:43:59.277318 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bbac7c5-fad7-49b1-9819-a61f4a551bf8-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:59 crc kubenswrapper[4886]: I0314 08:43:59.277348 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bbac7c5-fad7-49b1-9819-a61f4a551bf8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:59 crc kubenswrapper[4886]: I0314 08:43:59.277385 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lj4lc\" (UniqueName: \"kubernetes.io/projected/8bbac7c5-fad7-49b1-9819-a61f4a551bf8-kube-api-access-lj4lc\") on node \"crc\" DevicePath \"\"" Mar 14 08:43:59 crc kubenswrapper[4886]: I0314 08:43:59.551206 4886 generic.go:334] "Generic (PLEG): container finished" podID="8bbac7c5-fad7-49b1-9819-a61f4a551bf8" 
containerID="0c43ddd04b983123ffa9186beb2c7d98cb08dfd8fd8836f22789ef3e64a4964c" exitCode=0 Mar 14 08:43:59 crc kubenswrapper[4886]: I0314 08:43:59.551253 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z9td2" event={"ID":"8bbac7c5-fad7-49b1-9819-a61f4a551bf8","Type":"ContainerDied","Data":"0c43ddd04b983123ffa9186beb2c7d98cb08dfd8fd8836f22789ef3e64a4964c"} Mar 14 08:43:59 crc kubenswrapper[4886]: I0314 08:43:59.551285 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z9td2" event={"ID":"8bbac7c5-fad7-49b1-9819-a61f4a551bf8","Type":"ContainerDied","Data":"6d84d9cf57e9fa638f62d556b5432b88c389fdf60323994e621bae5444b62463"} Mar 14 08:43:59 crc kubenswrapper[4886]: I0314 08:43:59.551305 4886 scope.go:117] "RemoveContainer" containerID="0c43ddd04b983123ffa9186beb2c7d98cb08dfd8fd8836f22789ef3e64a4964c" Mar 14 08:43:59 crc kubenswrapper[4886]: I0314 08:43:59.551448 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z9td2" Mar 14 08:43:59 crc kubenswrapper[4886]: I0314 08:43:59.580285 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z9td2"] Mar 14 08:43:59 crc kubenswrapper[4886]: I0314 08:43:59.583604 4886 scope.go:117] "RemoveContainer" containerID="c5f7e7db4fd2bf751dd9067ce2f42611448f5f137046262c299b66fe01044f97" Mar 14 08:43:59 crc kubenswrapper[4886]: I0314 08:43:59.584167 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z9td2"] Mar 14 08:43:59 crc kubenswrapper[4886]: I0314 08:43:59.599974 4886 scope.go:117] "RemoveContainer" containerID="08bff3466e12ec8e806f917e656abe7ac683111bb44040c1ce9a4ab444ed0c18" Mar 14 08:43:59 crc kubenswrapper[4886]: I0314 08:43:59.627363 4886 scope.go:117] "RemoveContainer" containerID="0c43ddd04b983123ffa9186beb2c7d98cb08dfd8fd8836f22789ef3e64a4964c" Mar 14 08:43:59 crc kubenswrapper[4886]: E0314 08:43:59.627785 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c43ddd04b983123ffa9186beb2c7d98cb08dfd8fd8836f22789ef3e64a4964c\": container with ID starting with 0c43ddd04b983123ffa9186beb2c7d98cb08dfd8fd8836f22789ef3e64a4964c not found: ID does not exist" containerID="0c43ddd04b983123ffa9186beb2c7d98cb08dfd8fd8836f22789ef3e64a4964c" Mar 14 08:43:59 crc kubenswrapper[4886]: I0314 08:43:59.627814 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c43ddd04b983123ffa9186beb2c7d98cb08dfd8fd8836f22789ef3e64a4964c"} err="failed to get container status \"0c43ddd04b983123ffa9186beb2c7d98cb08dfd8fd8836f22789ef3e64a4964c\": rpc error: code = NotFound desc = could not find container \"0c43ddd04b983123ffa9186beb2c7d98cb08dfd8fd8836f22789ef3e64a4964c\": container with ID starting with 0c43ddd04b983123ffa9186beb2c7d98cb08dfd8fd8836f22789ef3e64a4964c not found: 
ID does not exist" Mar 14 08:43:59 crc kubenswrapper[4886]: I0314 08:43:59.627836 4886 scope.go:117] "RemoveContainer" containerID="c5f7e7db4fd2bf751dd9067ce2f42611448f5f137046262c299b66fe01044f97" Mar 14 08:43:59 crc kubenswrapper[4886]: E0314 08:43:59.627993 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5f7e7db4fd2bf751dd9067ce2f42611448f5f137046262c299b66fe01044f97\": container with ID starting with c5f7e7db4fd2bf751dd9067ce2f42611448f5f137046262c299b66fe01044f97 not found: ID does not exist" containerID="c5f7e7db4fd2bf751dd9067ce2f42611448f5f137046262c299b66fe01044f97" Mar 14 08:43:59 crc kubenswrapper[4886]: I0314 08:43:59.628014 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5f7e7db4fd2bf751dd9067ce2f42611448f5f137046262c299b66fe01044f97"} err="failed to get container status \"c5f7e7db4fd2bf751dd9067ce2f42611448f5f137046262c299b66fe01044f97\": rpc error: code = NotFound desc = could not find container \"c5f7e7db4fd2bf751dd9067ce2f42611448f5f137046262c299b66fe01044f97\": container with ID starting with c5f7e7db4fd2bf751dd9067ce2f42611448f5f137046262c299b66fe01044f97 not found: ID does not exist" Mar 14 08:43:59 crc kubenswrapper[4886]: I0314 08:43:59.628027 4886 scope.go:117] "RemoveContainer" containerID="08bff3466e12ec8e806f917e656abe7ac683111bb44040c1ce9a4ab444ed0c18" Mar 14 08:43:59 crc kubenswrapper[4886]: E0314 08:43:59.628192 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08bff3466e12ec8e806f917e656abe7ac683111bb44040c1ce9a4ab444ed0c18\": container with ID starting with 08bff3466e12ec8e806f917e656abe7ac683111bb44040c1ce9a4ab444ed0c18 not found: ID does not exist" containerID="08bff3466e12ec8e806f917e656abe7ac683111bb44040c1ce9a4ab444ed0c18" Mar 14 08:43:59 crc kubenswrapper[4886]: I0314 08:43:59.628213 4886 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08bff3466e12ec8e806f917e656abe7ac683111bb44040c1ce9a4ab444ed0c18"} err="failed to get container status \"08bff3466e12ec8e806f917e656abe7ac683111bb44040c1ce9a4ab444ed0c18\": rpc error: code = NotFound desc = could not find container \"08bff3466e12ec8e806f917e656abe7ac683111bb44040c1ce9a4ab444ed0c18\": container with ID starting with 08bff3466e12ec8e806f917e656abe7ac683111bb44040c1ce9a4ab444ed0c18 not found: ID does not exist" Mar 14 08:44:00 crc kubenswrapper[4886]: I0314 08:44:00.206957 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557964-q999d"] Mar 14 08:44:00 crc kubenswrapper[4886]: E0314 08:44:00.207527 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2346929a-43ff-4471-a6cd-ff439f1e69f0" containerName="pull" Mar 14 08:44:00 crc kubenswrapper[4886]: I0314 08:44:00.207545 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="2346929a-43ff-4471-a6cd-ff439f1e69f0" containerName="pull" Mar 14 08:44:00 crc kubenswrapper[4886]: E0314 08:44:00.207566 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bbac7c5-fad7-49b1-9819-a61f4a551bf8" containerName="extract-utilities" Mar 14 08:44:00 crc kubenswrapper[4886]: I0314 08:44:00.207574 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bbac7c5-fad7-49b1-9819-a61f4a551bf8" containerName="extract-utilities" Mar 14 08:44:00 crc kubenswrapper[4886]: E0314 08:44:00.207590 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2346929a-43ff-4471-a6cd-ff439f1e69f0" containerName="util" Mar 14 08:44:00 crc kubenswrapper[4886]: I0314 08:44:00.207598 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="2346929a-43ff-4471-a6cd-ff439f1e69f0" containerName="util" Mar 14 08:44:00 crc kubenswrapper[4886]: E0314 08:44:00.207607 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2346929a-43ff-4471-a6cd-ff439f1e69f0" 
containerName="extract" Mar 14 08:44:00 crc kubenswrapper[4886]: I0314 08:44:00.207615 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="2346929a-43ff-4471-a6cd-ff439f1e69f0" containerName="extract" Mar 14 08:44:00 crc kubenswrapper[4886]: E0314 08:44:00.207627 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a312fb44-823b-44ec-8312-0d83b990e9cd" containerName="console" Mar 14 08:44:00 crc kubenswrapper[4886]: I0314 08:44:00.207635 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a312fb44-823b-44ec-8312-0d83b990e9cd" containerName="console" Mar 14 08:44:00 crc kubenswrapper[4886]: E0314 08:44:00.207648 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bbac7c5-fad7-49b1-9819-a61f4a551bf8" containerName="extract-content" Mar 14 08:44:00 crc kubenswrapper[4886]: I0314 08:44:00.207656 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bbac7c5-fad7-49b1-9819-a61f4a551bf8" containerName="extract-content" Mar 14 08:44:00 crc kubenswrapper[4886]: E0314 08:44:00.207666 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bbac7c5-fad7-49b1-9819-a61f4a551bf8" containerName="registry-server" Mar 14 08:44:00 crc kubenswrapper[4886]: I0314 08:44:00.207673 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bbac7c5-fad7-49b1-9819-a61f4a551bf8" containerName="registry-server" Mar 14 08:44:00 crc kubenswrapper[4886]: I0314 08:44:00.207799 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="a312fb44-823b-44ec-8312-0d83b990e9cd" containerName="console" Mar 14 08:44:00 crc kubenswrapper[4886]: I0314 08:44:00.207818 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bbac7c5-fad7-49b1-9819-a61f4a551bf8" containerName="registry-server" Mar 14 08:44:00 crc kubenswrapper[4886]: I0314 08:44:00.207832 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="2346929a-43ff-4471-a6cd-ff439f1e69f0" containerName="extract" Mar 14 08:44:00 crc 
kubenswrapper[4886]: I0314 08:44:00.208340 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557964-q999d" Mar 14 08:44:00 crc kubenswrapper[4886]: I0314 08:44:00.210618 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 08:44:00 crc kubenswrapper[4886]: I0314 08:44:00.211326 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:44:00 crc kubenswrapper[4886]: I0314 08:44:00.211699 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:44:00 crc kubenswrapper[4886]: I0314 08:44:00.215162 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557964-q999d"] Mar 14 08:44:00 crc kubenswrapper[4886]: I0314 08:44:00.388706 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s449g\" (UniqueName: \"kubernetes.io/projected/7a392902-797e-4aff-a525-34c7aaaf36d2-kube-api-access-s449g\") pod \"auto-csr-approver-29557964-q999d\" (UID: \"7a392902-797e-4aff-a525-34c7aaaf36d2\") " pod="openshift-infra/auto-csr-approver-29557964-q999d" Mar 14 08:44:00 crc kubenswrapper[4886]: I0314 08:44:00.490775 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s449g\" (UniqueName: \"kubernetes.io/projected/7a392902-797e-4aff-a525-34c7aaaf36d2-kube-api-access-s449g\") pod \"auto-csr-approver-29557964-q999d\" (UID: \"7a392902-797e-4aff-a525-34c7aaaf36d2\") " pod="openshift-infra/auto-csr-approver-29557964-q999d" Mar 14 08:44:00 crc kubenswrapper[4886]: I0314 08:44:00.524156 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s449g\" (UniqueName: \"kubernetes.io/projected/7a392902-797e-4aff-a525-34c7aaaf36d2-kube-api-access-s449g\") pod 
\"auto-csr-approver-29557964-q999d\" (UID: \"7a392902-797e-4aff-a525-34c7aaaf36d2\") " pod="openshift-infra/auto-csr-approver-29557964-q999d" Mar 14 08:44:00 crc kubenswrapper[4886]: I0314 08:44:00.823107 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557964-q999d" Mar 14 08:44:01 crc kubenswrapper[4886]: I0314 08:44:01.254362 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557964-q999d"] Mar 14 08:44:01 crc kubenswrapper[4886]: W0314 08:44:01.259667 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a392902_797e_4aff_a525_34c7aaaf36d2.slice/crio-4dbb7f670e4c73d7abb1b66b7efa41f9cf8e214f4b85ffd919ab07081c07561c WatchSource:0}: Error finding container 4dbb7f670e4c73d7abb1b66b7efa41f9cf8e214f4b85ffd919ab07081c07561c: Status 404 returned error can't find the container with id 4dbb7f670e4c73d7abb1b66b7efa41f9cf8e214f4b85ffd919ab07081c07561c Mar 14 08:44:01 crc kubenswrapper[4886]: I0314 08:44:01.427529 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bbac7c5-fad7-49b1-9819-a61f4a551bf8" path="/var/lib/kubelet/pods/8bbac7c5-fad7-49b1-9819-a61f4a551bf8/volumes" Mar 14 08:44:01 crc kubenswrapper[4886]: I0314 08:44:01.567884 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557964-q999d" event={"ID":"7a392902-797e-4aff-a525-34c7aaaf36d2","Type":"ContainerStarted","Data":"4dbb7f670e4c73d7abb1b66b7efa41f9cf8e214f4b85ffd919ab07081c07561c"} Mar 14 08:44:03 crc kubenswrapper[4886]: I0314 08:44:03.581645 4886 generic.go:334] "Generic (PLEG): container finished" podID="7a392902-797e-4aff-a525-34c7aaaf36d2" containerID="26048e0b3b800cf71b02ff1de10775ae3cc126327fcb934d7363c76de88f7810" exitCode=0 Mar 14 08:44:03 crc kubenswrapper[4886]: I0314 08:44:03.581933 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29557964-q999d" event={"ID":"7a392902-797e-4aff-a525-34c7aaaf36d2","Type":"ContainerDied","Data":"26048e0b3b800cf71b02ff1de10775ae3cc126327fcb934d7363c76de88f7810"} Mar 14 08:44:04 crc kubenswrapper[4886]: I0314 08:44:04.843684 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-68fdc74fdb-z8p2f"] Mar 14 08:44:04 crc kubenswrapper[4886]: I0314 08:44:04.844691 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-68fdc74fdb-z8p2f" Mar 14 08:44:04 crc kubenswrapper[4886]: I0314 08:44:04.846245 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 14 08:44:04 crc kubenswrapper[4886]: I0314 08:44:04.846464 4886 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 14 08:44:04 crc kubenswrapper[4886]: I0314 08:44:04.846587 4886 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-w4rmg" Mar 14 08:44:04 crc kubenswrapper[4886]: I0314 08:44:04.846713 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 14 08:44:04 crc kubenswrapper[4886]: I0314 08:44:04.850352 4886 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 14 08:44:04 crc kubenswrapper[4886]: I0314 08:44:04.858942 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-68fdc74fdb-z8p2f"] Mar 14 08:44:04 crc kubenswrapper[4886]: I0314 08:44:04.863246 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557964-q999d" Mar 14 08:44:04 crc kubenswrapper[4886]: I0314 08:44:04.987913 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s449g\" (UniqueName: \"kubernetes.io/projected/7a392902-797e-4aff-a525-34c7aaaf36d2-kube-api-access-s449g\") pod \"7a392902-797e-4aff-a525-34c7aaaf36d2\" (UID: \"7a392902-797e-4aff-a525-34c7aaaf36d2\") " Mar 14 08:44:04 crc kubenswrapper[4886]: I0314 08:44:04.988160 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/69949dc3-8b82-427f-a2f9-35ca7fa9edef-webhook-cert\") pod \"metallb-operator-controller-manager-68fdc74fdb-z8p2f\" (UID: \"69949dc3-8b82-427f-a2f9-35ca7fa9edef\") " pod="metallb-system/metallb-operator-controller-manager-68fdc74fdb-z8p2f" Mar 14 08:44:04 crc kubenswrapper[4886]: I0314 08:44:04.988230 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hwj4\" (UniqueName: \"kubernetes.io/projected/69949dc3-8b82-427f-a2f9-35ca7fa9edef-kube-api-access-5hwj4\") pod \"metallb-operator-controller-manager-68fdc74fdb-z8p2f\" (UID: \"69949dc3-8b82-427f-a2f9-35ca7fa9edef\") " pod="metallb-system/metallb-operator-controller-manager-68fdc74fdb-z8p2f" Mar 14 08:44:04 crc kubenswrapper[4886]: I0314 08:44:04.988286 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/69949dc3-8b82-427f-a2f9-35ca7fa9edef-apiservice-cert\") pod \"metallb-operator-controller-manager-68fdc74fdb-z8p2f\" (UID: \"69949dc3-8b82-427f-a2f9-35ca7fa9edef\") " pod="metallb-system/metallb-operator-controller-manager-68fdc74fdb-z8p2f" Mar 14 08:44:04 crc kubenswrapper[4886]: I0314 08:44:04.992876 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7a392902-797e-4aff-a525-34c7aaaf36d2-kube-api-access-s449g" (OuterVolumeSpecName: "kube-api-access-s449g") pod "7a392902-797e-4aff-a525-34c7aaaf36d2" (UID: "7a392902-797e-4aff-a525-34c7aaaf36d2"). InnerVolumeSpecName "kube-api-access-s449g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.073860 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5d77c85fb9-g9wsd"] Mar 14 08:44:05 crc kubenswrapper[4886]: E0314 08:44:05.074109 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a392902-797e-4aff-a525-34c7aaaf36d2" containerName="oc" Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.074137 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a392902-797e-4aff-a525-34c7aaaf36d2" containerName="oc" Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.074247 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a392902-797e-4aff-a525-34c7aaaf36d2" containerName="oc" Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.074700 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5d77c85fb9-g9wsd" Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.076200 4886 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-f2z2q" Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.076583 4886 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.076606 4886 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.089137 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/69949dc3-8b82-427f-a2f9-35ca7fa9edef-webhook-cert\") pod \"metallb-operator-controller-manager-68fdc74fdb-z8p2f\" (UID: \"69949dc3-8b82-427f-a2f9-35ca7fa9edef\") " pod="metallb-system/metallb-operator-controller-manager-68fdc74fdb-z8p2f" Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.089215 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hwj4\" (UniqueName: \"kubernetes.io/projected/69949dc3-8b82-427f-a2f9-35ca7fa9edef-kube-api-access-5hwj4\") pod \"metallb-operator-controller-manager-68fdc74fdb-z8p2f\" (UID: \"69949dc3-8b82-427f-a2f9-35ca7fa9edef\") " pod="metallb-system/metallb-operator-controller-manager-68fdc74fdb-z8p2f" Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.089251 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/69949dc3-8b82-427f-a2f9-35ca7fa9edef-apiservice-cert\") pod \"metallb-operator-controller-manager-68fdc74fdb-z8p2f\" (UID: \"69949dc3-8b82-427f-a2f9-35ca7fa9edef\") " pod="metallb-system/metallb-operator-controller-manager-68fdc74fdb-z8p2f" Mar 14 08:44:05 crc 
kubenswrapper[4886]: I0314 08:44:05.089302 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s449g\" (UniqueName: \"kubernetes.io/projected/7a392902-797e-4aff-a525-34c7aaaf36d2-kube-api-access-s449g\") on node \"crc\" DevicePath \"\"" Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.096418 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/69949dc3-8b82-427f-a2f9-35ca7fa9edef-webhook-cert\") pod \"metallb-operator-controller-manager-68fdc74fdb-z8p2f\" (UID: \"69949dc3-8b82-427f-a2f9-35ca7fa9edef\") " pod="metallb-system/metallb-operator-controller-manager-68fdc74fdb-z8p2f" Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.096968 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/69949dc3-8b82-427f-a2f9-35ca7fa9edef-apiservice-cert\") pod \"metallb-operator-controller-manager-68fdc74fdb-z8p2f\" (UID: \"69949dc3-8b82-427f-a2f9-35ca7fa9edef\") " pod="metallb-system/metallb-operator-controller-manager-68fdc74fdb-z8p2f" Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.100357 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5d77c85fb9-g9wsd"] Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.128089 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hwj4\" (UniqueName: \"kubernetes.io/projected/69949dc3-8b82-427f-a2f9-35ca7fa9edef-kube-api-access-5hwj4\") pod \"metallb-operator-controller-manager-68fdc74fdb-z8p2f\" (UID: \"69949dc3-8b82-427f-a2f9-35ca7fa9edef\") " pod="metallb-system/metallb-operator-controller-manager-68fdc74fdb-z8p2f" Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.173997 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-68fdc74fdb-z8p2f" Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.190780 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/066d6d6d-3ace-4ed9-9a47-e7accd7645c8-webhook-cert\") pod \"metallb-operator-webhook-server-5d77c85fb9-g9wsd\" (UID: \"066d6d6d-3ace-4ed9-9a47-e7accd7645c8\") " pod="metallb-system/metallb-operator-webhook-server-5d77c85fb9-g9wsd" Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.190911 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/066d6d6d-3ace-4ed9-9a47-e7accd7645c8-apiservice-cert\") pod \"metallb-operator-webhook-server-5d77c85fb9-g9wsd\" (UID: \"066d6d6d-3ace-4ed9-9a47-e7accd7645c8\") " pod="metallb-system/metallb-operator-webhook-server-5d77c85fb9-g9wsd" Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.191003 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxld9\" (UniqueName: \"kubernetes.io/projected/066d6d6d-3ace-4ed9-9a47-e7accd7645c8-kube-api-access-fxld9\") pod \"metallb-operator-webhook-server-5d77c85fb9-g9wsd\" (UID: \"066d6d6d-3ace-4ed9-9a47-e7accd7645c8\") " pod="metallb-system/metallb-operator-webhook-server-5d77c85fb9-g9wsd" Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.292470 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/066d6d6d-3ace-4ed9-9a47-e7accd7645c8-webhook-cert\") pod \"metallb-operator-webhook-server-5d77c85fb9-g9wsd\" (UID: \"066d6d6d-3ace-4ed9-9a47-e7accd7645c8\") " pod="metallb-system/metallb-operator-webhook-server-5d77c85fb9-g9wsd" Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.292539 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/066d6d6d-3ace-4ed9-9a47-e7accd7645c8-apiservice-cert\") pod \"metallb-operator-webhook-server-5d77c85fb9-g9wsd\" (UID: \"066d6d6d-3ace-4ed9-9a47-e7accd7645c8\") " pod="metallb-system/metallb-operator-webhook-server-5d77c85fb9-g9wsd" Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.292590 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxld9\" (UniqueName: \"kubernetes.io/projected/066d6d6d-3ace-4ed9-9a47-e7accd7645c8-kube-api-access-fxld9\") pod \"metallb-operator-webhook-server-5d77c85fb9-g9wsd\" (UID: \"066d6d6d-3ace-4ed9-9a47-e7accd7645c8\") " pod="metallb-system/metallb-operator-webhook-server-5d77c85fb9-g9wsd" Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.307265 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/066d6d6d-3ace-4ed9-9a47-e7accd7645c8-apiservice-cert\") pod \"metallb-operator-webhook-server-5d77c85fb9-g9wsd\" (UID: \"066d6d6d-3ace-4ed9-9a47-e7accd7645c8\") " pod="metallb-system/metallb-operator-webhook-server-5d77c85fb9-g9wsd" Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.307711 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/066d6d6d-3ace-4ed9-9a47-e7accd7645c8-webhook-cert\") pod \"metallb-operator-webhook-server-5d77c85fb9-g9wsd\" (UID: \"066d6d6d-3ace-4ed9-9a47-e7accd7645c8\") " pod="metallb-system/metallb-operator-webhook-server-5d77c85fb9-g9wsd" Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.315778 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxld9\" (UniqueName: \"kubernetes.io/projected/066d6d6d-3ace-4ed9-9a47-e7accd7645c8-kube-api-access-fxld9\") pod \"metallb-operator-webhook-server-5d77c85fb9-g9wsd\" (UID: \"066d6d6d-3ace-4ed9-9a47-e7accd7645c8\") 
" pod="metallb-system/metallb-operator-webhook-server-5d77c85fb9-g9wsd" Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.388742 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5d77c85fb9-g9wsd" Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.461562 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-68fdc74fdb-z8p2f"] Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.603108 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557964-q999d" event={"ID":"7a392902-797e-4aff-a525-34c7aaaf36d2","Type":"ContainerDied","Data":"4dbb7f670e4c73d7abb1b66b7efa41f9cf8e214f4b85ffd919ab07081c07561c"} Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.603171 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dbb7f670e4c73d7abb1b66b7efa41f9cf8e214f4b85ffd919ab07081c07561c" Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.603234 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557964-q999d" Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.610137 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-68fdc74fdb-z8p2f" event={"ID":"69949dc3-8b82-427f-a2f9-35ca7fa9edef","Type":"ContainerStarted","Data":"4d50ad3e9e2ed717eaac0b99726f072a35eb5cc8a5cf3dd77f3ecf6acd230d21"} Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.766306 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5d77c85fb9-g9wsd"] Mar 14 08:44:05 crc kubenswrapper[4886]: W0314 08:44:05.775586 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod066d6d6d_3ace_4ed9_9a47_e7accd7645c8.slice/crio-7e64b384e3f392f7863a8750c46acb80c7314b7d46db2d84535d34f2cd10131b WatchSource:0}: Error finding container 7e64b384e3f392f7863a8750c46acb80c7314b7d46db2d84535d34f2cd10131b: Status 404 returned error can't find the container with id 7e64b384e3f392f7863a8750c46acb80c7314b7d46db2d84535d34f2cd10131b Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.929037 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557958-8p7xl"] Mar 14 08:44:05 crc kubenswrapper[4886]: I0314 08:44:05.935777 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557958-8p7xl"] Mar 14 08:44:06 crc kubenswrapper[4886]: I0314 08:44:06.616940 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5d77c85fb9-g9wsd" event={"ID":"066d6d6d-3ace-4ed9-9a47-e7accd7645c8","Type":"ContainerStarted","Data":"7e64b384e3f392f7863a8750c46acb80c7314b7d46db2d84535d34f2cd10131b"} Mar 14 08:44:07 crc kubenswrapper[4886]: I0314 08:44:07.431028 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="553eb1d2-6e3f-4bb9-9440-33eda99f5ec4" path="/var/lib/kubelet/pods/553eb1d2-6e3f-4bb9-9440-33eda99f5ec4/volumes" Mar 14 08:44:08 crc kubenswrapper[4886]: I0314 08:44:08.730081 4886 scope.go:117] "RemoveContainer" containerID="aa80faec0fa261e4d6eb9c6f15c0dc7ac436eaf928f475ac030fe28cdd944b3f" Mar 14 08:44:11 crc kubenswrapper[4886]: I0314 08:44:11.663558 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5d77c85fb9-g9wsd" event={"ID":"066d6d6d-3ace-4ed9-9a47-e7accd7645c8","Type":"ContainerStarted","Data":"ccf2fd8fdca78f03664fc9b5614af1602bf1b7e415dada663226c6f4eb24edec"} Mar 14 08:44:11 crc kubenswrapper[4886]: I0314 08:44:11.663879 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5d77c85fb9-g9wsd" Mar 14 08:44:11 crc kubenswrapper[4886]: I0314 08:44:11.665408 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-68fdc74fdb-z8p2f" event={"ID":"69949dc3-8b82-427f-a2f9-35ca7fa9edef","Type":"ContainerStarted","Data":"8c1c0a6614a7c8c4e8d8d729d6d3804fcffc22979c01e3d382a5d20ff031be40"} Mar 14 08:44:11 crc kubenswrapper[4886]: I0314 08:44:11.665922 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-68fdc74fdb-z8p2f" Mar 14 08:44:11 crc kubenswrapper[4886]: I0314 08:44:11.680371 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5d77c85fb9-g9wsd" podStartSLOduration=1.5016472950000002 podStartE2EDuration="6.680352685s" podCreationTimestamp="2026-03-14 08:44:05 +0000 UTC" firstStartedPulling="2026-03-14 08:44:05.778591205 +0000 UTC m=+981.027042842" lastFinishedPulling="2026-03-14 08:44:10.957296595 +0000 UTC m=+986.205748232" observedRunningTime="2026-03-14 08:44:11.679759298 +0000 UTC m=+986.928210945" watchObservedRunningTime="2026-03-14 
08:44:11.680352685 +0000 UTC m=+986.928804322" Mar 14 08:44:11 crc kubenswrapper[4886]: I0314 08:44:11.702433 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-68fdc74fdb-z8p2f" podStartSLOduration=2.56189016 podStartE2EDuration="7.702414961s" podCreationTimestamp="2026-03-14 08:44:04 +0000 UTC" firstStartedPulling="2026-03-14 08:44:05.481195297 +0000 UTC m=+980.729646934" lastFinishedPulling="2026-03-14 08:44:10.621720098 +0000 UTC m=+985.870171735" observedRunningTime="2026-03-14 08:44:11.698445777 +0000 UTC m=+986.946897414" watchObservedRunningTime="2026-03-14 08:44:11.702414961 +0000 UTC m=+986.950866598" Mar 14 08:44:25 crc kubenswrapper[4886]: I0314 08:44:25.394557 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5d77c85fb9-g9wsd" Mar 14 08:44:26 crc kubenswrapper[4886]: I0314 08:44:26.066905 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:44:26 crc kubenswrapper[4886]: I0314 08:44:26.067025 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:44:26 crc kubenswrapper[4886]: I0314 08:44:26.067088 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 08:44:26 crc kubenswrapper[4886]: I0314 08:44:26.067706 4886 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c3f165f2b40174eab0175613b347b88b554cd9e063558e15142f42dfea385fce"} pod="openshift-machine-config-operator/machine-config-daemon-ddctv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 08:44:26 crc kubenswrapper[4886]: I0314 08:44:26.067773 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" containerID="cri-o://c3f165f2b40174eab0175613b347b88b554cd9e063558e15142f42dfea385fce" gracePeriod=600 Mar 14 08:44:26 crc kubenswrapper[4886]: I0314 08:44:26.746916 4886 generic.go:334] "Generic (PLEG): container finished" podID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerID="c3f165f2b40174eab0175613b347b88b554cd9e063558e15142f42dfea385fce" exitCode=0 Mar 14 08:44:26 crc kubenswrapper[4886]: I0314 08:44:26.746936 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerDied","Data":"c3f165f2b40174eab0175613b347b88b554cd9e063558e15142f42dfea385fce"} Mar 14 08:44:26 crc kubenswrapper[4886]: I0314 08:44:26.747692 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerStarted","Data":"7738a099ca236f81766457fa9d5c5fb3f046ded018935c8fbb545666a40042f4"} Mar 14 08:44:26 crc kubenswrapper[4886]: I0314 08:44:26.747711 4886 scope.go:117] "RemoveContainer" containerID="f6954217ba64ea552ccd815951f515123e9e534013eb3eb6220b2286262b3047" Mar 14 08:44:45 crc kubenswrapper[4886]: I0314 08:44:45.177916 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/metallb-operator-controller-manager-68fdc74fdb-z8p2f" Mar 14 08:44:45 crc kubenswrapper[4886]: I0314 08:44:45.847304 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-54lvd"] Mar 14 08:44:45 crc kubenswrapper[4886]: I0314 08:44:45.850404 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-54lvd" Mar 14 08:44:45 crc kubenswrapper[4886]: I0314 08:44:45.857830 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 14 08:44:45 crc kubenswrapper[4886]: I0314 08:44:45.858048 4886 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 14 08:44:45 crc kubenswrapper[4886]: I0314 08:44:45.858224 4886 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-k7w9s" Mar 14 08:44:45 crc kubenswrapper[4886]: I0314 08:44:45.860491 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-9qwls"] Mar 14 08:44:45 crc kubenswrapper[4886]: I0314 08:44:45.869548 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-9qwls" Mar 14 08:44:45 crc kubenswrapper[4886]: I0314 08:44:45.873401 4886 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 14 08:44:45 crc kubenswrapper[4886]: I0314 08:44:45.888808 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-9qwls"] Mar 14 08:44:45 crc kubenswrapper[4886]: I0314 08:44:45.944682 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-8qrlt"] Mar 14 08:44:45 crc kubenswrapper[4886]: I0314 08:44:45.945632 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-8qrlt" Mar 14 08:44:45 crc kubenswrapper[4886]: I0314 08:44:45.950315 4886 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 14 08:44:45 crc kubenswrapper[4886]: I0314 08:44:45.950587 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 14 08:44:45 crc kubenswrapper[4886]: I0314 08:44:45.950767 4886 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-qv8p6" Mar 14 08:44:45 crc kubenswrapper[4886]: I0314 08:44:45.951010 4886 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 14 08:44:45 crc kubenswrapper[4886]: I0314 08:44:45.953653 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-qw82c"] Mar 14 08:44:45 crc kubenswrapper[4886]: I0314 08:44:45.956519 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-qw82c" Mar 14 08:44:45 crc kubenswrapper[4886]: I0314 08:44:45.957934 4886 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 14 08:44:45 crc kubenswrapper[4886]: I0314 08:44:45.984341 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-qw82c"] Mar 14 08:44:45 crc kubenswrapper[4886]: I0314 08:44:45.997054 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a1dc9df2-9dd0-40af-9508-b65d1047b045-frr-sockets\") pod \"frr-k8s-54lvd\" (UID: \"a1dc9df2-9dd0-40af-9508-b65d1047b045\") " pod="metallb-system/frr-k8s-54lvd" Mar 14 08:44:45 crc kubenswrapper[4886]: I0314 08:44:45.997108 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a1dc9df2-9dd0-40af-9508-b65d1047b045-frr-startup\") pod \"frr-k8s-54lvd\" (UID: \"a1dc9df2-9dd0-40af-9508-b65d1047b045\") " pod="metallb-system/frr-k8s-54lvd" Mar 14 08:44:45 crc kubenswrapper[4886]: I0314 08:44:45.997153 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvrvn\" (UniqueName: \"kubernetes.io/projected/a1dc9df2-9dd0-40af-9508-b65d1047b045-kube-api-access-dvrvn\") pod \"frr-k8s-54lvd\" (UID: \"a1dc9df2-9dd0-40af-9508-b65d1047b045\") " pod="metallb-system/frr-k8s-54lvd" Mar 14 08:44:45 crc kubenswrapper[4886]: I0314 08:44:45.997225 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbjkh\" (UniqueName: \"kubernetes.io/projected/c90b844b-a223-4b46-93ea-ae5826e1d282-kube-api-access-dbjkh\") pod \"frr-k8s-webhook-server-bcc4b6f68-9qwls\" (UID: \"c90b844b-a223-4b46-93ea-ae5826e1d282\") " 
pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-9qwls" Mar 14 08:44:45 crc kubenswrapper[4886]: I0314 08:44:45.997266 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a1dc9df2-9dd0-40af-9508-b65d1047b045-frr-conf\") pod \"frr-k8s-54lvd\" (UID: \"a1dc9df2-9dd0-40af-9508-b65d1047b045\") " pod="metallb-system/frr-k8s-54lvd" Mar 14 08:44:45 crc kubenswrapper[4886]: I0314 08:44:45.997321 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1dc9df2-9dd0-40af-9508-b65d1047b045-metrics-certs\") pod \"frr-k8s-54lvd\" (UID: \"a1dc9df2-9dd0-40af-9508-b65d1047b045\") " pod="metallb-system/frr-k8s-54lvd" Mar 14 08:44:45 crc kubenswrapper[4886]: I0314 08:44:45.997357 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c90b844b-a223-4b46-93ea-ae5826e1d282-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-9qwls\" (UID: \"c90b844b-a223-4b46-93ea-ae5826e1d282\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-9qwls" Mar 14 08:44:45 crc kubenswrapper[4886]: I0314 08:44:45.997422 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a1dc9df2-9dd0-40af-9508-b65d1047b045-metrics\") pod \"frr-k8s-54lvd\" (UID: \"a1dc9df2-9dd0-40af-9508-b65d1047b045\") " pod="metallb-system/frr-k8s-54lvd" Mar 14 08:44:45 crc kubenswrapper[4886]: I0314 08:44:45.997463 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a1dc9df2-9dd0-40af-9508-b65d1047b045-reloader\") pod \"frr-k8s-54lvd\" (UID: \"a1dc9df2-9dd0-40af-9508-b65d1047b045\") " pod="metallb-system/frr-k8s-54lvd" Mar 14 08:44:46 crc kubenswrapper[4886]: 
I0314 08:44:46.099279 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1dc9df2-9dd0-40af-9508-b65d1047b045-metrics-certs\") pod \"frr-k8s-54lvd\" (UID: \"a1dc9df2-9dd0-40af-9508-b65d1047b045\") " pod="metallb-system/frr-k8s-54lvd" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.099356 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02c262ba-c8d2-4321-858a-eeb91709b8fc-cert\") pod \"controller-7bb4cc7c98-qw82c\" (UID: \"02c262ba-c8d2-4321-858a-eeb91709b8fc\") " pod="metallb-system/controller-7bb4cc7c98-qw82c" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.099390 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qtdj\" (UniqueName: \"kubernetes.io/projected/02c262ba-c8d2-4321-858a-eeb91709b8fc-kube-api-access-8qtdj\") pod \"controller-7bb4cc7c98-qw82c\" (UID: \"02c262ba-c8d2-4321-858a-eeb91709b8fc\") " pod="metallb-system/controller-7bb4cc7c98-qw82c" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.099417 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d03e38bf-73ce-44a1-8b84-8e23d2e33a86-metrics-certs\") pod \"speaker-8qrlt\" (UID: \"d03e38bf-73ce-44a1-8b84-8e23d2e33a86\") " pod="metallb-system/speaker-8qrlt" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.099437 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c90b844b-a223-4b46-93ea-ae5826e1d282-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-9qwls\" (UID: \"c90b844b-a223-4b46-93ea-ae5826e1d282\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-9qwls" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.099667 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf6wb\" (UniqueName: \"kubernetes.io/projected/d03e38bf-73ce-44a1-8b84-8e23d2e33a86-kube-api-access-kf6wb\") pod \"speaker-8qrlt\" (UID: \"d03e38bf-73ce-44a1-8b84-8e23d2e33a86\") " pod="metallb-system/speaker-8qrlt" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.099744 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a1dc9df2-9dd0-40af-9508-b65d1047b045-metrics\") pod \"frr-k8s-54lvd\" (UID: \"a1dc9df2-9dd0-40af-9508-b65d1047b045\") " pod="metallb-system/frr-k8s-54lvd" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.099844 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a1dc9df2-9dd0-40af-9508-b65d1047b045-reloader\") pod \"frr-k8s-54lvd\" (UID: \"a1dc9df2-9dd0-40af-9508-b65d1047b045\") " pod="metallb-system/frr-k8s-54lvd" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.099902 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a1dc9df2-9dd0-40af-9508-b65d1047b045-frr-sockets\") pod \"frr-k8s-54lvd\" (UID: \"a1dc9df2-9dd0-40af-9508-b65d1047b045\") " pod="metallb-system/frr-k8s-54lvd" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.099959 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a1dc9df2-9dd0-40af-9508-b65d1047b045-frr-startup\") pod \"frr-k8s-54lvd\" (UID: \"a1dc9df2-9dd0-40af-9508-b65d1047b045\") " pod="metallb-system/frr-k8s-54lvd" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.099976 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvrvn\" (UniqueName: \"kubernetes.io/projected/a1dc9df2-9dd0-40af-9508-b65d1047b045-kube-api-access-dvrvn\") pod 
\"frr-k8s-54lvd\" (UID: \"a1dc9df2-9dd0-40af-9508-b65d1047b045\") " pod="metallb-system/frr-k8s-54lvd" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.100016 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d03e38bf-73ce-44a1-8b84-8e23d2e33a86-memberlist\") pod \"speaker-8qrlt\" (UID: \"d03e38bf-73ce-44a1-8b84-8e23d2e33a86\") " pod="metallb-system/speaker-8qrlt" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.100059 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbjkh\" (UniqueName: \"kubernetes.io/projected/c90b844b-a223-4b46-93ea-ae5826e1d282-kube-api-access-dbjkh\") pod \"frr-k8s-webhook-server-bcc4b6f68-9qwls\" (UID: \"c90b844b-a223-4b46-93ea-ae5826e1d282\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-9qwls" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.100087 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a1dc9df2-9dd0-40af-9508-b65d1047b045-frr-conf\") pod \"frr-k8s-54lvd\" (UID: \"a1dc9df2-9dd0-40af-9508-b65d1047b045\") " pod="metallb-system/frr-k8s-54lvd" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.100194 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d03e38bf-73ce-44a1-8b84-8e23d2e33a86-metallb-excludel2\") pod \"speaker-8qrlt\" (UID: \"d03e38bf-73ce-44a1-8b84-8e23d2e33a86\") " pod="metallb-system/speaker-8qrlt" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.100248 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02c262ba-c8d2-4321-858a-eeb91709b8fc-metrics-certs\") pod \"controller-7bb4cc7c98-qw82c\" (UID: \"02c262ba-c8d2-4321-858a-eeb91709b8fc\") 
" pod="metallb-system/controller-7bb4cc7c98-qw82c" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.100264 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a1dc9df2-9dd0-40af-9508-b65d1047b045-reloader\") pod \"frr-k8s-54lvd\" (UID: \"a1dc9df2-9dd0-40af-9508-b65d1047b045\") " pod="metallb-system/frr-k8s-54lvd" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.100329 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a1dc9df2-9dd0-40af-9508-b65d1047b045-frr-sockets\") pod \"frr-k8s-54lvd\" (UID: \"a1dc9df2-9dd0-40af-9508-b65d1047b045\") " pod="metallb-system/frr-k8s-54lvd" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.100446 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a1dc9df2-9dd0-40af-9508-b65d1047b045-metrics\") pod \"frr-k8s-54lvd\" (UID: \"a1dc9df2-9dd0-40af-9508-b65d1047b045\") " pod="metallb-system/frr-k8s-54lvd" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.100628 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a1dc9df2-9dd0-40af-9508-b65d1047b045-frr-conf\") pod \"frr-k8s-54lvd\" (UID: \"a1dc9df2-9dd0-40af-9508-b65d1047b045\") " pod="metallb-system/frr-k8s-54lvd" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.100817 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a1dc9df2-9dd0-40af-9508-b65d1047b045-frr-startup\") pod \"frr-k8s-54lvd\" (UID: \"a1dc9df2-9dd0-40af-9508-b65d1047b045\") " pod="metallb-system/frr-k8s-54lvd" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.107325 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/c90b844b-a223-4b46-93ea-ae5826e1d282-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-9qwls\" (UID: \"c90b844b-a223-4b46-93ea-ae5826e1d282\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-9qwls" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.109688 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1dc9df2-9dd0-40af-9508-b65d1047b045-metrics-certs\") pod \"frr-k8s-54lvd\" (UID: \"a1dc9df2-9dd0-40af-9508-b65d1047b045\") " pod="metallb-system/frr-k8s-54lvd" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.118860 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvrvn\" (UniqueName: \"kubernetes.io/projected/a1dc9df2-9dd0-40af-9508-b65d1047b045-kube-api-access-dvrvn\") pod \"frr-k8s-54lvd\" (UID: \"a1dc9df2-9dd0-40af-9508-b65d1047b045\") " pod="metallb-system/frr-k8s-54lvd" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.119944 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbjkh\" (UniqueName: \"kubernetes.io/projected/c90b844b-a223-4b46-93ea-ae5826e1d282-kube-api-access-dbjkh\") pod \"frr-k8s-webhook-server-bcc4b6f68-9qwls\" (UID: \"c90b844b-a223-4b46-93ea-ae5826e1d282\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-9qwls" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.170281 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-54lvd" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.202268 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02c262ba-c8d2-4321-858a-eeb91709b8fc-metrics-certs\") pod \"controller-7bb4cc7c98-qw82c\" (UID: \"02c262ba-c8d2-4321-858a-eeb91709b8fc\") " pod="metallb-system/controller-7bb4cc7c98-qw82c" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.202424 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02c262ba-c8d2-4321-858a-eeb91709b8fc-cert\") pod \"controller-7bb4cc7c98-qw82c\" (UID: \"02c262ba-c8d2-4321-858a-eeb91709b8fc\") " pod="metallb-system/controller-7bb4cc7c98-qw82c" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.202479 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qtdj\" (UniqueName: \"kubernetes.io/projected/02c262ba-c8d2-4321-858a-eeb91709b8fc-kube-api-access-8qtdj\") pod \"controller-7bb4cc7c98-qw82c\" (UID: \"02c262ba-c8d2-4321-858a-eeb91709b8fc\") " pod="metallb-system/controller-7bb4cc7c98-qw82c" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.202514 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d03e38bf-73ce-44a1-8b84-8e23d2e33a86-metrics-certs\") pod \"speaker-8qrlt\" (UID: \"d03e38bf-73ce-44a1-8b84-8e23d2e33a86\") " pod="metallb-system/speaker-8qrlt" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.202559 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf6wb\" (UniqueName: \"kubernetes.io/projected/d03e38bf-73ce-44a1-8b84-8e23d2e33a86-kube-api-access-kf6wb\") pod \"speaker-8qrlt\" (UID: \"d03e38bf-73ce-44a1-8b84-8e23d2e33a86\") " pod="metallb-system/speaker-8qrlt" Mar 14 08:44:46 crc kubenswrapper[4886]: 
I0314 08:44:46.202621 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d03e38bf-73ce-44a1-8b84-8e23d2e33a86-memberlist\") pod \"speaker-8qrlt\" (UID: \"d03e38bf-73ce-44a1-8b84-8e23d2e33a86\") " pod="metallb-system/speaker-8qrlt" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.202668 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d03e38bf-73ce-44a1-8b84-8e23d2e33a86-metallb-excludel2\") pod \"speaker-8qrlt\" (UID: \"d03e38bf-73ce-44a1-8b84-8e23d2e33a86\") " pod="metallb-system/speaker-8qrlt" Mar 14 08:44:46 crc kubenswrapper[4886]: E0314 08:44:46.202864 4886 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 14 08:44:46 crc kubenswrapper[4886]: E0314 08:44:46.202959 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d03e38bf-73ce-44a1-8b84-8e23d2e33a86-metrics-certs podName:d03e38bf-73ce-44a1-8b84-8e23d2e33a86 nodeName:}" failed. No retries permitted until 2026-03-14 08:44:46.702922941 +0000 UTC m=+1021.951374748 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d03e38bf-73ce-44a1-8b84-8e23d2e33a86-metrics-certs") pod "speaker-8qrlt" (UID: "d03e38bf-73ce-44a1-8b84-8e23d2e33a86") : secret "speaker-certs-secret" not found Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.203829 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d03e38bf-73ce-44a1-8b84-8e23d2e33a86-metallb-excludel2\") pod \"speaker-8qrlt\" (UID: \"d03e38bf-73ce-44a1-8b84-8e23d2e33a86\") " pod="metallb-system/speaker-8qrlt" Mar 14 08:44:46 crc kubenswrapper[4886]: E0314 08:44:46.203910 4886 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 14 08:44:46 crc kubenswrapper[4886]: E0314 08:44:46.203966 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d03e38bf-73ce-44a1-8b84-8e23d2e33a86-memberlist podName:d03e38bf-73ce-44a1-8b84-8e23d2e33a86 nodeName:}" failed. No retries permitted until 2026-03-14 08:44:46.703950611 +0000 UTC m=+1021.952402468 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d03e38bf-73ce-44a1-8b84-8e23d2e33a86-memberlist") pod "speaker-8qrlt" (UID: "d03e38bf-73ce-44a1-8b84-8e23d2e33a86") : secret "metallb-memberlist" not found Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.204056 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-9qwls" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.208765 4886 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.210992 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02c262ba-c8d2-4321-858a-eeb91709b8fc-metrics-certs\") pod \"controller-7bb4cc7c98-qw82c\" (UID: \"02c262ba-c8d2-4321-858a-eeb91709b8fc\") " pod="metallb-system/controller-7bb4cc7c98-qw82c" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.219495 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02c262ba-c8d2-4321-858a-eeb91709b8fc-cert\") pod \"controller-7bb4cc7c98-qw82c\" (UID: \"02c262ba-c8d2-4321-858a-eeb91709b8fc\") " pod="metallb-system/controller-7bb4cc7c98-qw82c" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.225681 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qtdj\" (UniqueName: \"kubernetes.io/projected/02c262ba-c8d2-4321-858a-eeb91709b8fc-kube-api-access-8qtdj\") pod \"controller-7bb4cc7c98-qw82c\" (UID: \"02c262ba-c8d2-4321-858a-eeb91709b8fc\") " pod="metallb-system/controller-7bb4cc7c98-qw82c" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.225734 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf6wb\" (UniqueName: \"kubernetes.io/projected/d03e38bf-73ce-44a1-8b84-8e23d2e33a86-kube-api-access-kf6wb\") pod \"speaker-8qrlt\" (UID: \"d03e38bf-73ce-44a1-8b84-8e23d2e33a86\") " pod="metallb-system/speaker-8qrlt" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.277341 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-qw82c" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.487163 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-qw82c"] Mar 14 08:44:46 crc kubenswrapper[4886]: W0314 08:44:46.491620 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02c262ba_c8d2_4321_858a_eeb91709b8fc.slice/crio-065369b9c2c719986d2ecbe98ec3b12a1f0c873405b2812b09722de3cb7231e9 WatchSource:0}: Error finding container 065369b9c2c719986d2ecbe98ec3b12a1f0c873405b2812b09722de3cb7231e9: Status 404 returned error can't find the container with id 065369b9c2c719986d2ecbe98ec3b12a1f0c873405b2812b09722de3cb7231e9 Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.646539 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-9qwls"] Mar 14 08:44:46 crc kubenswrapper[4886]: W0314 08:44:46.653817 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc90b844b_a223_4b46_93ea_ae5826e1d282.slice/crio-8dfe2930884bc5347eb118413c60d8a5a64445661ba6947845b2c76877342d5c WatchSource:0}: Error finding container 8dfe2930884bc5347eb118413c60d8a5a64445661ba6947845b2c76877342d5c: Status 404 returned error can't find the container with id 8dfe2930884bc5347eb118413c60d8a5a64445661ba6947845b2c76877342d5c Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.709269 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d03e38bf-73ce-44a1-8b84-8e23d2e33a86-memberlist\") pod \"speaker-8qrlt\" (UID: \"d03e38bf-73ce-44a1-8b84-8e23d2e33a86\") " pod="metallb-system/speaker-8qrlt" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.709379 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d03e38bf-73ce-44a1-8b84-8e23d2e33a86-metrics-certs\") pod \"speaker-8qrlt\" (UID: \"d03e38bf-73ce-44a1-8b84-8e23d2e33a86\") " pod="metallb-system/speaker-8qrlt" Mar 14 08:44:46 crc kubenswrapper[4886]: E0314 08:44:46.709487 4886 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 14 08:44:46 crc kubenswrapper[4886]: E0314 08:44:46.709576 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d03e38bf-73ce-44a1-8b84-8e23d2e33a86-memberlist podName:d03e38bf-73ce-44a1-8b84-8e23d2e33a86 nodeName:}" failed. No retries permitted until 2026-03-14 08:44:47.709552567 +0000 UTC m=+1022.958004264 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d03e38bf-73ce-44a1-8b84-8e23d2e33a86-memberlist") pod "speaker-8qrlt" (UID: "d03e38bf-73ce-44a1-8b84-8e23d2e33a86") : secret "metallb-memberlist" not found Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.714790 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d03e38bf-73ce-44a1-8b84-8e23d2e33a86-metrics-certs\") pod \"speaker-8qrlt\" (UID: \"d03e38bf-73ce-44a1-8b84-8e23d2e33a86\") " pod="metallb-system/speaker-8qrlt" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.905533 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-54lvd" event={"ID":"a1dc9df2-9dd0-40af-9508-b65d1047b045","Type":"ContainerStarted","Data":"3d68201f385791892f921c78f7415d584fd1b2dd72823764a30835abab53b042"} Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.906433 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-9qwls" event={"ID":"c90b844b-a223-4b46-93ea-ae5826e1d282","Type":"ContainerStarted","Data":"8dfe2930884bc5347eb118413c60d8a5a64445661ba6947845b2c76877342d5c"} Mar 
14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.909139 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-qw82c" event={"ID":"02c262ba-c8d2-4321-858a-eeb91709b8fc","Type":"ContainerStarted","Data":"a5fb8e22d0e3bb5d14cbddb54e9be237ff6de40793f91112aaa8db8dc06f9eb2"} Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.909186 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-qw82c" event={"ID":"02c262ba-c8d2-4321-858a-eeb91709b8fc","Type":"ContainerStarted","Data":"7e65d957c2499d06b17c11b9586500db7b801d4651092b335d0a6d004e9b022d"} Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.909200 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-qw82c" event={"ID":"02c262ba-c8d2-4321-858a-eeb91709b8fc","Type":"ContainerStarted","Data":"065369b9c2c719986d2ecbe98ec3b12a1f0c873405b2812b09722de3cb7231e9"} Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.909293 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-qw82c" Mar 14 08:44:46 crc kubenswrapper[4886]: I0314 08:44:46.926070 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-qw82c" podStartSLOduration=1.926048803 podStartE2EDuration="1.926048803s" podCreationTimestamp="2026-03-14 08:44:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:44:46.922903193 +0000 UTC m=+1022.171354860" watchObservedRunningTime="2026-03-14 08:44:46.926048803 +0000 UTC m=+1022.174500440" Mar 14 08:44:47 crc kubenswrapper[4886]: I0314 08:44:47.723619 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d03e38bf-73ce-44a1-8b84-8e23d2e33a86-memberlist\") pod \"speaker-8qrlt\" (UID: 
\"d03e38bf-73ce-44a1-8b84-8e23d2e33a86\") " pod="metallb-system/speaker-8qrlt" Mar 14 08:44:47 crc kubenswrapper[4886]: I0314 08:44:47.728073 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d03e38bf-73ce-44a1-8b84-8e23d2e33a86-memberlist\") pod \"speaker-8qrlt\" (UID: \"d03e38bf-73ce-44a1-8b84-8e23d2e33a86\") " pod="metallb-system/speaker-8qrlt" Mar 14 08:44:47 crc kubenswrapper[4886]: I0314 08:44:47.769965 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-8qrlt" Mar 14 08:44:47 crc kubenswrapper[4886]: I0314 08:44:47.950555 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8qrlt" event={"ID":"d03e38bf-73ce-44a1-8b84-8e23d2e33a86","Type":"ContainerStarted","Data":"d9692ed5042f67b89e7da58af3673cbe25155335a4bb7e7ca87c8195a5f72c7c"} Mar 14 08:44:48 crc kubenswrapper[4886]: I0314 08:44:48.971483 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8qrlt" event={"ID":"d03e38bf-73ce-44a1-8b84-8e23d2e33a86","Type":"ContainerStarted","Data":"1111de7139465d65b87bdc5474fc3ef51330079fde3968ba133ced87067f7006"} Mar 14 08:44:48 crc kubenswrapper[4886]: I0314 08:44:48.971828 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8qrlt" event={"ID":"d03e38bf-73ce-44a1-8b84-8e23d2e33a86","Type":"ContainerStarted","Data":"6bb5e57024cdef39abb1507b6ac38570ad1ac1273ef3a82e9398b58f82d437b7"} Mar 14 08:44:48 crc kubenswrapper[4886]: I0314 08:44:48.971848 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-8qrlt" Mar 14 08:44:48 crc kubenswrapper[4886]: I0314 08:44:48.991860 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-8qrlt" podStartSLOduration=3.991841646 podStartE2EDuration="3.991841646s" podCreationTimestamp="2026-03-14 08:44:45 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:44:48.98851281 +0000 UTC m=+1024.236964467" watchObservedRunningTime="2026-03-14 08:44:48.991841646 +0000 UTC m=+1024.240293283" Mar 14 08:44:54 crc kubenswrapper[4886]: I0314 08:44:54.002981 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-9qwls" event={"ID":"c90b844b-a223-4b46-93ea-ae5826e1d282","Type":"ContainerStarted","Data":"233a8240342e0b7c239b8d3ceb723984dfad241abc43c6e7cbb22577e3f76e57"} Mar 14 08:44:54 crc kubenswrapper[4886]: I0314 08:44:54.004414 4886 generic.go:334] "Generic (PLEG): container finished" podID="a1dc9df2-9dd0-40af-9508-b65d1047b045" containerID="2d6843a8f22f96128d8ee0b821af0f0bbaa989475f1524a81db4bb2651ca0f0e" exitCode=0 Mar 14 08:44:54 crc kubenswrapper[4886]: I0314 08:44:54.004720 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-9qwls" Mar 14 08:44:54 crc kubenswrapper[4886]: I0314 08:44:54.004867 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-54lvd" event={"ID":"a1dc9df2-9dd0-40af-9508-b65d1047b045","Type":"ContainerDied","Data":"2d6843a8f22f96128d8ee0b821af0f0bbaa989475f1524a81db4bb2651ca0f0e"} Mar 14 08:44:54 crc kubenswrapper[4886]: I0314 08:44:54.023649 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-9qwls" podStartSLOduration=2.382165033 podStartE2EDuration="9.023597542s" podCreationTimestamp="2026-03-14 08:44:45 +0000 UTC" firstStartedPulling="2026-03-14 08:44:46.665809296 +0000 UTC m=+1021.914260933" lastFinishedPulling="2026-03-14 08:44:53.307241805 +0000 UTC m=+1028.555693442" observedRunningTime="2026-03-14 08:44:54.021616765 +0000 UTC m=+1029.270068412" watchObservedRunningTime="2026-03-14 08:44:54.023597542 +0000 UTC m=+1029.272049199" Mar 14 
08:44:55 crc kubenswrapper[4886]: I0314 08:44:55.012785 4886 generic.go:334] "Generic (PLEG): container finished" podID="a1dc9df2-9dd0-40af-9508-b65d1047b045" containerID="4791bcc9607116e837ee160963f9546a1d5a1ef212b22ab0bce2ba96a4114a76" exitCode=0 Mar 14 08:44:55 crc kubenswrapper[4886]: I0314 08:44:55.012902 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-54lvd" event={"ID":"a1dc9df2-9dd0-40af-9508-b65d1047b045","Type":"ContainerDied","Data":"4791bcc9607116e837ee160963f9546a1d5a1ef212b22ab0bce2ba96a4114a76"} Mar 14 08:44:56 crc kubenswrapper[4886]: I0314 08:44:56.020577 4886 generic.go:334] "Generic (PLEG): container finished" podID="a1dc9df2-9dd0-40af-9508-b65d1047b045" containerID="a9e846a00a887e90d9a637ebc54dbe6b89c8a8da441f292c6d6eb66bf08547ad" exitCode=0 Mar 14 08:44:56 crc kubenswrapper[4886]: I0314 08:44:56.020775 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-54lvd" event={"ID":"a1dc9df2-9dd0-40af-9508-b65d1047b045","Type":"ContainerDied","Data":"a9e846a00a887e90d9a637ebc54dbe6b89c8a8da441f292c6d6eb66bf08547ad"} Mar 14 08:44:56 crc kubenswrapper[4886]: I0314 08:44:56.280571 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-qw82c" Mar 14 08:44:57 crc kubenswrapper[4886]: I0314 08:44:57.040221 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-54lvd" event={"ID":"a1dc9df2-9dd0-40af-9508-b65d1047b045","Type":"ContainerStarted","Data":"9b626e2e8e34bcf978b62bd8291683ae6cb6375b0ee4bac2ef41d0235993bcdd"} Mar 14 08:44:57 crc kubenswrapper[4886]: I0314 08:44:57.041458 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-54lvd" Mar 14 08:44:57 crc kubenswrapper[4886]: I0314 08:44:57.041550 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-54lvd" 
event={"ID":"a1dc9df2-9dd0-40af-9508-b65d1047b045","Type":"ContainerStarted","Data":"8b04c2d7457a78b4f890d3a4f4ace33c645f8d7ceb00de2d6002f01bcdad27e8"} Mar 14 08:44:57 crc kubenswrapper[4886]: I0314 08:44:57.041629 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-54lvd" event={"ID":"a1dc9df2-9dd0-40af-9508-b65d1047b045","Type":"ContainerStarted","Data":"a638b4989593b17b22ba5901d02647cba039b5e4dbaf808a1c867386cd8f4598"} Mar 14 08:44:57 crc kubenswrapper[4886]: I0314 08:44:57.041707 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-54lvd" event={"ID":"a1dc9df2-9dd0-40af-9508-b65d1047b045","Type":"ContainerStarted","Data":"3b5f9a4558c91436d59a3a9917642e056f5f1918ecf98e2feba9849b980fc031"} Mar 14 08:44:57 crc kubenswrapper[4886]: I0314 08:44:57.041774 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-54lvd" event={"ID":"a1dc9df2-9dd0-40af-9508-b65d1047b045","Type":"ContainerStarted","Data":"ac814a1017c79ed4228923f11dedc1f97128198b0603ac79471a9009e074f392"} Mar 14 08:44:57 crc kubenswrapper[4886]: I0314 08:44:57.041855 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-54lvd" event={"ID":"a1dc9df2-9dd0-40af-9508-b65d1047b045","Type":"ContainerStarted","Data":"8c9a3c297f91deaa8d860a8f6baecfd3a4a14ce637503827a830c48019b5804c"} Mar 14 08:44:57 crc kubenswrapper[4886]: I0314 08:44:57.064694 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-54lvd" podStartSLOduration=5.095625073 podStartE2EDuration="12.064676261s" podCreationTimestamp="2026-03-14 08:44:45 +0000 UTC" firstStartedPulling="2026-03-14 08:44:46.36222436 +0000 UTC m=+1021.610676007" lastFinishedPulling="2026-03-14 08:44:53.331275558 +0000 UTC m=+1028.579727195" observedRunningTime="2026-03-14 08:44:57.060944003 +0000 UTC m=+1032.309395650" watchObservedRunningTime="2026-03-14 08:44:57.064676261 +0000 UTC m=+1032.313127918" Mar 14 
08:45:00 crc kubenswrapper[4886]: I0314 08:45:00.128969 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557965-8h4j2"] Mar 14 08:45:00 crc kubenswrapper[4886]: I0314 08:45:00.130050 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-8h4j2" Mar 14 08:45:00 crc kubenswrapper[4886]: I0314 08:45:00.132265 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 08:45:00 crc kubenswrapper[4886]: I0314 08:45:00.133760 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 08:45:00 crc kubenswrapper[4886]: I0314 08:45:00.141602 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557965-8h4j2"] Mar 14 08:45:00 crc kubenswrapper[4886]: I0314 08:45:00.223325 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clhr2\" (UniqueName: \"kubernetes.io/projected/eecaac41-bd76-496a-bf1c-61c2ee287386-kube-api-access-clhr2\") pod \"collect-profiles-29557965-8h4j2\" (UID: \"eecaac41-bd76-496a-bf1c-61c2ee287386\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-8h4j2" Mar 14 08:45:00 crc kubenswrapper[4886]: I0314 08:45:00.223742 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eecaac41-bd76-496a-bf1c-61c2ee287386-config-volume\") pod \"collect-profiles-29557965-8h4j2\" (UID: \"eecaac41-bd76-496a-bf1c-61c2ee287386\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-8h4j2" Mar 14 08:45:00 crc kubenswrapper[4886]: I0314 08:45:00.223761 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eecaac41-bd76-496a-bf1c-61c2ee287386-secret-volume\") pod \"collect-profiles-29557965-8h4j2\" (UID: \"eecaac41-bd76-496a-bf1c-61c2ee287386\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-8h4j2" Mar 14 08:45:00 crc kubenswrapper[4886]: I0314 08:45:00.326072 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eecaac41-bd76-496a-bf1c-61c2ee287386-config-volume\") pod \"collect-profiles-29557965-8h4j2\" (UID: \"eecaac41-bd76-496a-bf1c-61c2ee287386\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-8h4j2" Mar 14 08:45:00 crc kubenswrapper[4886]: I0314 08:45:00.326165 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eecaac41-bd76-496a-bf1c-61c2ee287386-secret-volume\") pod \"collect-profiles-29557965-8h4j2\" (UID: \"eecaac41-bd76-496a-bf1c-61c2ee287386\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-8h4j2" Mar 14 08:45:00 crc kubenswrapper[4886]: I0314 08:45:00.326350 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clhr2\" (UniqueName: \"kubernetes.io/projected/eecaac41-bd76-496a-bf1c-61c2ee287386-kube-api-access-clhr2\") pod \"collect-profiles-29557965-8h4j2\" (UID: \"eecaac41-bd76-496a-bf1c-61c2ee287386\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-8h4j2" Mar 14 08:45:00 crc kubenswrapper[4886]: I0314 08:45:00.328307 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eecaac41-bd76-496a-bf1c-61c2ee287386-config-volume\") pod \"collect-profiles-29557965-8h4j2\" (UID: \"eecaac41-bd76-496a-bf1c-61c2ee287386\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-8h4j2" Mar 14 08:45:00 crc kubenswrapper[4886]: I0314 08:45:00.349159 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eecaac41-bd76-496a-bf1c-61c2ee287386-secret-volume\") pod \"collect-profiles-29557965-8h4j2\" (UID: \"eecaac41-bd76-496a-bf1c-61c2ee287386\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-8h4j2" Mar 14 08:45:00 crc kubenswrapper[4886]: I0314 08:45:00.353847 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clhr2\" (UniqueName: \"kubernetes.io/projected/eecaac41-bd76-496a-bf1c-61c2ee287386-kube-api-access-clhr2\") pod \"collect-profiles-29557965-8h4j2\" (UID: \"eecaac41-bd76-496a-bf1c-61c2ee287386\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-8h4j2" Mar 14 08:45:00 crc kubenswrapper[4886]: I0314 08:45:00.464981 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-8h4j2" Mar 14 08:45:01 crc kubenswrapper[4886]: I0314 08:45:01.171962 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-54lvd" Mar 14 08:45:01 crc kubenswrapper[4886]: I0314 08:45:01.173964 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557965-8h4j2"] Mar 14 08:45:01 crc kubenswrapper[4886]: I0314 08:45:01.215790 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-54lvd" Mar 14 08:45:02 crc kubenswrapper[4886]: I0314 08:45:02.149437 4886 generic.go:334] "Generic (PLEG): container finished" podID="eecaac41-bd76-496a-bf1c-61c2ee287386" containerID="a41ddcda531dba69cb7ba0aba03a52c9f225efc67069e6e4218361bf4e474a2a" exitCode=0 Mar 14 08:45:02 crc kubenswrapper[4886]: I0314 08:45:02.149500 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-8h4j2" event={"ID":"eecaac41-bd76-496a-bf1c-61c2ee287386","Type":"ContainerDied","Data":"a41ddcda531dba69cb7ba0aba03a52c9f225efc67069e6e4218361bf4e474a2a"} Mar 14 08:45:02 crc kubenswrapper[4886]: I0314 08:45:02.149558 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-8h4j2" event={"ID":"eecaac41-bd76-496a-bf1c-61c2ee287386","Type":"ContainerStarted","Data":"174c13700f0a4c5e823a91ae77bbe9b104f79e010c85bde7c553beac79bbb155"} Mar 14 08:45:03 crc kubenswrapper[4886]: I0314 08:45:03.495934 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-8h4j2" Mar 14 08:45:03 crc kubenswrapper[4886]: I0314 08:45:03.530357 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eecaac41-bd76-496a-bf1c-61c2ee287386-secret-volume\") pod \"eecaac41-bd76-496a-bf1c-61c2ee287386\" (UID: \"eecaac41-bd76-496a-bf1c-61c2ee287386\") " Mar 14 08:45:03 crc kubenswrapper[4886]: I0314 08:45:03.530476 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clhr2\" (UniqueName: \"kubernetes.io/projected/eecaac41-bd76-496a-bf1c-61c2ee287386-kube-api-access-clhr2\") pod \"eecaac41-bd76-496a-bf1c-61c2ee287386\" (UID: \"eecaac41-bd76-496a-bf1c-61c2ee287386\") " Mar 14 08:45:03 crc kubenswrapper[4886]: I0314 08:45:03.530515 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eecaac41-bd76-496a-bf1c-61c2ee287386-config-volume\") pod \"eecaac41-bd76-496a-bf1c-61c2ee287386\" (UID: \"eecaac41-bd76-496a-bf1c-61c2ee287386\") " Mar 14 08:45:03 crc kubenswrapper[4886]: I0314 08:45:03.531426 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eecaac41-bd76-496a-bf1c-61c2ee287386-config-volume" (OuterVolumeSpecName: "config-volume") pod "eecaac41-bd76-496a-bf1c-61c2ee287386" (UID: "eecaac41-bd76-496a-bf1c-61c2ee287386"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:45:03 crc kubenswrapper[4886]: I0314 08:45:03.542359 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eecaac41-bd76-496a-bf1c-61c2ee287386-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "eecaac41-bd76-496a-bf1c-61c2ee287386" (UID: "eecaac41-bd76-496a-bf1c-61c2ee287386"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:45:03 crc kubenswrapper[4886]: I0314 08:45:03.548890 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eecaac41-bd76-496a-bf1c-61c2ee287386-kube-api-access-clhr2" (OuterVolumeSpecName: "kube-api-access-clhr2") pod "eecaac41-bd76-496a-bf1c-61c2ee287386" (UID: "eecaac41-bd76-496a-bf1c-61c2ee287386"). InnerVolumeSpecName "kube-api-access-clhr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:45:03 crc kubenswrapper[4886]: I0314 08:45:03.632330 4886 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eecaac41-bd76-496a-bf1c-61c2ee287386-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 08:45:03 crc kubenswrapper[4886]: I0314 08:45:03.632380 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clhr2\" (UniqueName: \"kubernetes.io/projected/eecaac41-bd76-496a-bf1c-61c2ee287386-kube-api-access-clhr2\") on node \"crc\" DevicePath \"\"" Mar 14 08:45:03 crc kubenswrapper[4886]: I0314 08:45:03.632394 4886 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eecaac41-bd76-496a-bf1c-61c2ee287386-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 08:45:04 crc kubenswrapper[4886]: I0314 08:45:04.162358 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-8h4j2" event={"ID":"eecaac41-bd76-496a-bf1c-61c2ee287386","Type":"ContainerDied","Data":"174c13700f0a4c5e823a91ae77bbe9b104f79e010c85bde7c553beac79bbb155"} Mar 14 08:45:04 crc kubenswrapper[4886]: I0314 08:45:04.162397 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="174c13700f0a4c5e823a91ae77bbe9b104f79e010c85bde7c553beac79bbb155" Mar 14 08:45:04 crc kubenswrapper[4886]: I0314 08:45:04.162515 4886 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557965-8h4j2" Mar 14 08:45:06 crc kubenswrapper[4886]: I0314 08:45:06.175546 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-54lvd" Mar 14 08:45:06 crc kubenswrapper[4886]: I0314 08:45:06.211092 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-9qwls" Mar 14 08:45:07 crc kubenswrapper[4886]: I0314 08:45:07.774318 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-8qrlt" Mar 14 08:45:10 crc kubenswrapper[4886]: I0314 08:45:10.376387 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-dzvjs"] Mar 14 08:45:10 crc kubenswrapper[4886]: E0314 08:45:10.377009 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eecaac41-bd76-496a-bf1c-61c2ee287386" containerName="collect-profiles" Mar 14 08:45:10 crc kubenswrapper[4886]: I0314 08:45:10.377028 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="eecaac41-bd76-496a-bf1c-61c2ee287386" containerName="collect-profiles" Mar 14 08:45:10 crc kubenswrapper[4886]: I0314 08:45:10.377183 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="eecaac41-bd76-496a-bf1c-61c2ee287386" containerName="collect-profiles" Mar 14 08:45:10 crc kubenswrapper[4886]: I0314 08:45:10.377758 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-dzvjs" Mar 14 08:45:10 crc kubenswrapper[4886]: I0314 08:45:10.382463 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-zj4b4" Mar 14 08:45:10 crc kubenswrapper[4886]: I0314 08:45:10.382661 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 14 08:45:10 crc kubenswrapper[4886]: I0314 08:45:10.383244 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 14 08:45:10 crc kubenswrapper[4886]: I0314 08:45:10.396266 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dzvjs"] Mar 14 08:45:10 crc kubenswrapper[4886]: I0314 08:45:10.522788 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97p7d\" (UniqueName: \"kubernetes.io/projected/cd14212b-3889-4546-8c64-66097c14ff0e-kube-api-access-97p7d\") pod \"openstack-operator-index-dzvjs\" (UID: \"cd14212b-3889-4546-8c64-66097c14ff0e\") " pod="openstack-operators/openstack-operator-index-dzvjs" Mar 14 08:45:10 crc kubenswrapper[4886]: I0314 08:45:10.624498 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97p7d\" (UniqueName: \"kubernetes.io/projected/cd14212b-3889-4546-8c64-66097c14ff0e-kube-api-access-97p7d\") pod \"openstack-operator-index-dzvjs\" (UID: \"cd14212b-3889-4546-8c64-66097c14ff0e\") " pod="openstack-operators/openstack-operator-index-dzvjs" Mar 14 08:45:10 crc kubenswrapper[4886]: I0314 08:45:10.647060 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97p7d\" (UniqueName: \"kubernetes.io/projected/cd14212b-3889-4546-8c64-66097c14ff0e-kube-api-access-97p7d\") pod \"openstack-operator-index-dzvjs\" (UID: 
\"cd14212b-3889-4546-8c64-66097c14ff0e\") " pod="openstack-operators/openstack-operator-index-dzvjs" Mar 14 08:45:10 crc kubenswrapper[4886]: I0314 08:45:10.707921 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dzvjs" Mar 14 08:45:11 crc kubenswrapper[4886]: I0314 08:45:11.121351 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dzvjs"] Mar 14 08:45:11 crc kubenswrapper[4886]: I0314 08:45:11.209668 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dzvjs" event={"ID":"cd14212b-3889-4546-8c64-66097c14ff0e","Type":"ContainerStarted","Data":"5fe398d665d640daa557fc838d6a4bf911ab4ccd4e2d42283e66c13b02336a14"} Mar 14 08:45:13 crc kubenswrapper[4886]: I0314 08:45:13.147226 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-dzvjs"] Mar 14 08:45:13 crc kubenswrapper[4886]: I0314 08:45:13.222980 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dzvjs" event={"ID":"cd14212b-3889-4546-8c64-66097c14ff0e","Type":"ContainerStarted","Data":"2ed82bfdf5e7174a190e16551fffa67c9989f2e0c81eb7273c520ec6d0792aeb"} Mar 14 08:45:13 crc kubenswrapper[4886]: I0314 08:45:13.239498 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-dzvjs" podStartSLOduration=1.364246477 podStartE2EDuration="3.23947893s" podCreationTimestamp="2026-03-14 08:45:10 +0000 UTC" firstStartedPulling="2026-03-14 08:45:11.136299511 +0000 UTC m=+1046.384751148" lastFinishedPulling="2026-03-14 08:45:13.011531964 +0000 UTC m=+1048.259983601" observedRunningTime="2026-03-14 08:45:13.238470661 +0000 UTC m=+1048.486922298" watchObservedRunningTime="2026-03-14 08:45:13.23947893 +0000 UTC m=+1048.487930567" Mar 14 08:45:13 crc kubenswrapper[4886]: I0314 
08:45:13.761028 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-tnc57"] Mar 14 08:45:13 crc kubenswrapper[4886]: I0314 08:45:13.762267 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-tnc57" Mar 14 08:45:13 crc kubenswrapper[4886]: I0314 08:45:13.767721 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-tnc57"] Mar 14 08:45:13 crc kubenswrapper[4886]: I0314 08:45:13.872583 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbmkp\" (UniqueName: \"kubernetes.io/projected/0941dd67-4b9c-4337-b9b1-fe329f6c22fc-kube-api-access-bbmkp\") pod \"openstack-operator-index-tnc57\" (UID: \"0941dd67-4b9c-4337-b9b1-fe329f6c22fc\") " pod="openstack-operators/openstack-operator-index-tnc57" Mar 14 08:45:13 crc kubenswrapper[4886]: I0314 08:45:13.974838 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbmkp\" (UniqueName: \"kubernetes.io/projected/0941dd67-4b9c-4337-b9b1-fe329f6c22fc-kube-api-access-bbmkp\") pod \"openstack-operator-index-tnc57\" (UID: \"0941dd67-4b9c-4337-b9b1-fe329f6c22fc\") " pod="openstack-operators/openstack-operator-index-tnc57" Mar 14 08:45:13 crc kubenswrapper[4886]: I0314 08:45:13.992542 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbmkp\" (UniqueName: \"kubernetes.io/projected/0941dd67-4b9c-4337-b9b1-fe329f6c22fc-kube-api-access-bbmkp\") pod \"openstack-operator-index-tnc57\" (UID: \"0941dd67-4b9c-4337-b9b1-fe329f6c22fc\") " pod="openstack-operators/openstack-operator-index-tnc57" Mar 14 08:45:14 crc kubenswrapper[4886]: I0314 08:45:14.079055 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-tnc57" Mar 14 08:45:14 crc kubenswrapper[4886]: I0314 08:45:14.233391 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-dzvjs" podUID="cd14212b-3889-4546-8c64-66097c14ff0e" containerName="registry-server" containerID="cri-o://2ed82bfdf5e7174a190e16551fffa67c9989f2e0c81eb7273c520ec6d0792aeb" gracePeriod=2 Mar 14 08:45:14 crc kubenswrapper[4886]: I0314 08:45:14.488180 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-tnc57"] Mar 14 08:45:14 crc kubenswrapper[4886]: W0314 08:45:14.497407 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0941dd67_4b9c_4337_b9b1_fe329f6c22fc.slice/crio-ce25bde2452c18ed441bdffda0665ad48e7678bf2a9135e5174f4048d2724892 WatchSource:0}: Error finding container ce25bde2452c18ed441bdffda0665ad48e7678bf2a9135e5174f4048d2724892: Status 404 returned error can't find the container with id ce25bde2452c18ed441bdffda0665ad48e7678bf2a9135e5174f4048d2724892 Mar 14 08:45:14 crc kubenswrapper[4886]: I0314 08:45:14.669009 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-dzvjs" Mar 14 08:45:14 crc kubenswrapper[4886]: I0314 08:45:14.787862 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97p7d\" (UniqueName: \"kubernetes.io/projected/cd14212b-3889-4546-8c64-66097c14ff0e-kube-api-access-97p7d\") pod \"cd14212b-3889-4546-8c64-66097c14ff0e\" (UID: \"cd14212b-3889-4546-8c64-66097c14ff0e\") " Mar 14 08:45:14 crc kubenswrapper[4886]: I0314 08:45:14.793527 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd14212b-3889-4546-8c64-66097c14ff0e-kube-api-access-97p7d" (OuterVolumeSpecName: "kube-api-access-97p7d") pod "cd14212b-3889-4546-8c64-66097c14ff0e" (UID: "cd14212b-3889-4546-8c64-66097c14ff0e"). InnerVolumeSpecName "kube-api-access-97p7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:45:14 crc kubenswrapper[4886]: I0314 08:45:14.890475 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97p7d\" (UniqueName: \"kubernetes.io/projected/cd14212b-3889-4546-8c64-66097c14ff0e-kube-api-access-97p7d\") on node \"crc\" DevicePath \"\"" Mar 14 08:45:15 crc kubenswrapper[4886]: I0314 08:45:15.241962 4886 generic.go:334] "Generic (PLEG): container finished" podID="cd14212b-3889-4546-8c64-66097c14ff0e" containerID="2ed82bfdf5e7174a190e16551fffa67c9989f2e0c81eb7273c520ec6d0792aeb" exitCode=0 Mar 14 08:45:15 crc kubenswrapper[4886]: I0314 08:45:15.242025 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-dzvjs" Mar 14 08:45:15 crc kubenswrapper[4886]: I0314 08:45:15.242046 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dzvjs" event={"ID":"cd14212b-3889-4546-8c64-66097c14ff0e","Type":"ContainerDied","Data":"2ed82bfdf5e7174a190e16551fffa67c9989f2e0c81eb7273c520ec6d0792aeb"} Mar 14 08:45:15 crc kubenswrapper[4886]: I0314 08:45:15.244557 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dzvjs" event={"ID":"cd14212b-3889-4546-8c64-66097c14ff0e","Type":"ContainerDied","Data":"5fe398d665d640daa557fc838d6a4bf911ab4ccd4e2d42283e66c13b02336a14"} Mar 14 08:45:15 crc kubenswrapper[4886]: I0314 08:45:15.244599 4886 scope.go:117] "RemoveContainer" containerID="2ed82bfdf5e7174a190e16551fffa67c9989f2e0c81eb7273c520ec6d0792aeb" Mar 14 08:45:15 crc kubenswrapper[4886]: I0314 08:45:15.244813 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-tnc57" event={"ID":"0941dd67-4b9c-4337-b9b1-fe329f6c22fc","Type":"ContainerStarted","Data":"cd96842e0e2f04f260bbae43c9938473dce825f5b654a6517484c63e0bb538e5"} Mar 14 08:45:15 crc kubenswrapper[4886]: I0314 08:45:15.244840 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-tnc57" event={"ID":"0941dd67-4b9c-4337-b9b1-fe329f6c22fc","Type":"ContainerStarted","Data":"ce25bde2452c18ed441bdffda0665ad48e7678bf2a9135e5174f4048d2724892"} Mar 14 08:45:15 crc kubenswrapper[4886]: I0314 08:45:15.264335 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-tnc57" podStartSLOduration=2.216316349 podStartE2EDuration="2.264316532s" podCreationTimestamp="2026-03-14 08:45:13 +0000 UTC" firstStartedPulling="2026-03-14 08:45:14.501684791 +0000 UTC m=+1049.750136428" lastFinishedPulling="2026-03-14 
08:45:14.549684974 +0000 UTC m=+1049.798136611" observedRunningTime="2026-03-14 08:45:15.263070456 +0000 UTC m=+1050.511522093" watchObservedRunningTime="2026-03-14 08:45:15.264316532 +0000 UTC m=+1050.512768169" Mar 14 08:45:15 crc kubenswrapper[4886]: I0314 08:45:15.268743 4886 scope.go:117] "RemoveContainer" containerID="2ed82bfdf5e7174a190e16551fffa67c9989f2e0c81eb7273c520ec6d0792aeb" Mar 14 08:45:15 crc kubenswrapper[4886]: E0314 08:45:15.269426 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ed82bfdf5e7174a190e16551fffa67c9989f2e0c81eb7273c520ec6d0792aeb\": container with ID starting with 2ed82bfdf5e7174a190e16551fffa67c9989f2e0c81eb7273c520ec6d0792aeb not found: ID does not exist" containerID="2ed82bfdf5e7174a190e16551fffa67c9989f2e0c81eb7273c520ec6d0792aeb" Mar 14 08:45:15 crc kubenswrapper[4886]: I0314 08:45:15.269465 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ed82bfdf5e7174a190e16551fffa67c9989f2e0c81eb7273c520ec6d0792aeb"} err="failed to get container status \"2ed82bfdf5e7174a190e16551fffa67c9989f2e0c81eb7273c520ec6d0792aeb\": rpc error: code = NotFound desc = could not find container \"2ed82bfdf5e7174a190e16551fffa67c9989f2e0c81eb7273c520ec6d0792aeb\": container with ID starting with 2ed82bfdf5e7174a190e16551fffa67c9989f2e0c81eb7273c520ec6d0792aeb not found: ID does not exist" Mar 14 08:45:15 crc kubenswrapper[4886]: I0314 08:45:15.285445 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-dzvjs"] Mar 14 08:45:15 crc kubenswrapper[4886]: I0314 08:45:15.289833 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-dzvjs"] Mar 14 08:45:15 crc kubenswrapper[4886]: I0314 08:45:15.429016 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd14212b-3889-4546-8c64-66097c14ff0e" 
path="/var/lib/kubelet/pods/cd14212b-3889-4546-8c64-66097c14ff0e/volumes" Mar 14 08:45:24 crc kubenswrapper[4886]: I0314 08:45:24.079810 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-tnc57" Mar 14 08:45:24 crc kubenswrapper[4886]: I0314 08:45:24.081628 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-tnc57" Mar 14 08:45:24 crc kubenswrapper[4886]: I0314 08:45:24.115087 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-tnc57" Mar 14 08:45:24 crc kubenswrapper[4886]: I0314 08:45:24.345543 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-tnc57" Mar 14 08:45:25 crc kubenswrapper[4886]: I0314 08:45:25.200109 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g"] Mar 14 08:45:25 crc kubenswrapper[4886]: E0314 08:45:25.200490 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd14212b-3889-4546-8c64-66097c14ff0e" containerName="registry-server" Mar 14 08:45:25 crc kubenswrapper[4886]: I0314 08:45:25.200511 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd14212b-3889-4546-8c64-66097c14ff0e" containerName="registry-server" Mar 14 08:45:25 crc kubenswrapper[4886]: I0314 08:45:25.200807 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd14212b-3889-4546-8c64-66097c14ff0e" containerName="registry-server" Mar 14 08:45:25 crc kubenswrapper[4886]: I0314 08:45:25.202292 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g" Mar 14 08:45:25 crc kubenswrapper[4886]: I0314 08:45:25.204629 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-2kxnn" Mar 14 08:45:25 crc kubenswrapper[4886]: I0314 08:45:25.208631 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g"] Mar 14 08:45:25 crc kubenswrapper[4886]: I0314 08:45:25.385560 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4489283e-7c08-4b30-9efe-0d700167a4de-bundle\") pod \"0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g\" (UID: \"4489283e-7c08-4b30-9efe-0d700167a4de\") " pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g" Mar 14 08:45:25 crc kubenswrapper[4886]: I0314 08:45:25.385636 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zr6g\" (UniqueName: \"kubernetes.io/projected/4489283e-7c08-4b30-9efe-0d700167a4de-kube-api-access-5zr6g\") pod \"0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g\" (UID: \"4489283e-7c08-4b30-9efe-0d700167a4de\") " pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g" Mar 14 08:45:25 crc kubenswrapper[4886]: I0314 08:45:25.385695 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4489283e-7c08-4b30-9efe-0d700167a4de-util\") pod \"0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g\" (UID: \"4489283e-7c08-4b30-9efe-0d700167a4de\") " pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g" Mar 14 08:45:25 crc kubenswrapper[4886]: I0314 
08:45:25.487523 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4489283e-7c08-4b30-9efe-0d700167a4de-bundle\") pod \"0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g\" (UID: \"4489283e-7c08-4b30-9efe-0d700167a4de\") " pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g" Mar 14 08:45:25 crc kubenswrapper[4886]: I0314 08:45:25.488065 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zr6g\" (UniqueName: \"kubernetes.io/projected/4489283e-7c08-4b30-9efe-0d700167a4de-kube-api-access-5zr6g\") pod \"0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g\" (UID: \"4489283e-7c08-4b30-9efe-0d700167a4de\") " pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g" Mar 14 08:45:25 crc kubenswrapper[4886]: I0314 08:45:25.488211 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4489283e-7c08-4b30-9efe-0d700167a4de-util\") pod \"0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g\" (UID: \"4489283e-7c08-4b30-9efe-0d700167a4de\") " pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g" Mar 14 08:45:25 crc kubenswrapper[4886]: I0314 08:45:25.488095 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4489283e-7c08-4b30-9efe-0d700167a4de-bundle\") pod \"0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g\" (UID: \"4489283e-7c08-4b30-9efe-0d700167a4de\") " pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g" Mar 14 08:45:25 crc kubenswrapper[4886]: I0314 08:45:25.488552 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/4489283e-7c08-4b30-9efe-0d700167a4de-util\") pod \"0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g\" (UID: \"4489283e-7c08-4b30-9efe-0d700167a4de\") " pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g" Mar 14 08:45:25 crc kubenswrapper[4886]: I0314 08:45:25.508149 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zr6g\" (UniqueName: \"kubernetes.io/projected/4489283e-7c08-4b30-9efe-0d700167a4de-kube-api-access-5zr6g\") pod \"0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g\" (UID: \"4489283e-7c08-4b30-9efe-0d700167a4de\") " pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g" Mar 14 08:45:25 crc kubenswrapper[4886]: I0314 08:45:25.565422 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g" Mar 14 08:45:26 crc kubenswrapper[4886]: I0314 08:45:26.030809 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g"] Mar 14 08:45:26 crc kubenswrapper[4886]: W0314 08:45:26.037576 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4489283e_7c08_4b30_9efe_0d700167a4de.slice/crio-48b3f9c4e343e23ad993158d80446af75d9beb89fa6ae9b30c7c2c3dd437523d WatchSource:0}: Error finding container 48b3f9c4e343e23ad993158d80446af75d9beb89fa6ae9b30c7c2c3dd437523d: Status 404 returned error can't find the container with id 48b3f9c4e343e23ad993158d80446af75d9beb89fa6ae9b30c7c2c3dd437523d Mar 14 08:45:26 crc kubenswrapper[4886]: I0314 08:45:26.517849 4886 generic.go:334] "Generic (PLEG): container finished" podID="4489283e-7c08-4b30-9efe-0d700167a4de" containerID="ed48f196b1116d4b55832ee25be54ed04bd10112d0af6c0e1baa84364241984a" exitCode=0 Mar 14 
08:45:26 crc kubenswrapper[4886]: I0314 08:45:26.519354 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g" event={"ID":"4489283e-7c08-4b30-9efe-0d700167a4de","Type":"ContainerDied","Data":"ed48f196b1116d4b55832ee25be54ed04bd10112d0af6c0e1baa84364241984a"} Mar 14 08:45:26 crc kubenswrapper[4886]: I0314 08:45:26.519389 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g" event={"ID":"4489283e-7c08-4b30-9efe-0d700167a4de","Type":"ContainerStarted","Data":"48b3f9c4e343e23ad993158d80446af75d9beb89fa6ae9b30c7c2c3dd437523d"} Mar 14 08:45:27 crc kubenswrapper[4886]: I0314 08:45:27.528492 4886 generic.go:334] "Generic (PLEG): container finished" podID="4489283e-7c08-4b30-9efe-0d700167a4de" containerID="4f8dcb31fa60d0a6f8586c3808a6cd815e3ef862574f366baf6f653ea031c310" exitCode=0 Mar 14 08:45:27 crc kubenswrapper[4886]: I0314 08:45:27.528663 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g" event={"ID":"4489283e-7c08-4b30-9efe-0d700167a4de","Type":"ContainerDied","Data":"4f8dcb31fa60d0a6f8586c3808a6cd815e3ef862574f366baf6f653ea031c310"} Mar 14 08:45:29 crc kubenswrapper[4886]: I0314 08:45:29.542184 4886 generic.go:334] "Generic (PLEG): container finished" podID="4489283e-7c08-4b30-9efe-0d700167a4de" containerID="fb95614605a35ec5fb73da65cf883332f55608350795171fd52f58c613215a0b" exitCode=0 Mar 14 08:45:29 crc kubenswrapper[4886]: I0314 08:45:29.542260 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g" event={"ID":"4489283e-7c08-4b30-9efe-0d700167a4de","Type":"ContainerDied","Data":"fb95614605a35ec5fb73da65cf883332f55608350795171fd52f58c613215a0b"} Mar 14 08:45:31 crc kubenswrapper[4886]: I0314 08:45:31.991354 
4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g" Mar 14 08:45:32 crc kubenswrapper[4886]: I0314 08:45:32.165058 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4489283e-7c08-4b30-9efe-0d700167a4de-util\") pod \"4489283e-7c08-4b30-9efe-0d700167a4de\" (UID: \"4489283e-7c08-4b30-9efe-0d700167a4de\") " Mar 14 08:45:32 crc kubenswrapper[4886]: I0314 08:45:32.165253 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4489283e-7c08-4b30-9efe-0d700167a4de-bundle\") pod \"4489283e-7c08-4b30-9efe-0d700167a4de\" (UID: \"4489283e-7c08-4b30-9efe-0d700167a4de\") " Mar 14 08:45:32 crc kubenswrapper[4886]: I0314 08:45:32.165415 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zr6g\" (UniqueName: \"kubernetes.io/projected/4489283e-7c08-4b30-9efe-0d700167a4de-kube-api-access-5zr6g\") pod \"4489283e-7c08-4b30-9efe-0d700167a4de\" (UID: \"4489283e-7c08-4b30-9efe-0d700167a4de\") " Mar 14 08:45:32 crc kubenswrapper[4886]: I0314 08:45:32.165887 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4489283e-7c08-4b30-9efe-0d700167a4de-bundle" (OuterVolumeSpecName: "bundle") pod "4489283e-7c08-4b30-9efe-0d700167a4de" (UID: "4489283e-7c08-4b30-9efe-0d700167a4de"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:45:32 crc kubenswrapper[4886]: I0314 08:45:32.171867 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4489283e-7c08-4b30-9efe-0d700167a4de-kube-api-access-5zr6g" (OuterVolumeSpecName: "kube-api-access-5zr6g") pod "4489283e-7c08-4b30-9efe-0d700167a4de" (UID: "4489283e-7c08-4b30-9efe-0d700167a4de"). 
InnerVolumeSpecName "kube-api-access-5zr6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:45:32 crc kubenswrapper[4886]: I0314 08:45:32.197360 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4489283e-7c08-4b30-9efe-0d700167a4de-util" (OuterVolumeSpecName: "util") pod "4489283e-7c08-4b30-9efe-0d700167a4de" (UID: "4489283e-7c08-4b30-9efe-0d700167a4de"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:45:32 crc kubenswrapper[4886]: I0314 08:45:32.266581 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zr6g\" (UniqueName: \"kubernetes.io/projected/4489283e-7c08-4b30-9efe-0d700167a4de-kube-api-access-5zr6g\") on node \"crc\" DevicePath \"\"" Mar 14 08:45:32 crc kubenswrapper[4886]: I0314 08:45:32.266618 4886 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4489283e-7c08-4b30-9efe-0d700167a4de-util\") on node \"crc\" DevicePath \"\"" Mar 14 08:45:32 crc kubenswrapper[4886]: I0314 08:45:32.266627 4886 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4489283e-7c08-4b30-9efe-0d700167a4de-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:45:32 crc kubenswrapper[4886]: I0314 08:45:32.802025 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g" event={"ID":"4489283e-7c08-4b30-9efe-0d700167a4de","Type":"ContainerDied","Data":"48b3f9c4e343e23ad993158d80446af75d9beb89fa6ae9b30c7c2c3dd437523d"} Mar 14 08:45:32 crc kubenswrapper[4886]: I0314 08:45:32.802385 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48b3f9c4e343e23ad993158d80446af75d9beb89fa6ae9b30c7c2c3dd437523d" Mar 14 08:45:32 crc kubenswrapper[4886]: I0314 08:45:32.802113 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g" Mar 14 08:45:37 crc kubenswrapper[4886]: I0314 08:45:37.390310 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-657bc7dd6f-zd6k9"] Mar 14 08:45:37 crc kubenswrapper[4886]: E0314 08:45:37.391767 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4489283e-7c08-4b30-9efe-0d700167a4de" containerName="extract" Mar 14 08:45:37 crc kubenswrapper[4886]: I0314 08:45:37.391840 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="4489283e-7c08-4b30-9efe-0d700167a4de" containerName="extract" Mar 14 08:45:37 crc kubenswrapper[4886]: E0314 08:45:37.391911 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4489283e-7c08-4b30-9efe-0d700167a4de" containerName="pull" Mar 14 08:45:37 crc kubenswrapper[4886]: I0314 08:45:37.391967 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="4489283e-7c08-4b30-9efe-0d700167a4de" containerName="pull" Mar 14 08:45:37 crc kubenswrapper[4886]: E0314 08:45:37.392024 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4489283e-7c08-4b30-9efe-0d700167a4de" containerName="util" Mar 14 08:45:37 crc kubenswrapper[4886]: I0314 08:45:37.392081 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="4489283e-7c08-4b30-9efe-0d700167a4de" containerName="util" Mar 14 08:45:37 crc kubenswrapper[4886]: I0314 08:45:37.392260 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="4489283e-7c08-4b30-9efe-0d700167a4de" containerName="extract" Mar 14 08:45:37 crc kubenswrapper[4886]: I0314 08:45:37.392779 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-657bc7dd6f-zd6k9" Mar 14 08:45:37 crc kubenswrapper[4886]: I0314 08:45:37.398600 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-l6pj2" Mar 14 08:45:37 crc kubenswrapper[4886]: I0314 08:45:37.428151 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-657bc7dd6f-zd6k9"] Mar 14 08:45:37 crc kubenswrapper[4886]: I0314 08:45:37.443155 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn2rq\" (UniqueName: \"kubernetes.io/projected/6fe04296-377c-43a6-af79-eed85f760bf9-kube-api-access-kn2rq\") pod \"openstack-operator-controller-init-657bc7dd6f-zd6k9\" (UID: \"6fe04296-377c-43a6-af79-eed85f760bf9\") " pod="openstack-operators/openstack-operator-controller-init-657bc7dd6f-zd6k9" Mar 14 08:45:37 crc kubenswrapper[4886]: I0314 08:45:37.544264 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn2rq\" (UniqueName: \"kubernetes.io/projected/6fe04296-377c-43a6-af79-eed85f760bf9-kube-api-access-kn2rq\") pod \"openstack-operator-controller-init-657bc7dd6f-zd6k9\" (UID: \"6fe04296-377c-43a6-af79-eed85f760bf9\") " pod="openstack-operators/openstack-operator-controller-init-657bc7dd6f-zd6k9" Mar 14 08:45:37 crc kubenswrapper[4886]: I0314 08:45:37.561962 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn2rq\" (UniqueName: \"kubernetes.io/projected/6fe04296-377c-43a6-af79-eed85f760bf9-kube-api-access-kn2rq\") pod \"openstack-operator-controller-init-657bc7dd6f-zd6k9\" (UID: \"6fe04296-377c-43a6-af79-eed85f760bf9\") " pod="openstack-operators/openstack-operator-controller-init-657bc7dd6f-zd6k9" Mar 14 08:45:37 crc kubenswrapper[4886]: I0314 08:45:37.711210 4886 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-657bc7dd6f-zd6k9" Mar 14 08:45:38 crc kubenswrapper[4886]: I0314 08:45:38.153814 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-657bc7dd6f-zd6k9"] Mar 14 08:45:38 crc kubenswrapper[4886]: I0314 08:45:38.840048 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-657bc7dd6f-zd6k9" event={"ID":"6fe04296-377c-43a6-af79-eed85f760bf9","Type":"ContainerStarted","Data":"5ac1d2b3230676360e3aa9f84200b0beaa3edbbee903c2b15b8781b910aaf6be"} Mar 14 08:45:42 crc kubenswrapper[4886]: I0314 08:45:42.875153 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-657bc7dd6f-zd6k9" event={"ID":"6fe04296-377c-43a6-af79-eed85f760bf9","Type":"ContainerStarted","Data":"77eb4d6db320118e6ff5643c5ef3c1facea7f9abe1064c7ab98cd5b162d712d2"} Mar 14 08:45:42 crc kubenswrapper[4886]: I0314 08:45:42.875728 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-657bc7dd6f-zd6k9" Mar 14 08:45:42 crc kubenswrapper[4886]: I0314 08:45:42.908529 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-657bc7dd6f-zd6k9" podStartSLOduration=2.142275226 podStartE2EDuration="5.908509025s" podCreationTimestamp="2026-03-14 08:45:37 +0000 UTC" firstStartedPulling="2026-03-14 08:45:38.157181297 +0000 UTC m=+1073.405632934" lastFinishedPulling="2026-03-14 08:45:41.923415096 +0000 UTC m=+1077.171866733" observedRunningTime="2026-03-14 08:45:42.902744329 +0000 UTC m=+1078.151195956" watchObservedRunningTime="2026-03-14 08:45:42.908509025 +0000 UTC m=+1078.156960662" Mar 14 08:45:47 crc kubenswrapper[4886]: I0314 08:45:47.714718 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/openstack-operator-controller-init-657bc7dd6f-zd6k9" Mar 14 08:46:00 crc kubenswrapper[4886]: I0314 08:46:00.137847 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557966-729xs"] Mar 14 08:46:00 crc kubenswrapper[4886]: I0314 08:46:00.140059 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557966-729xs" Mar 14 08:46:00 crc kubenswrapper[4886]: I0314 08:46:00.142083 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 08:46:00 crc kubenswrapper[4886]: I0314 08:46:00.143467 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:46:00 crc kubenswrapper[4886]: I0314 08:46:00.143646 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:46:00 crc kubenswrapper[4886]: I0314 08:46:00.155077 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557966-729xs"] Mar 14 08:46:00 crc kubenswrapper[4886]: I0314 08:46:00.157348 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ssnx\" (UniqueName: \"kubernetes.io/projected/12215c1a-53af-4b8b-9b8c-1333f74942ce-kube-api-access-6ssnx\") pod \"auto-csr-approver-29557966-729xs\" (UID: \"12215c1a-53af-4b8b-9b8c-1333f74942ce\") " pod="openshift-infra/auto-csr-approver-29557966-729xs" Mar 14 08:46:00 crc kubenswrapper[4886]: I0314 08:46:00.258350 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ssnx\" (UniqueName: \"kubernetes.io/projected/12215c1a-53af-4b8b-9b8c-1333f74942ce-kube-api-access-6ssnx\") pod \"auto-csr-approver-29557966-729xs\" (UID: \"12215c1a-53af-4b8b-9b8c-1333f74942ce\") " 
pod="openshift-infra/auto-csr-approver-29557966-729xs" Mar 14 08:46:00 crc kubenswrapper[4886]: I0314 08:46:00.277631 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ssnx\" (UniqueName: \"kubernetes.io/projected/12215c1a-53af-4b8b-9b8c-1333f74942ce-kube-api-access-6ssnx\") pod \"auto-csr-approver-29557966-729xs\" (UID: \"12215c1a-53af-4b8b-9b8c-1333f74942ce\") " pod="openshift-infra/auto-csr-approver-29557966-729xs" Mar 14 08:46:00 crc kubenswrapper[4886]: I0314 08:46:00.459993 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557966-729xs" Mar 14 08:46:00 crc kubenswrapper[4886]: I0314 08:46:00.904549 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557966-729xs"] Mar 14 08:46:00 crc kubenswrapper[4886]: I0314 08:46:00.984740 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557966-729xs" event={"ID":"12215c1a-53af-4b8b-9b8c-1333f74942ce","Type":"ContainerStarted","Data":"2b5759062a434204a63eb06ecce25de29cdd31e9ac65279b0dfd66db529aeabe"} Mar 14 08:46:02 crc kubenswrapper[4886]: I0314 08:46:02.998577 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557966-729xs" event={"ID":"12215c1a-53af-4b8b-9b8c-1333f74942ce","Type":"ContainerStarted","Data":"756de37cbd5f5e14f06c8cbf1d068336c4156f6c08dba8123691c00b5d05b006"} Mar 14 08:46:04 crc kubenswrapper[4886]: I0314 08:46:04.006156 4886 generic.go:334] "Generic (PLEG): container finished" podID="12215c1a-53af-4b8b-9b8c-1333f74942ce" containerID="756de37cbd5f5e14f06c8cbf1d068336c4156f6c08dba8123691c00b5d05b006" exitCode=0 Mar 14 08:46:04 crc kubenswrapper[4886]: I0314 08:46:04.006451 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557966-729xs" 
event={"ID":"12215c1a-53af-4b8b-9b8c-1333f74942ce","Type":"ContainerDied","Data":"756de37cbd5f5e14f06c8cbf1d068336c4156f6c08dba8123691c00b5d05b006"} Mar 14 08:46:05 crc kubenswrapper[4886]: I0314 08:46:05.285341 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557966-729xs" Mar 14 08:46:05 crc kubenswrapper[4886]: I0314 08:46:05.452848 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ssnx\" (UniqueName: \"kubernetes.io/projected/12215c1a-53af-4b8b-9b8c-1333f74942ce-kube-api-access-6ssnx\") pod \"12215c1a-53af-4b8b-9b8c-1333f74942ce\" (UID: \"12215c1a-53af-4b8b-9b8c-1333f74942ce\") " Mar 14 08:46:05 crc kubenswrapper[4886]: I0314 08:46:05.458813 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12215c1a-53af-4b8b-9b8c-1333f74942ce-kube-api-access-6ssnx" (OuterVolumeSpecName: "kube-api-access-6ssnx") pod "12215c1a-53af-4b8b-9b8c-1333f74942ce" (UID: "12215c1a-53af-4b8b-9b8c-1333f74942ce"). InnerVolumeSpecName "kube-api-access-6ssnx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:46:05 crc kubenswrapper[4886]: I0314 08:46:05.555844 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ssnx\" (UniqueName: \"kubernetes.io/projected/12215c1a-53af-4b8b-9b8c-1333f74942ce-kube-api-access-6ssnx\") on node \"crc\" DevicePath \"\"" Mar 14 08:46:06 crc kubenswrapper[4886]: I0314 08:46:06.021088 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557966-729xs" event={"ID":"12215c1a-53af-4b8b-9b8c-1333f74942ce","Type":"ContainerDied","Data":"2b5759062a434204a63eb06ecce25de29cdd31e9ac65279b0dfd66db529aeabe"} Mar 14 08:46:06 crc kubenswrapper[4886]: I0314 08:46:06.021158 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b5759062a434204a63eb06ecce25de29cdd31e9ac65279b0dfd66db529aeabe" Mar 14 08:46:06 crc kubenswrapper[4886]: I0314 08:46:06.021219 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557966-729xs" Mar 14 08:46:06 crc kubenswrapper[4886]: I0314 08:46:06.063955 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557960-gjsxp"] Mar 14 08:46:06 crc kubenswrapper[4886]: I0314 08:46:06.078992 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557960-gjsxp"] Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.430012 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9be99ded-dcd2-4773-902b-10b10955e202" path="/var/lib/kubelet/pods/9be99ded-dcd2-4773-902b-10b10955e202/volumes" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.652491 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-d47688694-ch5j6"] Mar 14 08:46:07 crc kubenswrapper[4886]: E0314 08:46:07.652764 4886 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="12215c1a-53af-4b8b-9b8c-1333f74942ce" containerName="oc" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.652779 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="12215c1a-53af-4b8b-9b8c-1333f74942ce" containerName="oc" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.652898 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="12215c1a-53af-4b8b-9b8c-1333f74942ce" containerName="oc" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.653316 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-d47688694-ch5j6" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.656915 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-nnd64" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.673543 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-d47688694-ch5j6"] Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.679458 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-t477h"] Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.680540 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-t477h" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.681999 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-l6wvq" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.686031 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-n42dq"] Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.689173 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-n42dq" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.692719 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-2knnh"] Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.693827 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-2knnh" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.697039 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-95289" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.698405 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-8bxcd" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.716361 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-t477h"] Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.754927 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-n42dq"] Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.787873 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqrcf\" (UniqueName: \"kubernetes.io/projected/f220ca81-44c2-4dd1-8fff-616ed5060946-kube-api-access-rqrcf\") pod \"barbican-operator-controller-manager-d47688694-ch5j6\" (UID: \"f220ca81-44c2-4dd1-8fff-616ed5060946\") " pod="openstack-operators/barbican-operator-controller-manager-d47688694-ch5j6" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.788015 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhbzl\" 
(UniqueName: \"kubernetes.io/projected/f9db1eb1-8050-466c-aa04-87ee4dc1479c-kube-api-access-hhbzl\") pod \"cinder-operator-controller-manager-984cd4dcf-t477h\" (UID: \"f9db1eb1-8050-466c-aa04-87ee4dc1479c\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-t477h" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.801452 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-8b92g"] Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.802564 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8b92g" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.806356 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-d6w46" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.811292 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-8b92g"] Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.820394 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-2knnh"] Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.844233 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-mpz5b"] Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.845300 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-mpz5b" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.856055 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-ghkwc" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.872954 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-mpz5b"] Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.888938 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhbzl\" (UniqueName: \"kubernetes.io/projected/f9db1eb1-8050-466c-aa04-87ee4dc1479c-kube-api-access-hhbzl\") pod \"cinder-operator-controller-manager-984cd4dcf-t477h\" (UID: \"f9db1eb1-8050-466c-aa04-87ee4dc1479c\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-t477h" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.889003 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7m7g\" (UniqueName: \"kubernetes.io/projected/5e8f3028-4d1b-4b90-910f-84c2f9e72f45-kube-api-access-w7m7g\") pod \"designate-operator-controller-manager-66d56f6ff4-n42dq\" (UID: \"5e8f3028-4d1b-4b90-910f-84c2f9e72f45\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-n42dq" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.889051 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62j9g\" (UniqueName: \"kubernetes.io/projected/4ac77887-a632-454d-9460-1150a439045a-kube-api-access-62j9g\") pod \"glance-operator-controller-manager-5964f64c48-2knnh\" (UID: \"4ac77887-a632-454d-9460-1150a439045a\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-2knnh" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.889127 4886 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqrcf\" (UniqueName: \"kubernetes.io/projected/f220ca81-44c2-4dd1-8fff-616ed5060946-kube-api-access-rqrcf\") pod \"barbican-operator-controller-manager-d47688694-ch5j6\" (UID: \"f220ca81-44c2-4dd1-8fff-616ed5060946\") " pod="openstack-operators/barbican-operator-controller-manager-d47688694-ch5j6" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.894192 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-54dc5b8f8d-nh5lk"] Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.895150 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-nh5lk" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.901446 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.901645 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-ls5lw" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.913134 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-54dc5b8f8d-nh5lk"] Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.918757 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhbzl\" (UniqueName: \"kubernetes.io/projected/f9db1eb1-8050-466c-aa04-87ee4dc1479c-kube-api-access-hhbzl\") pod \"cinder-operator-controller-manager-984cd4dcf-t477h\" (UID: \"f9db1eb1-8050-466c-aa04-87ee4dc1479c\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-t477h" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.918759 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rqrcf\" (UniqueName: \"kubernetes.io/projected/f220ca81-44c2-4dd1-8fff-616ed5060946-kube-api-access-rqrcf\") pod \"barbican-operator-controller-manager-d47688694-ch5j6\" (UID: \"f220ca81-44c2-4dd1-8fff-616ed5060946\") " pod="openstack-operators/barbican-operator-controller-manager-d47688694-ch5j6" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.953591 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc894d9b-4v86k"] Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.954568 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-4v86k" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.958610 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-9xr25" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.968183 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-kc4s8"] Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.969828 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc4s8" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.971035 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-d47688694-ch5j6" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.972313 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-xchqg" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.976193 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc894d9b-4v86k"] Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.984472 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-57b484b4df-lclvd"] Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.985482 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-lclvd" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.989990 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzxf5\" (UniqueName: \"kubernetes.io/projected/43749f37-8afe-4259-bf55-3e7842b14a14-kube-api-access-nzxf5\") pod \"heat-operator-controller-manager-77b6666d85-8b92g\" (UID: \"43749f37-8afe-4259-bf55-3e7842b14a14\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8b92g" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.990051 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7m7g\" (UniqueName: \"kubernetes.io/projected/5e8f3028-4d1b-4b90-910f-84c2f9e72f45-kube-api-access-w7m7g\") pod \"designate-operator-controller-manager-66d56f6ff4-n42dq\" (UID: \"5e8f3028-4d1b-4b90-910f-84c2f9e72f45\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-n42dq" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.990104 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-62j9g\" (UniqueName: \"kubernetes.io/projected/4ac77887-a632-454d-9460-1150a439045a-kube-api-access-62j9g\") pod \"glance-operator-controller-manager-5964f64c48-2knnh\" (UID: \"4ac77887-a632-454d-9460-1150a439045a\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-2knnh" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.990165 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqqff\" (UniqueName: \"kubernetes.io/projected/b360b11e-b7a7-4b56-969f-3bef111a22b7-kube-api-access-tqqff\") pod \"horizon-operator-controller-manager-6d9d6b584d-mpz5b\" (UID: \"b360b11e-b7a7-4b56-969f-3bef111a22b7\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-mpz5b" Mar 14 08:46:07 crc kubenswrapper[4886]: I0314 08:46:07.994756 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-zd9n4" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.004338 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-t477h" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.018322 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-kc4s8"] Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.025371 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7m7g\" (UniqueName: \"kubernetes.io/projected/5e8f3028-4d1b-4b90-910f-84c2f9e72f45-kube-api-access-w7m7g\") pod \"designate-operator-controller-manager-66d56f6ff4-n42dq\" (UID: \"5e8f3028-4d1b-4b90-910f-84c2f9e72f45\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-n42dq" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.027102 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62j9g\" (UniqueName: \"kubernetes.io/projected/4ac77887-a632-454d-9460-1150a439045a-kube-api-access-62j9g\") pod \"glance-operator-controller-manager-5964f64c48-2knnh\" (UID: \"4ac77887-a632-454d-9460-1150a439045a\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-2knnh" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.034381 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-2knnh" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.048158 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-57b484b4df-lclvd"] Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.061161 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-q4jxl"] Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.065255 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-q4jxl" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.068336 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-qpqzz" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.069943 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-kgkzn"] Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.072870 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-kgkzn" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.078159 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-c98p5" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.087032 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-7f84474648-v2dwg"] Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.087970 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7f84474648-v2dwg" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.091195 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm92b\" (UniqueName: \"kubernetes.io/projected/ed888cc0-96e8-4507-9659-1a710d2fcb41-kube-api-access-wm92b\") pod \"manila-operator-controller-manager-57b484b4df-lclvd\" (UID: \"ed888cc0-96e8-4507-9659-1a710d2fcb41\") " pod="openstack-operators/manila-operator-controller-manager-57b484b4df-lclvd" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.091233 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzxf5\" (UniqueName: \"kubernetes.io/projected/43749f37-8afe-4259-bf55-3e7842b14a14-kube-api-access-nzxf5\") pod \"heat-operator-controller-manager-77b6666d85-8b92g\" (UID: \"43749f37-8afe-4259-bf55-3e7842b14a14\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8b92g" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.091281 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4fln\" (UniqueName: \"kubernetes.io/projected/243e7e8f-27ed-4052-b11f-51887ee5d8d7-kube-api-access-k4fln\") pod \"keystone-operator-controller-manager-684f77d66d-kc4s8\" (UID: \"243e7e8f-27ed-4052-b11f-51887ee5d8d7\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc4s8" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.091300 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wrsk\" (UniqueName: \"kubernetes.io/projected/e4a52d95-3fd2-48ad-8d53-cb790dbf34f6-kube-api-access-7wrsk\") pod \"ironic-operator-controller-manager-5bc894d9b-4v86k\" (UID: \"e4a52d95-3fd2-48ad-8d53-cb790dbf34f6\") " pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-4v86k" Mar 14 
08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.091328 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d6fn\" (UniqueName: \"kubernetes.io/projected/84978a63-3814-485e-9902-7d041f79179d-kube-api-access-9d6fn\") pod \"infra-operator-controller-manager-54dc5b8f8d-nh5lk\" (UID: \"84978a63-3814-485e-9902-7d041f79179d\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-nh5lk" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.091354 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqqff\" (UniqueName: \"kubernetes.io/projected/b360b11e-b7a7-4b56-969f-3bef111a22b7-kube-api-access-tqqff\") pod \"horizon-operator-controller-manager-6d9d6b584d-mpz5b\" (UID: \"b360b11e-b7a7-4b56-969f-3bef111a22b7\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-mpz5b" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.091404 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84978a63-3814-485e-9902-7d041f79179d-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-nh5lk\" (UID: \"84978a63-3814-485e-9902-7d041f79179d\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-nh5lk" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.093027 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-68kdn" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.101377 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-q4jxl"] Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.126319 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqqff\" (UniqueName: 
\"kubernetes.io/projected/b360b11e-b7a7-4b56-969f-3bef111a22b7-kube-api-access-tqqff\") pod \"horizon-operator-controller-manager-6d9d6b584d-mpz5b\" (UID: \"b360b11e-b7a7-4b56-969f-3bef111a22b7\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-mpz5b" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.131167 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzxf5\" (UniqueName: \"kubernetes.io/projected/43749f37-8afe-4259-bf55-3e7842b14a14-kube-api-access-nzxf5\") pod \"heat-operator-controller-manager-77b6666d85-8b92g\" (UID: \"43749f37-8afe-4259-bf55-3e7842b14a14\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8b92g" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.133950 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7f84474648-v2dwg"] Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.146419 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-kgkzn"] Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.155532 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-q8d2w"] Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.156746 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-q8d2w" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.160961 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-q95wp" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.167569 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-q8d2w"] Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.170042 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-mpz5b" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.190324 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-vp756"] Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.191183 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-vp756" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.192151 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4fln\" (UniqueName: \"kubernetes.io/projected/243e7e8f-27ed-4052-b11f-51887ee5d8d7-kube-api-access-k4fln\") pod \"keystone-operator-controller-manager-684f77d66d-kc4s8\" (UID: \"243e7e8f-27ed-4052-b11f-51887ee5d8d7\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc4s8" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.192207 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wrsk\" (UniqueName: \"kubernetes.io/projected/e4a52d95-3fd2-48ad-8d53-cb790dbf34f6-kube-api-access-7wrsk\") pod \"ironic-operator-controller-manager-5bc894d9b-4v86k\" (UID: \"e4a52d95-3fd2-48ad-8d53-cb790dbf34f6\") " pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-4v86k" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.192237 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d6fn\" (UniqueName: \"kubernetes.io/projected/84978a63-3814-485e-9902-7d041f79179d-kube-api-access-9d6fn\") pod \"infra-operator-controller-manager-54dc5b8f8d-nh5lk\" (UID: \"84978a63-3814-485e-9902-7d041f79179d\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-nh5lk" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.192261 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh6kz\" (UniqueName: \"kubernetes.io/projected/61eb1055-3710-4b6b-81d2-40206feec055-kube-api-access-sh6kz\") pod \"mariadb-operator-controller-manager-5b6b6b4c9f-q4jxl\" (UID: \"61eb1055-3710-4b6b-81d2-40206feec055\") " pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-q4jxl" Mar 14 08:46:08 crc kubenswrapper[4886]: 
I0314 08:46:08.192299 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpr8h\" (UniqueName: \"kubernetes.io/projected/4ad13bb8-51ec-43a6-b061-8485881110b0-kube-api-access-mpr8h\") pod \"neutron-operator-controller-manager-776c5696bf-kgkzn\" (UID: \"4ad13bb8-51ec-43a6-b061-8485881110b0\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-kgkzn" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.192317 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84978a63-3814-485e-9902-7d041f79179d-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-nh5lk\" (UID: \"84978a63-3814-485e-9902-7d041f79179d\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-nh5lk" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.192344 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm92b\" (UniqueName: \"kubernetes.io/projected/ed888cc0-96e8-4507-9659-1a710d2fcb41-kube-api-access-wm92b\") pod \"manila-operator-controller-manager-57b484b4df-lclvd\" (UID: \"ed888cc0-96e8-4507-9659-1a710d2fcb41\") " pod="openstack-operators/manila-operator-controller-manager-57b484b4df-lclvd" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.192382 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4sc6\" (UniqueName: \"kubernetes.io/projected/8d2bbc98-8080-4e83-b31a-049a347cccb6-kube-api-access-f4sc6\") pod \"nova-operator-controller-manager-7f84474648-v2dwg\" (UID: \"8d2bbc98-8080-4e83-b31a-049a347cccb6\") " pod="openstack-operators/nova-operator-controller-manager-7f84474648-v2dwg" Mar 14 08:46:08 crc kubenswrapper[4886]: E0314 08:46:08.192907 4886 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not 
found Mar 14 08:46:08 crc kubenswrapper[4886]: E0314 08:46:08.192949 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84978a63-3814-485e-9902-7d041f79179d-cert podName:84978a63-3814-485e-9902-7d041f79179d nodeName:}" failed. No retries permitted until 2026-03-14 08:46:08.692935345 +0000 UTC m=+1103.941386982 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84978a63-3814-485e-9902-7d041f79179d-cert") pod "infra-operator-controller-manager-54dc5b8f8d-nh5lk" (UID: "84978a63-3814-485e-9902-7d041f79179d") : secret "infra-operator-webhook-server-cert" not found Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.197460 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-rg95p" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.202607 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-sx68d"] Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.203365 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-sx68d" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.204902 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-4src9" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.211226 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-vp756"] Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.218770 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f9cc5dd44-kh5qj"] Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.219555 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wrsk\" (UniqueName: \"kubernetes.io/projected/e4a52d95-3fd2-48ad-8d53-cb790dbf34f6-kube-api-access-7wrsk\") pod \"ironic-operator-controller-manager-5bc894d9b-4v86k\" (UID: \"e4a52d95-3fd2-48ad-8d53-cb790dbf34f6\") " pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-4v86k" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.219636 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-kh5qj" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.223464 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-dk6gz" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.232616 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4fln\" (UniqueName: \"kubernetes.io/projected/243e7e8f-27ed-4052-b11f-51887ee5d8d7-kube-api-access-k4fln\") pod \"keystone-operator-controller-manager-684f77d66d-kc4s8\" (UID: \"243e7e8f-27ed-4052-b11f-51887ee5d8d7\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc4s8" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.232877 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d6fn\" (UniqueName: \"kubernetes.io/projected/84978a63-3814-485e-9902-7d041f79179d-kube-api-access-9d6fn\") pod \"infra-operator-controller-manager-54dc5b8f8d-nh5lk\" (UID: \"84978a63-3814-485e-9902-7d041f79179d\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-nh5lk" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.239479 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm92b\" (UniqueName: \"kubernetes.io/projected/ed888cc0-96e8-4507-9659-1a710d2fcb41-kube-api-access-wm92b\") pod \"manila-operator-controller-manager-57b484b4df-lclvd\" (UID: \"ed888cc0-96e8-4507-9659-1a710d2fcb41\") " pod="openstack-operators/manila-operator-controller-manager-57b484b4df-lclvd" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.241102 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-sx68d"] Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.282931 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-4v86k" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.297792 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4sc6\" (UniqueName: \"kubernetes.io/projected/8d2bbc98-8080-4e83-b31a-049a347cccb6-kube-api-access-f4sc6\") pod \"nova-operator-controller-manager-7f84474648-v2dwg\" (UID: \"8d2bbc98-8080-4e83-b31a-049a347cccb6\") " pod="openstack-operators/nova-operator-controller-manager-7f84474648-v2dwg" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.297916 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dddjm\" (UniqueName: \"kubernetes.io/projected/b74c6843-f5c8-465e-aa1e-350d6329567d-kube-api-access-dddjm\") pod \"octavia-operator-controller-manager-5f4f55cb5c-q8d2w\" (UID: \"b74c6843-f5c8-465e-aa1e-350d6329567d\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-q8d2w" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.297965 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh6kz\" (UniqueName: \"kubernetes.io/projected/61eb1055-3710-4b6b-81d2-40206feec055-kube-api-access-sh6kz\") pod \"mariadb-operator-controller-manager-5b6b6b4c9f-q4jxl\" (UID: \"61eb1055-3710-4b6b-81d2-40206feec055\") " pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-q4jxl" Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.298038 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crtl8\" (UniqueName: \"kubernetes.io/projected/4c49fd84-e358-4ec4-a05c-8c3728dc1824-kube-api-access-crtl8\") pod \"ovn-operator-controller-manager-bbc5b68f9-vp756\" (UID: \"4c49fd84-e358-4ec4-a05c-8c3728dc1824\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-vp756" Mar 14 08:46:08 crc 
kubenswrapper[4886]: I0314 08:46:08.298094 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpr8h\" (UniqueName: \"kubernetes.io/projected/4ad13bb8-51ec-43a6-b061-8485881110b0-kube-api-access-mpr8h\") pod \"neutron-operator-controller-manager-776c5696bf-kgkzn\" (UID: \"4ad13bb8-51ec-43a6-b061-8485881110b0\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-kgkzn"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.303967 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn"]
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.305762 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.311769 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-5xv69"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.312069 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.315877 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-n42dq"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.322951 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4sc6\" (UniqueName: \"kubernetes.io/projected/8d2bbc98-8080-4e83-b31a-049a347cccb6-kube-api-access-f4sc6\") pod \"nova-operator-controller-manager-7f84474648-v2dwg\" (UID: \"8d2bbc98-8080-4e83-b31a-049a347cccb6\") " pod="openstack-operators/nova-operator-controller-manager-7f84474648-v2dwg"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.326512 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f9cc5dd44-kh5qj"]
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.329138 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh6kz\" (UniqueName: \"kubernetes.io/projected/61eb1055-3710-4b6b-81d2-40206feec055-kube-api-access-sh6kz\") pod \"mariadb-operator-controller-manager-5b6b6b4c9f-q4jxl\" (UID: \"61eb1055-3710-4b6b-81d2-40206feec055\") " pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-q4jxl"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.332330 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpr8h\" (UniqueName: \"kubernetes.io/projected/4ad13bb8-51ec-43a6-b061-8485881110b0-kube-api-access-mpr8h\") pod \"neutron-operator-controller-manager-776c5696bf-kgkzn\" (UID: \"4ad13bb8-51ec-43a6-b061-8485881110b0\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-kgkzn"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.359202 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn"]
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.375328 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc4s8"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.379549 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-ssmck"]
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.382718 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-ssmck"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.388425 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-ssmck"]
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.390059 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-lclvd"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.390817 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-jnbml"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.399084 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvw2n\" (UniqueName: \"kubernetes.io/projected/28c956a0-8e35-4f54-a453-f837ada794c7-kube-api-access-gvw2n\") pod \"placement-operator-controller-manager-574d45c66c-sx68d\" (UID: \"28c956a0-8e35-4f54-a453-f837ada794c7\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-sx68d"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.399187 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dddjm\" (UniqueName: \"kubernetes.io/projected/b74c6843-f5c8-465e-aa1e-350d6329567d-kube-api-access-dddjm\") pod \"octavia-operator-controller-manager-5f4f55cb5c-q8d2w\" (UID: \"b74c6843-f5c8-465e-aa1e-350d6329567d\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-q8d2w"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.399224 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8nh4\" (UniqueName: \"kubernetes.io/projected/d029e96c-9b5c-4095-86bc-bcac7b633fe5-kube-api-access-l8nh4\") pod \"swift-operator-controller-manager-7f9cc5dd44-kh5qj\" (UID: \"d029e96c-9b5c-4095-86bc-bcac7b633fe5\") " pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-kh5qj"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.399266 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crtl8\" (UniqueName: \"kubernetes.io/projected/4c49fd84-e358-4ec4-a05c-8c3728dc1824-kube-api-access-crtl8\") pod \"ovn-operator-controller-manager-bbc5b68f9-vp756\" (UID: \"4c49fd84-e358-4ec4-a05c-8c3728dc1824\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-vp756"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.415902 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-q4jxl"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.429755 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crtl8\" (UniqueName: \"kubernetes.io/projected/4c49fd84-e358-4ec4-a05c-8c3728dc1824-kube-api-access-crtl8\") pod \"ovn-operator-controller-manager-bbc5b68f9-vp756\" (UID: \"4c49fd84-e358-4ec4-a05c-8c3728dc1824\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-vp756"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.430045 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8b92g"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.436206 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-kgkzn"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.445059 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dddjm\" (UniqueName: \"kubernetes.io/projected/b74c6843-f5c8-465e-aa1e-350d6329567d-kube-api-access-dddjm\") pod \"octavia-operator-controller-manager-5f4f55cb5c-q8d2w\" (UID: \"b74c6843-f5c8-465e-aa1e-350d6329567d\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-q8d2w"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.455813 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7f84474648-v2dwg"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.458650 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-f4tbn"]
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.460590 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-f4tbn"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.467895 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-wh9hv"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.470833 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-f4tbn"]
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.491543 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-q8d2w"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.501370 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcrfm\" (UniqueName: \"kubernetes.io/projected/7c968620-eafc-42fe-b2dc-a86b4fa845d5-kube-api-access-pcrfm\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn\" (UID: \"7c968620-eafc-42fe-b2dc-a86b4fa845d5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.501432 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x45zf\" (UniqueName: \"kubernetes.io/projected/28443df6-3421-46cb-9011-d8b47769fbfa-kube-api-access-x45zf\") pod \"telemetry-operator-controller-manager-6854b8b9d9-ssmck\" (UID: \"28443df6-3421-46cb-9011-d8b47769fbfa\") " pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-ssmck"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.501489 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvw2n\" (UniqueName: \"kubernetes.io/projected/28c956a0-8e35-4f54-a453-f837ada794c7-kube-api-access-gvw2n\") pod \"placement-operator-controller-manager-574d45c66c-sx68d\" (UID: \"28c956a0-8e35-4f54-a453-f837ada794c7\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-sx68d"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.501551 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8nh4\" (UniqueName: \"kubernetes.io/projected/d029e96c-9b5c-4095-86bc-bcac7b633fe5-kube-api-access-l8nh4\") pod \"swift-operator-controller-manager-7f9cc5dd44-kh5qj\" (UID: \"d029e96c-9b5c-4095-86bc-bcac7b633fe5\") " pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-kh5qj"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.501580 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c968620-eafc-42fe-b2dc-a86b4fa845d5-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn\" (UID: \"7c968620-eafc-42fe-b2dc-a86b4fa845d5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.503805 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f57d95748-862jt"]
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.506063 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7f57d95748-862jt"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.517233 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f57d95748-862jt"]
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.519921 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-p28lz"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.523052 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvw2n\" (UniqueName: \"kubernetes.io/projected/28c956a0-8e35-4f54-a453-f837ada794c7-kube-api-access-gvw2n\") pod \"placement-operator-controller-manager-574d45c66c-sx68d\" (UID: \"28c956a0-8e35-4f54-a453-f837ada794c7\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-sx68d"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.530192 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk"]
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.531609 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.531966 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8nh4\" (UniqueName: \"kubernetes.io/projected/d029e96c-9b5c-4095-86bc-bcac7b633fe5-kube-api-access-l8nh4\") pod \"swift-operator-controller-manager-7f9cc5dd44-kh5qj\" (UID: \"d029e96c-9b5c-4095-86bc-bcac7b633fe5\") " pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-kh5qj"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.537219 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-2xrbb"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.537513 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.537723 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.542458 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-vp756"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.563557 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk"]
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.571229 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2ghpz"]
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.578142 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2ghpz"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.584024 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-4fn5z"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.592776 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-sx68d"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.598821 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2ghpz"]
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.600298 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-kh5qj"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.605446 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kwzh\" (UniqueName: \"kubernetes.io/projected/79e45051-6db9-4014-98f4-58bbddbb2edc-kube-api-access-7kwzh\") pod \"watcher-operator-controller-manager-7f57d95748-862jt\" (UID: \"79e45051-6db9-4014-98f4-58bbddbb2edc\") " pod="openstack-operators/watcher-operator-controller-manager-7f57d95748-862jt"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.605512 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c968620-eafc-42fe-b2dc-a86b4fa845d5-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn\" (UID: \"7c968620-eafc-42fe-b2dc-a86b4fa845d5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.605594 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcrfm\" (UniqueName: \"kubernetes.io/projected/7c968620-eafc-42fe-b2dc-a86b4fa845d5-kube-api-access-pcrfm\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn\" (UID: \"7c968620-eafc-42fe-b2dc-a86b4fa845d5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.605634 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x45zf\" (UniqueName: \"kubernetes.io/projected/28443df6-3421-46cb-9011-d8b47769fbfa-kube-api-access-x45zf\") pod \"telemetry-operator-controller-manager-6854b8b9d9-ssmck\" (UID: \"28443df6-3421-46cb-9011-d8b47769fbfa\") " pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-ssmck"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.605709 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xsxv\" (UniqueName: \"kubernetes.io/projected/f24830d3-8cff-4071-952d-9065c1c39e4a-kube-api-access-7xsxv\") pod \"test-operator-controller-manager-5c5cb9c4d7-f4tbn\" (UID: \"f24830d3-8cff-4071-952d-9065c1c39e4a\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-f4tbn"
Mar 14 08:46:08 crc kubenswrapper[4886]: E0314 08:46:08.605913 4886 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 14 08:46:08 crc kubenswrapper[4886]: E0314 08:46:08.605989 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c968620-eafc-42fe-b2dc-a86b4fa845d5-cert podName:7c968620-eafc-42fe-b2dc-a86b4fa845d5 nodeName:}" failed. No retries permitted until 2026-03-14 08:46:09.105964694 +0000 UTC m=+1104.354416331 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c968620-eafc-42fe-b2dc-a86b4fa845d5-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn" (UID: "7c968620-eafc-42fe-b2dc-a86b4fa845d5") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.629213 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-d47688694-ch5j6"]
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.634095 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcrfm\" (UniqueName: \"kubernetes.io/projected/7c968620-eafc-42fe-b2dc-a86b4fa845d5-kube-api-access-pcrfm\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn\" (UID: \"7c968620-eafc-42fe-b2dc-a86b4fa845d5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.642211 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x45zf\" (UniqueName: \"kubernetes.io/projected/28443df6-3421-46cb-9011-d8b47769fbfa-kube-api-access-x45zf\") pod \"telemetry-operator-controller-manager-6854b8b9d9-ssmck\" (UID: \"28443df6-3421-46cb-9011-d8b47769fbfa\") " pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-ssmck"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.707101 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cz6g\" (UniqueName: \"kubernetes.io/projected/c521acb7-75ce-466f-90b7-caf5265ed209-kube-api-access-6cz6g\") pod \"openstack-operator-controller-manager-65755f6b77-wdwbk\" (UID: \"c521acb7-75ce-466f-90b7-caf5265ed209\") " pod="openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.707845 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84978a63-3814-485e-9902-7d041f79179d-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-nh5lk\" (UID: \"84978a63-3814-485e-9902-7d041f79179d\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-nh5lk"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.707917 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-webhook-certs\") pod \"openstack-operator-controller-manager-65755f6b77-wdwbk\" (UID: \"c521acb7-75ce-466f-90b7-caf5265ed209\") " pod="openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.707962 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7prp\" (UniqueName: \"kubernetes.io/projected/e579f2af-b976-45fe-ac83-a23c0676eaf2-kube-api-access-w7prp\") pod \"rabbitmq-cluster-operator-manager-668c99d594-2ghpz\" (UID: \"e579f2af-b976-45fe-ac83-a23c0676eaf2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2ghpz"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.707994 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xsxv\" (UniqueName: \"kubernetes.io/projected/f24830d3-8cff-4071-952d-9065c1c39e4a-kube-api-access-7xsxv\") pod \"test-operator-controller-manager-5c5cb9c4d7-f4tbn\" (UID: \"f24830d3-8cff-4071-952d-9065c1c39e4a\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-f4tbn"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.708016 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-metrics-certs\") pod \"openstack-operator-controller-manager-65755f6b77-wdwbk\" (UID: \"c521acb7-75ce-466f-90b7-caf5265ed209\") " pod="openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.708052 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kwzh\" (UniqueName: \"kubernetes.io/projected/79e45051-6db9-4014-98f4-58bbddbb2edc-kube-api-access-7kwzh\") pod \"watcher-operator-controller-manager-7f57d95748-862jt\" (UID: \"79e45051-6db9-4014-98f4-58bbddbb2edc\") " pod="openstack-operators/watcher-operator-controller-manager-7f57d95748-862jt"
Mar 14 08:46:08 crc kubenswrapper[4886]: E0314 08:46:08.708769 4886 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 14 08:46:08 crc kubenswrapper[4886]: E0314 08:46:08.708866 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84978a63-3814-485e-9902-7d041f79179d-cert podName:84978a63-3814-485e-9902-7d041f79179d nodeName:}" failed. No retries permitted until 2026-03-14 08:46:09.708842547 +0000 UTC m=+1104.957294184 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84978a63-3814-485e-9902-7d041f79179d-cert") pod "infra-operator-controller-manager-54dc5b8f8d-nh5lk" (UID: "84978a63-3814-485e-9902-7d041f79179d") : secret "infra-operator-webhook-server-cert" not found
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.730890 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kwzh\" (UniqueName: \"kubernetes.io/projected/79e45051-6db9-4014-98f4-58bbddbb2edc-kube-api-access-7kwzh\") pod \"watcher-operator-controller-manager-7f57d95748-862jt\" (UID: \"79e45051-6db9-4014-98f4-58bbddbb2edc\") " pod="openstack-operators/watcher-operator-controller-manager-7f57d95748-862jt"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.733583 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xsxv\" (UniqueName: \"kubernetes.io/projected/f24830d3-8cff-4071-952d-9065c1c39e4a-kube-api-access-7xsxv\") pod \"test-operator-controller-manager-5c5cb9c4d7-f4tbn\" (UID: \"f24830d3-8cff-4071-952d-9065c1c39e4a\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-f4tbn"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.761171 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-ssmck"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.769483 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-t477h"]
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.783870 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-2knnh"]
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.791527 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-f4tbn"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.809102 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-webhook-certs\") pod \"openstack-operator-controller-manager-65755f6b77-wdwbk\" (UID: \"c521acb7-75ce-466f-90b7-caf5265ed209\") " pod="openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.809185 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7prp\" (UniqueName: \"kubernetes.io/projected/e579f2af-b976-45fe-ac83-a23c0676eaf2-kube-api-access-w7prp\") pod \"rabbitmq-cluster-operator-manager-668c99d594-2ghpz\" (UID: \"e579f2af-b976-45fe-ac83-a23c0676eaf2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2ghpz"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.809226 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-metrics-certs\") pod \"openstack-operator-controller-manager-65755f6b77-wdwbk\" (UID: \"c521acb7-75ce-466f-90b7-caf5265ed209\") " pod="openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk"
Mar 14 08:46:08 crc kubenswrapper[4886]: E0314 08:46:08.809331 4886 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.809359 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cz6g\" (UniqueName: \"kubernetes.io/projected/c521acb7-75ce-466f-90b7-caf5265ed209-kube-api-access-6cz6g\") pod \"openstack-operator-controller-manager-65755f6b77-wdwbk\" (UID: \"c521acb7-75ce-466f-90b7-caf5265ed209\") " pod="openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk"
Mar 14 08:46:08 crc kubenswrapper[4886]: E0314 08:46:08.809424 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-webhook-certs podName:c521acb7-75ce-466f-90b7-caf5265ed209 nodeName:}" failed. No retries permitted until 2026-03-14 08:46:09.309375564 +0000 UTC m=+1104.557827201 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-webhook-certs") pod "openstack-operator-controller-manager-65755f6b77-wdwbk" (UID: "c521acb7-75ce-466f-90b7-caf5265ed209") : secret "webhook-server-cert" not found
Mar 14 08:46:08 crc kubenswrapper[4886]: E0314 08:46:08.809697 4886 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 14 08:46:08 crc kubenswrapper[4886]: E0314 08:46:08.809731 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-metrics-certs podName:c521acb7-75ce-466f-90b7-caf5265ed209 nodeName:}" failed. No retries permitted until 2026-03-14 08:46:09.309721204 +0000 UTC m=+1104.558172831 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-metrics-certs") pod "openstack-operator-controller-manager-65755f6b77-wdwbk" (UID: "c521acb7-75ce-466f-90b7-caf5265ed209") : secret "metrics-server-cert" not found
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.829244 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7f57d95748-862jt"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.859803 4886 scope.go:117] "RemoveContainer" containerID="8d3591afc96490593c6cc94d08d44608aa6a7bfd61ac0a75838464a9c63dcd54"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.861012 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7prp\" (UniqueName: \"kubernetes.io/projected/e579f2af-b976-45fe-ac83-a23c0676eaf2-kube-api-access-w7prp\") pod \"rabbitmq-cluster-operator-manager-668c99d594-2ghpz\" (UID: \"e579f2af-b976-45fe-ac83-a23c0676eaf2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2ghpz"
Mar 14 08:46:08 crc kubenswrapper[4886]: W0314 08:46:08.861365 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ac77887_a632_454d_9460_1150a439045a.slice/crio-1d00c8798a5b2a8cb7e73817467827a4211739918daa819a201a584e97a00df9 WatchSource:0}: Error finding container 1d00c8798a5b2a8cb7e73817467827a4211739918daa819a201a584e97a00df9: Status 404 returned error can't find the container with id 1d00c8798a5b2a8cb7e73817467827a4211739918daa819a201a584e97a00df9
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.875864 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cz6g\" (UniqueName: \"kubernetes.io/projected/c521acb7-75ce-466f-90b7-caf5265ed209-kube-api-access-6cz6g\") pod \"openstack-operator-controller-manager-65755f6b77-wdwbk\" (UID: \"c521acb7-75ce-466f-90b7-caf5265ed209\") " pod="openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk"
Mar 14 08:46:08 crc kubenswrapper[4886]: I0314 08:46:08.925779 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2ghpz"
Mar 14 08:46:09 crc kubenswrapper[4886]: I0314 08:46:09.095952 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-mpz5b"]
Mar 14 08:46:09 crc kubenswrapper[4886]: I0314 08:46:09.114206 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-2knnh" event={"ID":"4ac77887-a632-454d-9460-1150a439045a","Type":"ContainerStarted","Data":"1d00c8798a5b2a8cb7e73817467827a4211739918daa819a201a584e97a00df9"}
Mar 14 08:46:09 crc kubenswrapper[4886]: I0314 08:46:09.114743 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c968620-eafc-42fe-b2dc-a86b4fa845d5-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn\" (UID: \"7c968620-eafc-42fe-b2dc-a86b4fa845d5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn"
Mar 14 08:46:09 crc kubenswrapper[4886]: E0314 08:46:09.114923 4886 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 14 08:46:09 crc kubenswrapper[4886]: E0314 08:46:09.114979 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c968620-eafc-42fe-b2dc-a86b4fa845d5-cert podName:7c968620-eafc-42fe-b2dc-a86b4fa845d5 nodeName:}" failed. No retries permitted until 2026-03-14 08:46:10.114963017 +0000 UTC m=+1105.363414654 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c968620-eafc-42fe-b2dc-a86b4fa845d5-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn" (UID: "7c968620-eafc-42fe-b2dc-a86b4fa845d5") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 14 08:46:09 crc kubenswrapper[4886]: I0314 08:46:09.117300 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-d47688694-ch5j6" event={"ID":"f220ca81-44c2-4dd1-8fff-616ed5060946","Type":"ContainerStarted","Data":"b215b826a6dcc08e5a6713fb2b66b10486389d3327e313cd528133b1ecc10ba3"}
Mar 14 08:46:09 crc kubenswrapper[4886]: I0314 08:46:09.119301 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-t477h" event={"ID":"f9db1eb1-8050-466c-aa04-87ee4dc1479c","Type":"ContainerStarted","Data":"3b13485b5dc926c6e41129df63e5c05a007410b20dbc25501df69aa64e604026"}
Mar 14 08:46:09 crc kubenswrapper[4886]: I0314 08:46:09.195847 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-n42dq"]
Mar 14 08:46:09 crc kubenswrapper[4886]: I0314 08:46:09.201753 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc894d9b-4v86k"]
Mar 14 08:46:09 crc kubenswrapper[4886]: W0314 08:46:09.246768 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e8f3028_4d1b_4b90_910f_84c2f9e72f45.slice/crio-a8594a44fecee4330f0d4d90da024660411df9b2e511a3aee6c8db1fb46d74b9 WatchSource:0}: Error finding container a8594a44fecee4330f0d4d90da024660411df9b2e511a3aee6c8db1fb46d74b9: Status 404 returned error can't find the container with id a8594a44fecee4330f0d4d90da024660411df9b2e511a3aee6c8db1fb46d74b9
Mar 14 08:46:09 crc kubenswrapper[4886]: I0314 08:46:09.322363 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-metrics-certs\") pod \"openstack-operator-controller-manager-65755f6b77-wdwbk\" (UID: \"c521acb7-75ce-466f-90b7-caf5265ed209\") " pod="openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk"
Mar 14 08:46:09 crc kubenswrapper[4886]: I0314 08:46:09.322509 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-webhook-certs\") pod \"openstack-operator-controller-manager-65755f6b77-wdwbk\" (UID: \"c521acb7-75ce-466f-90b7-caf5265ed209\") " pod="openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk"
Mar 14 08:46:09 crc kubenswrapper[4886]: E0314 08:46:09.322714 4886 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 14 08:46:09 crc kubenswrapper[4886]: E0314 08:46:09.322784 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-webhook-certs podName:c521acb7-75ce-466f-90b7-caf5265ed209 nodeName:}" failed. No retries permitted until 2026-03-14 08:46:10.322762644 +0000 UTC m=+1105.571214281 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-webhook-certs") pod "openstack-operator-controller-manager-65755f6b77-wdwbk" (UID: "c521acb7-75ce-466f-90b7-caf5265ed209") : secret "webhook-server-cert" not found
Mar 14 08:46:09 crc kubenswrapper[4886]: E0314 08:46:09.323162 4886 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 14 08:46:09 crc kubenswrapper[4886]: E0314 08:46:09.323249 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-metrics-certs podName:c521acb7-75ce-466f-90b7-caf5265ed209 nodeName:}" failed. No retries permitted until 2026-03-14 08:46:10.323226647 +0000 UTC m=+1105.571678274 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-metrics-certs") pod "openstack-operator-controller-manager-65755f6b77-wdwbk" (UID: "c521acb7-75ce-466f-90b7-caf5265ed209") : secret "metrics-server-cert" not found
Mar 14 08:46:09 crc kubenswrapper[4886]: I0314 08:46:09.390793 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-kc4s8"]
Mar 14 08:46:09 crc kubenswrapper[4886]: I0314 08:46:09.396627 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-kgkzn"]
Mar 14 08:46:09 crc kubenswrapper[4886]: I0314 08:46:09.402660 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-8b92g"]
Mar 14 08:46:09 crc kubenswrapper[4886]: I0314 08:46:09.404993 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-57b484b4df-lclvd"]
Mar 14 08:46:09 crc kubenswrapper[4886]: I0314 08:46:09.658423
4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f9cc5dd44-kh5qj"] Mar 14 08:46:09 crc kubenswrapper[4886]: I0314 08:46:09.681459 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-vp756"] Mar 14 08:46:09 crc kubenswrapper[4886]: I0314 08:46:09.698803 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-q8d2w"] Mar 14 08:46:09 crc kubenswrapper[4886]: W0314 08:46:09.699755 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c49fd84_e358_4ec4_a05c_8c3728dc1824.slice/crio-aa04d1f4abcf895ab9ee9fd19aef201f0e37ce221f1d02ccbe4ce8e716f69835 WatchSource:0}: Error finding container aa04d1f4abcf895ab9ee9fd19aef201f0e37ce221f1d02ccbe4ce8e716f69835: Status 404 returned error can't find the container with id aa04d1f4abcf895ab9ee9fd19aef201f0e37ce221f1d02ccbe4ce8e716f69835 Mar 14 08:46:09 crc kubenswrapper[4886]: I0314 08:46:09.711106 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-sx68d"] Mar 14 08:46:09 crc kubenswrapper[4886]: W0314 08:46:09.712446 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d2bbc98_8080_4e83_b31a_049a347cccb6.slice/crio-f72abbdda0cd6c46926499ad4738a346658446ecc8715f7a55e8942e1da407a5 WatchSource:0}: Error finding container f72abbdda0cd6c46926499ad4738a346658446ecc8715f7a55e8942e1da407a5: Status 404 returned error can't find the container with id f72abbdda0cd6c46926499ad4738a346658446ecc8715f7a55e8942e1da407a5 Mar 14 08:46:09 crc kubenswrapper[4886]: I0314 08:46:09.730610 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/84978a63-3814-485e-9902-7d041f79179d-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-nh5lk\" (UID: \"84978a63-3814-485e-9902-7d041f79179d\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-nh5lk" Mar 14 08:46:09 crc kubenswrapper[4886]: E0314 08:46:09.730827 4886 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 14 08:46:09 crc kubenswrapper[4886]: E0314 08:46:09.730884 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84978a63-3814-485e-9902-7d041f79179d-cert podName:84978a63-3814-485e-9902-7d041f79179d nodeName:}" failed. No retries permitted until 2026-03-14 08:46:11.730864925 +0000 UTC m=+1106.979316572 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84978a63-3814-485e-9902-7d041f79179d-cert") pod "infra-operator-controller-manager-54dc5b8f8d-nh5lk" (UID: "84978a63-3814-485e-9902-7d041f79179d") : secret "infra-operator-webhook-server-cert" not found Mar 14 08:46:09 crc kubenswrapper[4886]: I0314 08:46:09.730916 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-q4jxl"] Mar 14 08:46:09 crc kubenswrapper[4886]: E0314 08:46:09.733712 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gvw2n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-574d45c66c-sx68d_openstack-operators(28c956a0-8e35-4f54-a453-f837ada794c7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 14 08:46:09 crc kubenswrapper[4886]: E0314 08:46:09.733840 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:bbe772fa051f782c9dcc3c34ce43495e1116aa9089a760c10068790baa9b25ff,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f4sc6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-7f84474648-v2dwg_openstack-operators(8d2bbc98-8080-4e83-b31a-049a347cccb6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 14 08:46:09 crc kubenswrapper[4886]: E0314 08:46:09.734834 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-sx68d" podUID="28c956a0-8e35-4f54-a453-f837ada794c7" Mar 14 08:46:09 crc 
kubenswrapper[4886]: E0314 08:46:09.734937 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-7f84474648-v2dwg" podUID="8d2bbc98-8080-4e83-b31a-049a347cccb6" Mar 14 08:46:09 crc kubenswrapper[4886]: I0314 08:46:09.743931 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7f84474648-v2dwg"] Mar 14 08:46:09 crc kubenswrapper[4886]: I0314 08:46:09.827636 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f57d95748-862jt"] Mar 14 08:46:09 crc kubenswrapper[4886]: W0314 08:46:09.836935 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79e45051_6db9_4014_98f4_58bbddbb2edc.slice/crio-8bbb6755a6807a2fce15ac267b05443a45ae14021f0432f9c1a70ca6cd4dd23e WatchSource:0}: Error finding container 8bbb6755a6807a2fce15ac267b05443a45ae14021f0432f9c1a70ca6cd4dd23e: Status 404 returned error can't find the container with id 8bbb6755a6807a2fce15ac267b05443a45ae14021f0432f9c1a70ca6cd4dd23e Mar 14 08:46:09 crc kubenswrapper[4886]: I0314 08:46:09.900259 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2ghpz"] Mar 14 08:46:09 crc kubenswrapper[4886]: I0314 08:46:09.911223 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-f4tbn"] Mar 14 08:46:09 crc kubenswrapper[4886]: W0314 08:46:09.911453 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode579f2af_b976_45fe_ac83_a23c0676eaf2.slice/crio-27fbc42e339cfe472dd41e3709a3f61683dc44adaa50dac46feb588ef9496d03 WatchSource:0}: Error finding container 
27fbc42e339cfe472dd41e3709a3f61683dc44adaa50dac46feb588ef9496d03: Status 404 returned error can't find the container with id 27fbc42e339cfe472dd41e3709a3f61683dc44adaa50dac46feb588ef9496d03 Mar 14 08:46:09 crc kubenswrapper[4886]: E0314 08:46:09.915501 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w7prp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-2ghpz_openstack-operators(e579f2af-b976-45fe-ac83-a23c0676eaf2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 14 08:46:09 crc kubenswrapper[4886]: E0314 08:46:09.917525 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2ghpz" podUID="e579f2af-b976-45fe-ac83-a23c0676eaf2" Mar 14 08:46:09 crc kubenswrapper[4886]: E0314 08:46:09.930948 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:3a0fc90da4caf7412ae01e21542b53a10fe7a2732a705b0ae83f926d72c7332a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x45zf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6854b8b9d9-ssmck_openstack-operators(28443df6-3421-46cb-9011-d8b47769fbfa): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 14 08:46:09 crc kubenswrapper[4886]: E0314 08:46:09.932954 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-ssmck" podUID="28443df6-3421-46cb-9011-d8b47769fbfa" Mar 14 08:46:09 crc kubenswrapper[4886]: I0314 08:46:09.935725 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-ssmck"] Mar 14 08:46:09 crc kubenswrapper[4886]: W0314 08:46:09.940466 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf24830d3_8cff_4071_952d_9065c1c39e4a.slice/crio-437c46cb798786a0a98a275ccdae86702b1047a98cea8ac36429708921229011 WatchSource:0}: Error finding container 437c46cb798786a0a98a275ccdae86702b1047a98cea8ac36429708921229011: Status 404 returned error can't find the container with id 
437c46cb798786a0a98a275ccdae86702b1047a98cea8ac36429708921229011 Mar 14 08:46:10 crc kubenswrapper[4886]: I0314 08:46:10.128224 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-4v86k" event={"ID":"e4a52d95-3fd2-48ad-8d53-cb790dbf34f6","Type":"ContainerStarted","Data":"a5314aa813655c06fe88b697697d183c2109491c284aa0eb4be25350ad449e13"} Mar 14 08:46:10 crc kubenswrapper[4886]: I0314 08:46:10.129417 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc4s8" event={"ID":"243e7e8f-27ed-4052-b11f-51887ee5d8d7","Type":"ContainerStarted","Data":"a4e287ba0613b8b4f685681229ee59b5c0f3be63a2843682ad22435328dd297b"} Mar 14 08:46:10 crc kubenswrapper[4886]: I0314 08:46:10.130917 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-n42dq" event={"ID":"5e8f3028-4d1b-4b90-910f-84c2f9e72f45","Type":"ContainerStarted","Data":"a8594a44fecee4330f0d4d90da024660411df9b2e511a3aee6c8db1fb46d74b9"} Mar 14 08:46:10 crc kubenswrapper[4886]: I0314 08:46:10.131963 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-vp756" event={"ID":"4c49fd84-e358-4ec4-a05c-8c3728dc1824","Type":"ContainerStarted","Data":"aa04d1f4abcf895ab9ee9fd19aef201f0e37ce221f1d02ccbe4ce8e716f69835"} Mar 14 08:46:10 crc kubenswrapper[4886]: I0314 08:46:10.132899 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-mpz5b" event={"ID":"b360b11e-b7a7-4b56-969f-3bef111a22b7","Type":"ContainerStarted","Data":"05974d5a120dd97b6cc2ddd61b99f2df2f9b4a8633ae27c120397e441e6f4415"} Mar 14 08:46:10 crc kubenswrapper[4886]: I0314 08:46:10.134093 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2ghpz" event={"ID":"e579f2af-b976-45fe-ac83-a23c0676eaf2","Type":"ContainerStarted","Data":"27fbc42e339cfe472dd41e3709a3f61683dc44adaa50dac46feb588ef9496d03"} Mar 14 08:46:10 crc kubenswrapper[4886]: I0314 08:46:10.135936 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-q4jxl" event={"ID":"61eb1055-3710-4b6b-81d2-40206feec055","Type":"ContainerStarted","Data":"8747e3ea5caeadfa061b008db7c7ffd3e36607529ec0c3cdbf88156143087b3e"} Mar 14 08:46:10 crc kubenswrapper[4886]: E0314 08:46:10.136058 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2ghpz" podUID="e579f2af-b976-45fe-ac83-a23c0676eaf2" Mar 14 08:46:10 crc kubenswrapper[4886]: I0314 08:46:10.137426 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c968620-eafc-42fe-b2dc-a86b4fa845d5-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn\" (UID: \"7c968620-eafc-42fe-b2dc-a86b4fa845d5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn" Mar 14 08:46:10 crc kubenswrapper[4886]: E0314 08:46:10.137618 4886 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 08:46:10 crc kubenswrapper[4886]: E0314 08:46:10.137681 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c968620-eafc-42fe-b2dc-a86b4fa845d5-cert podName:7c968620-eafc-42fe-b2dc-a86b4fa845d5 nodeName:}" 
failed. No retries permitted until 2026-03-14 08:46:12.137665882 +0000 UTC m=+1107.386117519 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c968620-eafc-42fe-b2dc-a86b4fa845d5-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn" (UID: "7c968620-eafc-42fe-b2dc-a86b4fa845d5") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 08:46:10 crc kubenswrapper[4886]: I0314 08:46:10.138819 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-sx68d" event={"ID":"28c956a0-8e35-4f54-a453-f837ada794c7","Type":"ContainerStarted","Data":"20ff720c1686336ef0b2820d05c621af130b8a5bbdd41bd0bf9d9c25e3eaf8a5"} Mar 14 08:46:10 crc kubenswrapper[4886]: E0314 08:46:10.139890 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-sx68d" podUID="28c956a0-8e35-4f54-a453-f837ada794c7" Mar 14 08:46:10 crc kubenswrapper[4886]: I0314 08:46:10.140183 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-f4tbn" event={"ID":"f24830d3-8cff-4071-952d-9065c1c39e4a","Type":"ContainerStarted","Data":"437c46cb798786a0a98a275ccdae86702b1047a98cea8ac36429708921229011"} Mar 14 08:46:10 crc kubenswrapper[4886]: I0314 08:46:10.141261 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7f84474648-v2dwg" event={"ID":"8d2bbc98-8080-4e83-b31a-049a347cccb6","Type":"ContainerStarted","Data":"f72abbdda0cd6c46926499ad4738a346658446ecc8715f7a55e8942e1da407a5"} Mar 14 08:46:10 crc kubenswrapper[4886]: 
E0314 08:46:10.142623 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:bbe772fa051f782c9dcc3c34ce43495e1116aa9089a760c10068790baa9b25ff\\\"\"" pod="openstack-operators/nova-operator-controller-manager-7f84474648-v2dwg" podUID="8d2bbc98-8080-4e83-b31a-049a347cccb6" Mar 14 08:46:10 crc kubenswrapper[4886]: I0314 08:46:10.143347 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-q8d2w" event={"ID":"b74c6843-f5c8-465e-aa1e-350d6329567d","Type":"ContainerStarted","Data":"e8c90e605bee9d228c51df9e9feb02f0f7b7a649d37aff7aaf36d01903658c58"} Mar 14 08:46:10 crc kubenswrapper[4886]: I0314 08:46:10.150081 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-ssmck" event={"ID":"28443df6-3421-46cb-9011-d8b47769fbfa","Type":"ContainerStarted","Data":"3146c615ff6d8b7ae327fe3cf501b1d9984f4f8f1cfb7208d194cd4bb2abce85"} Mar 14 08:46:10 crc kubenswrapper[4886]: E0314 08:46:10.151614 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:3a0fc90da4caf7412ae01e21542b53a10fe7a2732a705b0ae83f926d72c7332a\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-ssmck" podUID="28443df6-3421-46cb-9011-d8b47769fbfa" Mar 14 08:46:10 crc kubenswrapper[4886]: I0314 08:46:10.151728 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8b92g" event={"ID":"43749f37-8afe-4259-bf55-3e7842b14a14","Type":"ContainerStarted","Data":"e46d397bd42d0892c424af9b7ec5f0be4e2a632b172175000b6a046f6ff7fa6e"} Mar 14 08:46:10 crc kubenswrapper[4886]: I0314 
08:46:10.153671 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-kgkzn" event={"ID":"4ad13bb8-51ec-43a6-b061-8485881110b0","Type":"ContainerStarted","Data":"2d2bb673235256f4a19f192171266f4b6e60f31812260c64e93dc63612434e4e"} Mar 14 08:46:10 crc kubenswrapper[4886]: I0314 08:46:10.154884 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-kh5qj" event={"ID":"d029e96c-9b5c-4095-86bc-bcac7b633fe5","Type":"ContainerStarted","Data":"06c7c72b8288f0105546aacab04a17875fad92524a07af3c4ecd61d09efaf08d"} Mar 14 08:46:10 crc kubenswrapper[4886]: I0314 08:46:10.156159 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7f57d95748-862jt" event={"ID":"79e45051-6db9-4014-98f4-58bbddbb2edc","Type":"ContainerStarted","Data":"8bbb6755a6807a2fce15ac267b05443a45ae14021f0432f9c1a70ca6cd4dd23e"} Mar 14 08:46:10 crc kubenswrapper[4886]: I0314 08:46:10.157490 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-lclvd" event={"ID":"ed888cc0-96e8-4507-9659-1a710d2fcb41","Type":"ContainerStarted","Data":"f7118beb5a76082f13a1f0c65254eb44492a66b4de25be8a795fc2cfeb0de800"} Mar 14 08:46:10 crc kubenswrapper[4886]: I0314 08:46:10.340952 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-webhook-certs\") pod \"openstack-operator-controller-manager-65755f6b77-wdwbk\" (UID: \"c521acb7-75ce-466f-90b7-caf5265ed209\") " pod="openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk" Mar 14 08:46:10 crc kubenswrapper[4886]: I0314 08:46:10.341075 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-metrics-certs\") pod \"openstack-operator-controller-manager-65755f6b77-wdwbk\" (UID: \"c521acb7-75ce-466f-90b7-caf5265ed209\") " pod="openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk" Mar 14 08:46:10 crc kubenswrapper[4886]: E0314 08:46:10.341256 4886 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 14 08:46:10 crc kubenswrapper[4886]: E0314 08:46:10.341359 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-metrics-certs podName:c521acb7-75ce-466f-90b7-caf5265ed209 nodeName:}" failed. No retries permitted until 2026-03-14 08:46:12.341313487 +0000 UTC m=+1107.589765124 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-metrics-certs") pod "openstack-operator-controller-manager-65755f6b77-wdwbk" (UID: "c521acb7-75ce-466f-90b7-caf5265ed209") : secret "metrics-server-cert" not found Mar 14 08:46:10 crc kubenswrapper[4886]: E0314 08:46:10.341201 4886 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 14 08:46:10 crc kubenswrapper[4886]: E0314 08:46:10.341596 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-webhook-certs podName:c521acb7-75ce-466f-90b7-caf5265ed209 nodeName:}" failed. No retries permitted until 2026-03-14 08:46:12.341552304 +0000 UTC m=+1107.590003941 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-webhook-certs") pod "openstack-operator-controller-manager-65755f6b77-wdwbk" (UID: "c521acb7-75ce-466f-90b7-caf5265ed209") : secret "webhook-server-cert" not found Mar 14 08:46:11 crc kubenswrapper[4886]: E0314 08:46:11.167110 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:3a0fc90da4caf7412ae01e21542b53a10fe7a2732a705b0ae83f926d72c7332a\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-ssmck" podUID="28443df6-3421-46cb-9011-d8b47769fbfa" Mar 14 08:46:11 crc kubenswrapper[4886]: E0314 08:46:11.167970 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:bbe772fa051f782c9dcc3c34ce43495e1116aa9089a760c10068790baa9b25ff\\\"\"" pod="openstack-operators/nova-operator-controller-manager-7f84474648-v2dwg" podUID="8d2bbc98-8080-4e83-b31a-049a347cccb6" Mar 14 08:46:11 crc kubenswrapper[4886]: E0314 08:46:11.168442 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-sx68d" podUID="28c956a0-8e35-4f54-a453-f837ada794c7" Mar 14 08:46:11 crc kubenswrapper[4886]: E0314 08:46:11.168506 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2ghpz" podUID="e579f2af-b976-45fe-ac83-a23c0676eaf2" Mar 14 08:46:11 crc kubenswrapper[4886]: I0314 08:46:11.762916 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84978a63-3814-485e-9902-7d041f79179d-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-nh5lk\" (UID: \"84978a63-3814-485e-9902-7d041f79179d\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-nh5lk" Mar 14 08:46:11 crc kubenswrapper[4886]: E0314 08:46:11.763159 4886 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 14 08:46:11 crc kubenswrapper[4886]: E0314 08:46:11.763257 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84978a63-3814-485e-9902-7d041f79179d-cert podName:84978a63-3814-485e-9902-7d041f79179d nodeName:}" failed. No retries permitted until 2026-03-14 08:46:15.763233922 +0000 UTC m=+1111.011685619 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84978a63-3814-485e-9902-7d041f79179d-cert") pod "infra-operator-controller-manager-54dc5b8f8d-nh5lk" (UID: "84978a63-3814-485e-9902-7d041f79179d") : secret "infra-operator-webhook-server-cert" not found Mar 14 08:46:12 crc kubenswrapper[4886]: I0314 08:46:12.168757 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c968620-eafc-42fe-b2dc-a86b4fa845d5-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn\" (UID: \"7c968620-eafc-42fe-b2dc-a86b4fa845d5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn" Mar 14 08:46:12 crc kubenswrapper[4886]: E0314 08:46:12.168984 4886 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 08:46:12 crc kubenswrapper[4886]: E0314 08:46:12.169108 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c968620-eafc-42fe-b2dc-a86b4fa845d5-cert podName:7c968620-eafc-42fe-b2dc-a86b4fa845d5 nodeName:}" failed. No retries permitted until 2026-03-14 08:46:16.169090132 +0000 UTC m=+1111.417541769 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c968620-eafc-42fe-b2dc-a86b4fa845d5-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn" (UID: "7c968620-eafc-42fe-b2dc-a86b4fa845d5") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 08:46:12 crc kubenswrapper[4886]: I0314 08:46:12.372643 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-webhook-certs\") pod \"openstack-operator-controller-manager-65755f6b77-wdwbk\" (UID: \"c521acb7-75ce-466f-90b7-caf5265ed209\") " pod="openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk" Mar 14 08:46:12 crc kubenswrapper[4886]: I0314 08:46:12.372711 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-metrics-certs\") pod \"openstack-operator-controller-manager-65755f6b77-wdwbk\" (UID: \"c521acb7-75ce-466f-90b7-caf5265ed209\") " pod="openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk" Mar 14 08:46:12 crc kubenswrapper[4886]: E0314 08:46:12.372826 4886 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 14 08:46:12 crc kubenswrapper[4886]: E0314 08:46:12.372878 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-metrics-certs podName:c521acb7-75ce-466f-90b7-caf5265ed209 nodeName:}" failed. No retries permitted until 2026-03-14 08:46:16.372862851 +0000 UTC m=+1111.621314488 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-metrics-certs") pod "openstack-operator-controller-manager-65755f6b77-wdwbk" (UID: "c521acb7-75ce-466f-90b7-caf5265ed209") : secret "metrics-server-cert" not found Mar 14 08:46:12 crc kubenswrapper[4886]: E0314 08:46:12.375574 4886 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 14 08:46:12 crc kubenswrapper[4886]: E0314 08:46:12.375758 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-webhook-certs podName:c521acb7-75ce-466f-90b7-caf5265ed209 nodeName:}" failed. No retries permitted until 2026-03-14 08:46:16.375719663 +0000 UTC m=+1111.624171330 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-webhook-certs") pod "openstack-operator-controller-manager-65755f6b77-wdwbk" (UID: "c521acb7-75ce-466f-90b7-caf5265ed209") : secret "webhook-server-cert" not found Mar 14 08:46:15 crc kubenswrapper[4886]: I0314 08:46:15.828730 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84978a63-3814-485e-9902-7d041f79179d-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-nh5lk\" (UID: \"84978a63-3814-485e-9902-7d041f79179d\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-nh5lk" Mar 14 08:46:15 crc kubenswrapper[4886]: E0314 08:46:15.828975 4886 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 14 08:46:15 crc kubenswrapper[4886]: E0314 08:46:15.829286 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84978a63-3814-485e-9902-7d041f79179d-cert 
podName:84978a63-3814-485e-9902-7d041f79179d nodeName:}" failed. No retries permitted until 2026-03-14 08:46:23.829258444 +0000 UTC m=+1119.077710081 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84978a63-3814-485e-9902-7d041f79179d-cert") pod "infra-operator-controller-manager-54dc5b8f8d-nh5lk" (UID: "84978a63-3814-485e-9902-7d041f79179d") : secret "infra-operator-webhook-server-cert" not found Mar 14 08:46:16 crc kubenswrapper[4886]: I0314 08:46:16.234963 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c968620-eafc-42fe-b2dc-a86b4fa845d5-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn\" (UID: \"7c968620-eafc-42fe-b2dc-a86b4fa845d5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn" Mar 14 08:46:16 crc kubenswrapper[4886]: E0314 08:46:16.235208 4886 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 08:46:16 crc kubenswrapper[4886]: E0314 08:46:16.235253 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c968620-eafc-42fe-b2dc-a86b4fa845d5-cert podName:7c968620-eafc-42fe-b2dc-a86b4fa845d5 nodeName:}" failed. No retries permitted until 2026-03-14 08:46:24.235239807 +0000 UTC m=+1119.483691444 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c968620-eafc-42fe-b2dc-a86b4fa845d5-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn" (UID: "7c968620-eafc-42fe-b2dc-a86b4fa845d5") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 08:46:16 crc kubenswrapper[4886]: I0314 08:46:16.438269 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-metrics-certs\") pod \"openstack-operator-controller-manager-65755f6b77-wdwbk\" (UID: \"c521acb7-75ce-466f-90b7-caf5265ed209\") " pod="openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk" Mar 14 08:46:16 crc kubenswrapper[4886]: I0314 08:46:16.438456 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-webhook-certs\") pod \"openstack-operator-controller-manager-65755f6b77-wdwbk\" (UID: \"c521acb7-75ce-466f-90b7-caf5265ed209\") " pod="openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk" Mar 14 08:46:16 crc kubenswrapper[4886]: E0314 08:46:16.438595 4886 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 14 08:46:16 crc kubenswrapper[4886]: E0314 08:46:16.438588 4886 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 14 08:46:16 crc kubenswrapper[4886]: E0314 08:46:16.438651 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-webhook-certs podName:c521acb7-75ce-466f-90b7-caf5265ed209 nodeName:}" failed. No retries permitted until 2026-03-14 08:46:24.438635306 +0000 UTC m=+1119.687086943 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-webhook-certs") pod "openstack-operator-controller-manager-65755f6b77-wdwbk" (UID: "c521acb7-75ce-466f-90b7-caf5265ed209") : secret "webhook-server-cert" not found Mar 14 08:46:16 crc kubenswrapper[4886]: E0314 08:46:16.438725 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-metrics-certs podName:c521acb7-75ce-466f-90b7-caf5265ed209 nodeName:}" failed. No retries permitted until 2026-03-14 08:46:24.438694477 +0000 UTC m=+1119.687146114 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-metrics-certs") pod "openstack-operator-controller-manager-65755f6b77-wdwbk" (UID: "c521acb7-75ce-466f-90b7-caf5265ed209") : secret "metrics-server-cert" not found Mar 14 08:46:19 crc kubenswrapper[4886]: E0314 08:46:19.823411 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:a3bc074ddd9a26d3a8609e5dbdfa85a78449ba1c9b5542bff9949219d6760e60" Mar 14 08:46:19 crc kubenswrapper[4886]: E0314 08:46:19.824112 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:a3bc074ddd9a26d3a8609e5dbdfa85a78449ba1c9b5542bff9949219d6760e60,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-62j9g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-5964f64c48-2knnh_openstack-operators(4ac77887-a632-454d-9460-1150a439045a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 08:46:19 crc kubenswrapper[4886]: E0314 08:46:19.825730 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-2knnh" podUID="4ac77887-a632-454d-9460-1150a439045a" Mar 14 08:46:20 crc kubenswrapper[4886]: E0314 08:46:20.249233 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:a3bc074ddd9a26d3a8609e5dbdfa85a78449ba1c9b5542bff9949219d6760e60\\\"\"" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-2knnh" podUID="4ac77887-a632-454d-9460-1150a439045a" Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.260996 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-kh5qj" event={"ID":"d029e96c-9b5c-4095-86bc-bcac7b633fe5","Type":"ContainerStarted","Data":"6a593b18beaf05ad9df89ac4f96f961dd4d531e790904c37588fd951d0f18696"} Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.261545 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-kh5qj" Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.262306 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7f57d95748-862jt" event={"ID":"79e45051-6db9-4014-98f4-58bbddbb2edc","Type":"ContainerStarted","Data":"6adf54aa4b63c7c7807e1801bed81a232bb3cacd2f519542edc3a652223e275f"} Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.262425 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-7f57d95748-862jt" Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.263641 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-kgkzn" event={"ID":"4ad13bb8-51ec-43a6-b061-8485881110b0","Type":"ContainerStarted","Data":"79b75ede6ed4cb969a41f686ecfd019748f9f94f75c6b4ec437d810791be7d3d"} Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.263757 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-kgkzn" Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.264791 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-lclvd" event={"ID":"ed888cc0-96e8-4507-9659-1a710d2fcb41","Type":"ContainerStarted","Data":"def2ef93285fa07d3a15614944249b9cfe9dfe52593c9201bcd8f650dd3f07d5"} Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.264852 4886 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-lclvd" Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.266161 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-d47688694-ch5j6" event={"ID":"f220ca81-44c2-4dd1-8fff-616ed5060946","Type":"ContainerStarted","Data":"43c44fad9cefbbc6f91f66a3bdd10e59031e339ca0abc4d8cc70d1b61541e1c0"} Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.266285 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-d47688694-ch5j6" Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.267353 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-4v86k" event={"ID":"e4a52d95-3fd2-48ad-8d53-cb790dbf34f6","Type":"ContainerStarted","Data":"40b7d0fc361fd832997e387b9932d22b4f905889b1f452402d4c50d95c244f48"} Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.267478 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-4v86k" Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.269268 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-n42dq" event={"ID":"5e8f3028-4d1b-4b90-910f-84c2f9e72f45","Type":"ContainerStarted","Data":"243974d66dd1b7f45267492b613d60e6c889d4163a808cdc025c41e406f6d66b"} Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.269387 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-n42dq" Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.270662 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8b92g" 
event={"ID":"43749f37-8afe-4259-bf55-3e7842b14a14","Type":"ContainerStarted","Data":"7f4545b8b9eb67df00178f52a5bb29b8a490e39d07f5cea156a10695ef7ff035"} Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.270794 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8b92g" Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.271776 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-q8d2w" event={"ID":"b74c6843-f5c8-465e-aa1e-350d6329567d","Type":"ContainerStarted","Data":"98bbf1f0b7d52db7ea48c81a2182ea320d50487cde1fe7f52f03f1a4dd571dfb"} Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.271942 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-q8d2w" Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.273027 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-q4jxl" event={"ID":"61eb1055-3710-4b6b-81d2-40206feec055","Type":"ContainerStarted","Data":"decbed0ac1fde99c5d7467deb8bdf7d8ed1d631b7dd9739ed59fa2142483c12e"} Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.273091 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-q4jxl" Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.274366 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-f4tbn" event={"ID":"f24830d3-8cff-4071-952d-9065c1c39e4a","Type":"ContainerStarted","Data":"cbcd27efce21e8adad51a5038872567d2dcb14e8b43991b9538a32ba6b1e0018"} Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.274455 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-f4tbn" Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.275576 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-vp756" event={"ID":"4c49fd84-e358-4ec4-a05c-8c3728dc1824","Type":"ContainerStarted","Data":"01c5d89fdf76b5998ae122cd65ef4af8d29a4b3cb5c63d2aff5b25dc72d71776"} Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.275642 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-vp756" Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.277225 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-mpz5b" event={"ID":"b360b11e-b7a7-4b56-969f-3bef111a22b7","Type":"ContainerStarted","Data":"a489b39fc32838f2ad88be2522129c5e9ca060bfb2b8f38266c48eaba90ce16b"} Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.277334 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-mpz5b" Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.278503 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc4s8" event={"ID":"243e7e8f-27ed-4052-b11f-51887ee5d8d7","Type":"ContainerStarted","Data":"b436000e1db2c2715c39de99a0dce42e58d9e75dd744fa465ef30cdd5d37109c"} Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.278609 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc4s8" Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.279751 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-t477h" 
event={"ID":"f9db1eb1-8050-466c-aa04-87ee4dc1479c","Type":"ContainerStarted","Data":"a4b6fd00a3490ba7f0ae1ed21884fb691397934ad7c9c053eda2bcebc614e4aa"} Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.279882 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-t477h" Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.299561 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-kh5qj" podStartSLOduration=3.577676102 podStartE2EDuration="15.299542729s" podCreationTimestamp="2026-03-14 08:46:07 +0000 UTC" firstStartedPulling="2026-03-14 08:46:09.674093295 +0000 UTC m=+1104.922544942" lastFinishedPulling="2026-03-14 08:46:21.395959932 +0000 UTC m=+1116.644411569" observedRunningTime="2026-03-14 08:46:22.298759786 +0000 UTC m=+1117.547211423" watchObservedRunningTime="2026-03-14 08:46:22.299542729 +0000 UTC m=+1117.547994366" Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.336811 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-lclvd" podStartSLOduration=3.370081197 podStartE2EDuration="15.336794987s" podCreationTimestamp="2026-03-14 08:46:07 +0000 UTC" firstStartedPulling="2026-03-14 08:46:09.430800126 +0000 UTC m=+1104.679251763" lastFinishedPulling="2026-03-14 08:46:21.397513916 +0000 UTC m=+1116.645965553" observedRunningTime="2026-03-14 08:46:22.33057418 +0000 UTC m=+1117.579025817" watchObservedRunningTime="2026-03-14 08:46:22.336794987 +0000 UTC m=+1117.585246624" Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.460036 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-4v86k" podStartSLOduration=3.36727069 podStartE2EDuration="15.460011964s" podCreationTimestamp="2026-03-14 
08:46:07 +0000 UTC" firstStartedPulling="2026-03-14 08:46:09.278135398 +0000 UTC m=+1104.526587035" lastFinishedPulling="2026-03-14 08:46:21.370876672 +0000 UTC m=+1116.619328309" observedRunningTime="2026-03-14 08:46:22.384008102 +0000 UTC m=+1117.632459739" watchObservedRunningTime="2026-03-14 08:46:22.460011964 +0000 UTC m=+1117.708463601" Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.506840 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-n42dq" podStartSLOduration=3.3678593660000002 podStartE2EDuration="15.506822918s" podCreationTimestamp="2026-03-14 08:46:07 +0000 UTC" firstStartedPulling="2026-03-14 08:46:09.257187705 +0000 UTC m=+1104.505639342" lastFinishedPulling="2026-03-14 08:46:21.396151257 +0000 UTC m=+1116.644602894" observedRunningTime="2026-03-14 08:46:22.463544565 +0000 UTC m=+1117.711996202" watchObservedRunningTime="2026-03-14 08:46:22.506822918 +0000 UTC m=+1117.755274555" Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.510078 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-kgkzn" podStartSLOduration=3.627698782 podStartE2EDuration="15.510066051s" podCreationTimestamp="2026-03-14 08:46:07 +0000 UTC" firstStartedPulling="2026-03-14 08:46:09.432275499 +0000 UTC m=+1104.680727136" lastFinishedPulling="2026-03-14 08:46:21.314642778 +0000 UTC m=+1116.563094405" observedRunningTime="2026-03-14 08:46:22.502630517 +0000 UTC m=+1117.751082144" watchObservedRunningTime="2026-03-14 08:46:22.510066051 +0000 UTC m=+1117.758517688" Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.547320 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc4s8" podStartSLOduration=3.651335009 podStartE2EDuration="15.54730367s" podCreationTimestamp="2026-03-14 
08:46:07 +0000 UTC" firstStartedPulling="2026-03-14 08:46:09.419041977 +0000 UTC m=+1104.667493614" lastFinishedPulling="2026-03-14 08:46:21.315010638 +0000 UTC m=+1116.563462275" observedRunningTime="2026-03-14 08:46:22.538134487 +0000 UTC m=+1117.786586124" watchObservedRunningTime="2026-03-14 08:46:22.54730367 +0000 UTC m=+1117.795755307" Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.595364 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-q8d2w" podStartSLOduration=3.882498341 podStartE2EDuration="15.595349629s" podCreationTimestamp="2026-03-14 08:46:07 +0000 UTC" firstStartedPulling="2026-03-14 08:46:09.705373273 +0000 UTC m=+1104.953824910" lastFinishedPulling="2026-03-14 08:46:21.418224561 +0000 UTC m=+1116.666676198" observedRunningTime="2026-03-14 08:46:22.59224046 +0000 UTC m=+1117.840692107" watchObservedRunningTime="2026-03-14 08:46:22.595349629 +0000 UTC m=+1117.843801266" Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.633965 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-vp756" podStartSLOduration=3.96086105 podStartE2EDuration="15.633946547s" podCreationTimestamp="2026-03-14 08:46:07 +0000 UTC" firstStartedPulling="2026-03-14 08:46:09.70563152 +0000 UTC m=+1104.954083157" lastFinishedPulling="2026-03-14 08:46:21.378717017 +0000 UTC m=+1116.627168654" observedRunningTime="2026-03-14 08:46:22.629823088 +0000 UTC m=+1117.878274725" watchObservedRunningTime="2026-03-14 08:46:22.633946547 +0000 UTC m=+1117.882398184" Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.719406 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8b92g" podStartSLOduration=3.770693185 podStartE2EDuration="15.719390359s" podCreationTimestamp="2026-03-14 08:46:07 +0000 UTC" 
firstStartedPulling="2026-03-14 08:46:09.430355773 +0000 UTC m=+1104.678807400" lastFinishedPulling="2026-03-14 08:46:21.379052937 +0000 UTC m=+1116.627504574" observedRunningTime="2026-03-14 08:46:22.680826942 +0000 UTC m=+1117.929278579" watchObservedRunningTime="2026-03-14 08:46:22.719390359 +0000 UTC m=+1117.967841996" Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.720157 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-mpz5b" podStartSLOduration=3.596133886 podStartE2EDuration="15.720151741s" podCreationTimestamp="2026-03-14 08:46:07 +0000 UTC" firstStartedPulling="2026-03-14 08:46:09.246781865 +0000 UTC m=+1104.495233502" lastFinishedPulling="2026-03-14 08:46:21.37079972 +0000 UTC m=+1116.619251357" observedRunningTime="2026-03-14 08:46:22.712375368 +0000 UTC m=+1117.960827005" watchObservedRunningTime="2026-03-14 08:46:22.720151741 +0000 UTC m=+1117.968603378" Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.808628 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-q4jxl" podStartSLOduration=4.154475657 podStartE2EDuration="15.80860732s" podCreationTimestamp="2026-03-14 08:46:07 +0000 UTC" firstStartedPulling="2026-03-14 08:46:09.710606313 +0000 UTC m=+1104.959057950" lastFinishedPulling="2026-03-14 08:46:21.364737976 +0000 UTC m=+1116.613189613" observedRunningTime="2026-03-14 08:46:22.805529662 +0000 UTC m=+1118.053981299" watchObservedRunningTime="2026-03-14 08:46:22.80860732 +0000 UTC m=+1118.057058957" Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.809833 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-d47688694-ch5j6" podStartSLOduration=3.299744805 podStartE2EDuration="15.809824225s" podCreationTimestamp="2026-03-14 08:46:07 +0000 UTC" 
firstStartedPulling="2026-03-14 08:46:08.734466656 +0000 UTC m=+1103.982918283" lastFinishedPulling="2026-03-14 08:46:21.244546066 +0000 UTC m=+1116.492997703" observedRunningTime="2026-03-14 08:46:22.770246609 +0000 UTC m=+1118.018698246" watchObservedRunningTime="2026-03-14 08:46:22.809824225 +0000 UTC m=+1118.058275862" Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.867828 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-t477h" podStartSLOduration=3.398406259 podStartE2EDuration="15.86780996s" podCreationTimestamp="2026-03-14 08:46:07 +0000 UTC" firstStartedPulling="2026-03-14 08:46:08.846814162 +0000 UTC m=+1104.095265799" lastFinishedPulling="2026-03-14 08:46:21.316217863 +0000 UTC m=+1116.564669500" observedRunningTime="2026-03-14 08:46:22.839818976 +0000 UTC m=+1118.088270613" watchObservedRunningTime="2026-03-14 08:46:22.86780996 +0000 UTC m=+1118.116261587" Mar 14 08:46:22 crc kubenswrapper[4886]: I0314 08:46:22.869603 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-f4tbn" podStartSLOduration=3.403273169 podStartE2EDuration="14.869598401s" podCreationTimestamp="2026-03-14 08:46:08 +0000 UTC" firstStartedPulling="2026-03-14 08:46:09.9425161 +0000 UTC m=+1105.190967737" lastFinishedPulling="2026-03-14 08:46:21.408841332 +0000 UTC m=+1116.657292969" observedRunningTime="2026-03-14 08:46:22.865907775 +0000 UTC m=+1118.114359412" watchObservedRunningTime="2026-03-14 08:46:22.869598401 +0000 UTC m=+1118.118050038" Mar 14 08:46:23 crc kubenswrapper[4886]: I0314 08:46:23.855553 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84978a63-3814-485e-9902-7d041f79179d-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-nh5lk\" (UID: \"84978a63-3814-485e-9902-7d041f79179d\") " 
pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-nh5lk" Mar 14 08:46:23 crc kubenswrapper[4886]: E0314 08:46:23.855723 4886 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 14 08:46:23 crc kubenswrapper[4886]: E0314 08:46:23.855788 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84978a63-3814-485e-9902-7d041f79179d-cert podName:84978a63-3814-485e-9902-7d041f79179d nodeName:}" failed. No retries permitted until 2026-03-14 08:46:39.855757958 +0000 UTC m=+1135.104209595 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84978a63-3814-485e-9902-7d041f79179d-cert") pod "infra-operator-controller-manager-54dc5b8f8d-nh5lk" (UID: "84978a63-3814-485e-9902-7d041f79179d") : secret "infra-operator-webhook-server-cert" not found Mar 14 08:46:24 crc kubenswrapper[4886]: I0314 08:46:24.261056 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c968620-eafc-42fe-b2dc-a86b4fa845d5-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn\" (UID: \"7c968620-eafc-42fe-b2dc-a86b4fa845d5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn" Mar 14 08:46:24 crc kubenswrapper[4886]: E0314 08:46:24.261273 4886 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 08:46:24 crc kubenswrapper[4886]: E0314 08:46:24.261460 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c968620-eafc-42fe-b2dc-a86b4fa845d5-cert podName:7c968620-eafc-42fe-b2dc-a86b4fa845d5 nodeName:}" failed. No retries permitted until 2026-03-14 08:46:40.261444023 +0000 UTC m=+1135.509895660 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c968620-eafc-42fe-b2dc-a86b4fa845d5-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn" (UID: "7c968620-eafc-42fe-b2dc-a86b4fa845d5") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 14 08:46:24 crc kubenswrapper[4886]: I0314 08:46:24.462961 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-webhook-certs\") pod \"openstack-operator-controller-manager-65755f6b77-wdwbk\" (UID: \"c521acb7-75ce-466f-90b7-caf5265ed209\") " pod="openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk" Mar 14 08:46:24 crc kubenswrapper[4886]: I0314 08:46:24.463045 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-metrics-certs\") pod \"openstack-operator-controller-manager-65755f6b77-wdwbk\" (UID: \"c521acb7-75ce-466f-90b7-caf5265ed209\") " pod="openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk" Mar 14 08:46:24 crc kubenswrapper[4886]: E0314 08:46:24.463339 4886 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 14 08:46:24 crc kubenswrapper[4886]: E0314 08:46:24.463408 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-metrics-certs podName:c521acb7-75ce-466f-90b7-caf5265ed209 nodeName:}" failed. No retries permitted until 2026-03-14 08:46:40.46338514 +0000 UTC m=+1135.711836777 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-metrics-certs") pod "openstack-operator-controller-manager-65755f6b77-wdwbk" (UID: "c521acb7-75ce-466f-90b7-caf5265ed209") : secret "metrics-server-cert" not found Mar 14 08:46:24 crc kubenswrapper[4886]: E0314 08:46:24.464148 4886 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 14 08:46:24 crc kubenswrapper[4886]: E0314 08:46:24.464235 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-webhook-certs podName:c521acb7-75ce-466f-90b7-caf5265ed209 nodeName:}" failed. No retries permitted until 2026-03-14 08:46:40.464214034 +0000 UTC m=+1135.712665731 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-webhook-certs") pod "openstack-operator-controller-manager-65755f6b77-wdwbk" (UID: "c521acb7-75ce-466f-90b7-caf5265ed209") : secret "webhook-server-cert" not found Mar 14 08:46:26 crc kubenswrapper[4886]: I0314 08:46:26.068737 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:46:26 crc kubenswrapper[4886]: I0314 08:46:26.069135 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:46:27 crc kubenswrapper[4886]: I0314 08:46:27.333087 4886 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/nova-operator-controller-manager-7f84474648-v2dwg" event={"ID":"8d2bbc98-8080-4e83-b31a-049a347cccb6","Type":"ContainerStarted","Data":"8f87eb646de1405fc0af6a9601c44ada55c8ddb69dec6d0dcb76de10f5d9d326"} Mar 14 08:46:27 crc kubenswrapper[4886]: I0314 08:46:27.334135 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-7f84474648-v2dwg" Mar 14 08:46:27 crc kubenswrapper[4886]: I0314 08:46:27.335979 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-sx68d" event={"ID":"28c956a0-8e35-4f54-a453-f837ada794c7","Type":"ContainerStarted","Data":"a04f572e40260af4bc0941e3931bed64c7e07fa3237c36881813bba2c56631f3"} Mar 14 08:46:27 crc kubenswrapper[4886]: I0314 08:46:27.336457 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-sx68d" Mar 14 08:46:27 crc kubenswrapper[4886]: I0314 08:46:27.348439 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-7f57d95748-862jt" podStartSLOduration=7.724836097 podStartE2EDuration="19.348423763s" podCreationTimestamp="2026-03-14 08:46:08 +0000 UTC" firstStartedPulling="2026-03-14 08:46:09.84078949 +0000 UTC m=+1105.089241127" lastFinishedPulling="2026-03-14 08:46:21.464377146 +0000 UTC m=+1116.712828793" observedRunningTime="2026-03-14 08:46:22.898937283 +0000 UTC m=+1118.147388920" watchObservedRunningTime="2026-03-14 08:46:27.348423763 +0000 UTC m=+1122.596875400" Mar 14 08:46:27 crc kubenswrapper[4886]: I0314 08:46:27.353844 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-7f84474648-v2dwg" podStartSLOduration=3.565036759 podStartE2EDuration="20.353832568s" podCreationTimestamp="2026-03-14 08:46:07 
+0000 UTC" firstStartedPulling="2026-03-14 08:46:09.733769428 +0000 UTC m=+1104.982221065" lastFinishedPulling="2026-03-14 08:46:26.522565237 +0000 UTC m=+1121.771016874" observedRunningTime="2026-03-14 08:46:27.349168264 +0000 UTC m=+1122.597619911" watchObservedRunningTime="2026-03-14 08:46:27.353832568 +0000 UTC m=+1122.602284205" Mar 14 08:46:27 crc kubenswrapper[4886]: I0314 08:46:27.368925 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-sx68d" podStartSLOduration=3.5858552870000002 podStartE2EDuration="20.368906871s" podCreationTimestamp="2026-03-14 08:46:07 +0000 UTC" firstStartedPulling="2026-03-14 08:46:09.733589803 +0000 UTC m=+1104.982041440" lastFinishedPulling="2026-03-14 08:46:26.516641387 +0000 UTC m=+1121.765093024" observedRunningTime="2026-03-14 08:46:27.366402339 +0000 UTC m=+1122.614853966" watchObservedRunningTime="2026-03-14 08:46:27.368906871 +0000 UTC m=+1122.617358508" Mar 14 08:46:27 crc kubenswrapper[4886]: I0314 08:46:27.973803 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-d47688694-ch5j6" Mar 14 08:46:28 crc kubenswrapper[4886]: I0314 08:46:28.007964 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-t477h" Mar 14 08:46:28 crc kubenswrapper[4886]: I0314 08:46:28.173806 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-mpz5b" Mar 14 08:46:28 crc kubenswrapper[4886]: I0314 08:46:28.286278 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-4v86k" Mar 14 08:46:28 crc kubenswrapper[4886]: I0314 08:46:28.319790 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-n42dq" Mar 14 08:46:28 crc kubenswrapper[4886]: I0314 08:46:28.389650 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc4s8" Mar 14 08:46:28 crc kubenswrapper[4886]: I0314 08:46:28.400501 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-lclvd" Mar 14 08:46:28 crc kubenswrapper[4886]: I0314 08:46:28.420741 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-q4jxl" Mar 14 08:46:28 crc kubenswrapper[4886]: I0314 08:46:28.434595 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-8b92g" Mar 14 08:46:28 crc kubenswrapper[4886]: I0314 08:46:28.441352 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-kgkzn" Mar 14 08:46:28 crc kubenswrapper[4886]: I0314 08:46:28.495704 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-q8d2w" Mar 14 08:46:28 crc kubenswrapper[4886]: I0314 08:46:28.550610 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-vp756" Mar 14 08:46:28 crc kubenswrapper[4886]: I0314 08:46:28.619557 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-kh5qj" Mar 14 08:46:28 crc kubenswrapper[4886]: I0314 08:46:28.795477 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-f4tbn" Mar 14 08:46:28 crc kubenswrapper[4886]: I0314 08:46:28.834734 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-7f57d95748-862jt" Mar 14 08:46:38 crc kubenswrapper[4886]: I0314 08:46:38.458443 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-7f84474648-v2dwg" Mar 14 08:46:38 crc kubenswrapper[4886]: I0314 08:46:38.596979 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-sx68d" Mar 14 08:46:39 crc kubenswrapper[4886]: I0314 08:46:39.910282 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84978a63-3814-485e-9902-7d041f79179d-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-nh5lk\" (UID: \"84978a63-3814-485e-9902-7d041f79179d\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-nh5lk" Mar 14 08:46:39 crc kubenswrapper[4886]: I0314 08:46:39.917785 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84978a63-3814-485e-9902-7d041f79179d-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-nh5lk\" (UID: \"84978a63-3814-485e-9902-7d041f79179d\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-nh5lk" Mar 14 08:46:40 crc kubenswrapper[4886]: I0314 08:46:40.008670 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-nh5lk" Mar 14 08:46:40 crc kubenswrapper[4886]: I0314 08:46:40.316466 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c968620-eafc-42fe-b2dc-a86b4fa845d5-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn\" (UID: \"7c968620-eafc-42fe-b2dc-a86b4fa845d5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn" Mar 14 08:46:40 crc kubenswrapper[4886]: I0314 08:46:40.325908 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c968620-eafc-42fe-b2dc-a86b4fa845d5-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn\" (UID: \"7c968620-eafc-42fe-b2dc-a86b4fa845d5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn" Mar 14 08:46:40 crc kubenswrapper[4886]: I0314 08:46:40.463344 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn" Mar 14 08:46:40 crc kubenswrapper[4886]: I0314 08:46:40.521143 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-webhook-certs\") pod \"openstack-operator-controller-manager-65755f6b77-wdwbk\" (UID: \"c521acb7-75ce-466f-90b7-caf5265ed209\") " pod="openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk" Mar 14 08:46:40 crc kubenswrapper[4886]: I0314 08:46:40.521255 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-metrics-certs\") pod \"openstack-operator-controller-manager-65755f6b77-wdwbk\" (UID: \"c521acb7-75ce-466f-90b7-caf5265ed209\") " pod="openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk" Mar 14 08:46:40 crc kubenswrapper[4886]: I0314 08:46:40.526921 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-metrics-certs\") pod \"openstack-operator-controller-manager-65755f6b77-wdwbk\" (UID: \"c521acb7-75ce-466f-90b7-caf5265ed209\") " pod="openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk" Mar 14 08:46:40 crc kubenswrapper[4886]: I0314 08:46:40.527258 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c521acb7-75ce-466f-90b7-caf5265ed209-webhook-certs\") pod \"openstack-operator-controller-manager-65755f6b77-wdwbk\" (UID: \"c521acb7-75ce-466f-90b7-caf5265ed209\") " pod="openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk" Mar 14 08:46:40 crc kubenswrapper[4886]: I0314 08:46:40.665475 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk" Mar 14 08:46:48 crc kubenswrapper[4886]: I0314 08:46:48.653409 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn"] Mar 14 08:46:48 crc kubenswrapper[4886]: W0314 08:46:48.660020 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c968620_eafc_42fe_b2dc_a86b4fa845d5.slice/crio-ed38dc0ea1716117380f134fefa5bc7fe1777ba6087f6e0306f210a809c1b08b WatchSource:0}: Error finding container ed38dc0ea1716117380f134fefa5bc7fe1777ba6087f6e0306f210a809c1b08b: Status 404 returned error can't find the container with id ed38dc0ea1716117380f134fefa5bc7fe1777ba6087f6e0306f210a809c1b08b Mar 14 08:46:48 crc kubenswrapper[4886]: I0314 08:46:48.704383 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-54dc5b8f8d-nh5lk"] Mar 14 08:46:48 crc kubenswrapper[4886]: W0314 08:46:48.707644 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84978a63_3814_485e_9902_7d041f79179d.slice/crio-963824a37c463be5481fadcc2dd1b82cea37aad372c7a4cb73cce881389335a0 WatchSource:0}: Error finding container 963824a37c463be5481fadcc2dd1b82cea37aad372c7a4cb73cce881389335a0: Status 404 returned error can't find the container with id 963824a37c463be5481fadcc2dd1b82cea37aad372c7a4cb73cce881389335a0 Mar 14 08:46:48 crc kubenswrapper[4886]: I0314 08:46:48.714486 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk"] Mar 14 08:46:48 crc kubenswrapper[4886]: W0314 08:46:48.717274 4886 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc521acb7_75ce_466f_90b7_caf5265ed209.slice/crio-e3658f477893b4f293667668973d1b5dec58169031df50555f6b5dc8ca392214 WatchSource:0}: Error finding container e3658f477893b4f293667668973d1b5dec58169031df50555f6b5dc8ca392214: Status 404 returned error can't find the container with id e3658f477893b4f293667668973d1b5dec58169031df50555f6b5dc8ca392214 Mar 14 08:46:49 crc kubenswrapper[4886]: I0314 08:46:49.518733 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk" event={"ID":"c521acb7-75ce-466f-90b7-caf5265ed209","Type":"ContainerStarted","Data":"e0c4cfbaf87463433363f4b7218bd6392255d02587631ff390ac2a7b2e864165"} Mar 14 08:46:49 crc kubenswrapper[4886]: I0314 08:46:49.518784 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk" event={"ID":"c521acb7-75ce-466f-90b7-caf5265ed209","Type":"ContainerStarted","Data":"e3658f477893b4f293667668973d1b5dec58169031df50555f6b5dc8ca392214"} Mar 14 08:46:49 crc kubenswrapper[4886]: I0314 08:46:49.519087 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk" Mar 14 08:46:49 crc kubenswrapper[4886]: I0314 08:46:49.519900 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn" event={"ID":"7c968620-eafc-42fe-b2dc-a86b4fa845d5","Type":"ContainerStarted","Data":"ed38dc0ea1716117380f134fefa5bc7fe1777ba6087f6e0306f210a809c1b08b"} Mar 14 08:46:49 crc kubenswrapper[4886]: I0314 08:46:49.520885 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-nh5lk" 
event={"ID":"84978a63-3814-485e-9902-7d041f79179d","Type":"ContainerStarted","Data":"963824a37c463be5481fadcc2dd1b82cea37aad372c7a4cb73cce881389335a0"} Mar 14 08:46:49 crc kubenswrapper[4886]: I0314 08:46:49.552752 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk" podStartSLOduration=41.552726672 podStartE2EDuration="41.552726672s" podCreationTimestamp="2026-03-14 08:46:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:46:49.546410791 +0000 UTC m=+1144.794862478" watchObservedRunningTime="2026-03-14 08:46:49.552726672 +0000 UTC m=+1144.801178329" Mar 14 08:46:49 crc kubenswrapper[4886]: E0314 08:46:49.929990 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 14 08:46:49 crc kubenswrapper[4886]: E0314 08:46:49.930506 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w7prp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-2ghpz_openstack-operators(e579f2af-b976-45fe-ac83-a23c0676eaf2): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 08:46:49 crc kubenswrapper[4886]: E0314 08:46:49.931675 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2ghpz" podUID="e579f2af-b976-45fe-ac83-a23c0676eaf2" Mar 14 08:46:50 crc kubenswrapper[4886]: I0314 08:46:50.540319 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-ssmck" event={"ID":"28443df6-3421-46cb-9011-d8b47769fbfa","Type":"ContainerStarted","Data":"ab08055bd2408983fdceee6f04e6b0c93d389da200ee2659db98aa8f575edb7e"} Mar 14 08:46:50 crc kubenswrapper[4886]: I0314 08:46:50.540542 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-ssmck" Mar 14 08:46:50 crc kubenswrapper[4886]: I0314 08:46:50.561906 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-ssmck" podStartSLOduration=2.560231862 podStartE2EDuration="42.56188523s" podCreationTimestamp="2026-03-14 08:46:08 +0000 UTC" firstStartedPulling="2026-03-14 08:46:09.930180486 +0000 UTC m=+1105.178632123" lastFinishedPulling="2026-03-14 08:46:49.931833854 +0000 UTC m=+1145.180285491" observedRunningTime="2026-03-14 08:46:50.55667005 +0000 UTC m=+1145.805121697" watchObservedRunningTime="2026-03-14 08:46:50.56188523 +0000 UTC m=+1145.810336887" Mar 14 08:46:51 crc kubenswrapper[4886]: I0314 08:46:51.548739 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-2knnh" 
event={"ID":"4ac77887-a632-454d-9460-1150a439045a","Type":"ContainerStarted","Data":"4e9200e9ec1f7d5abffbc8c024289c4a0bbdb0fd9d7f41b64a538ba64772ac13"} Mar 14 08:46:51 crc kubenswrapper[4886]: I0314 08:46:51.549226 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-2knnh" Mar 14 08:46:51 crc kubenswrapper[4886]: I0314 08:46:51.566534 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-2knnh" podStartSLOduration=2.673584182 podStartE2EDuration="44.566516576s" podCreationTimestamp="2026-03-14 08:46:07 +0000 UTC" firstStartedPulling="2026-03-14 08:46:08.925084057 +0000 UTC m=+1104.173535694" lastFinishedPulling="2026-03-14 08:46:50.818016441 +0000 UTC m=+1146.066468088" observedRunningTime="2026-03-14 08:46:51.564800207 +0000 UTC m=+1146.813251854" watchObservedRunningTime="2026-03-14 08:46:51.566516576 +0000 UTC m=+1146.814968213" Mar 14 08:46:55 crc kubenswrapper[4886]: I0314 08:46:55.587745 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-nh5lk" event={"ID":"84978a63-3814-485e-9902-7d041f79179d","Type":"ContainerStarted","Data":"588ad410b1c5bb43affc5ecaa58866f5f3e30ff73210dad2a5a7b8827450b3c7"} Mar 14 08:46:55 crc kubenswrapper[4886]: I0314 08:46:55.588396 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-nh5lk" Mar 14 08:46:55 crc kubenswrapper[4886]: I0314 08:46:55.591370 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn" event={"ID":"7c968620-eafc-42fe-b2dc-a86b4fa845d5","Type":"ContainerStarted","Data":"9f8c9565dabbe4792189e433639fcf84998be61307fa9c7489a3b82d8b760698"} Mar 14 08:46:55 crc kubenswrapper[4886]: I0314 
08:46:55.591492 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn" Mar 14 08:46:55 crc kubenswrapper[4886]: I0314 08:46:55.608699 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-nh5lk" podStartSLOduration=42.372462888 podStartE2EDuration="48.608683143s" podCreationTimestamp="2026-03-14 08:46:07 +0000 UTC" firstStartedPulling="2026-03-14 08:46:48.709402405 +0000 UTC m=+1143.957854042" lastFinishedPulling="2026-03-14 08:46:54.94562266 +0000 UTC m=+1150.194074297" observedRunningTime="2026-03-14 08:46:55.605614215 +0000 UTC m=+1150.854065852" watchObservedRunningTime="2026-03-14 08:46:55.608683143 +0000 UTC m=+1150.857134780" Mar 14 08:46:55 crc kubenswrapper[4886]: I0314 08:46:55.637314 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn" podStartSLOduration=42.37671272 podStartE2EDuration="48.637294595s" podCreationTimestamp="2026-03-14 08:46:07 +0000 UTC" firstStartedPulling="2026-03-14 08:46:48.66186766 +0000 UTC m=+1143.910319297" lastFinishedPulling="2026-03-14 08:46:54.922449505 +0000 UTC m=+1150.170901172" observedRunningTime="2026-03-14 08:46:55.634934047 +0000 UTC m=+1150.883385694" watchObservedRunningTime="2026-03-14 08:46:55.637294595 +0000 UTC m=+1150.885746232" Mar 14 08:46:56 crc kubenswrapper[4886]: I0314 08:46:56.067422 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:46:56 crc kubenswrapper[4886]: I0314 08:46:56.067488 4886 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:46:58 crc kubenswrapper[4886]: I0314 08:46:58.036787 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-2knnh" Mar 14 08:46:58 crc kubenswrapper[4886]: I0314 08:46:58.764440 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-ssmck" Mar 14 08:47:00 crc kubenswrapper[4886]: I0314 08:47:00.018625 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-nh5lk" Mar 14 08:47:00 crc kubenswrapper[4886]: I0314 08:47:00.472978 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn" Mar 14 08:47:00 crc kubenswrapper[4886]: I0314 08:47:00.671939 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-65755f6b77-wdwbk" Mar 14 08:47:01 crc kubenswrapper[4886]: E0314 08:47:01.428276 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2ghpz" podUID="e579f2af-b976-45fe-ac83-a23c0676eaf2" Mar 14 08:47:16 crc kubenswrapper[4886]: I0314 08:47:16.752217 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2ghpz" event={"ID":"e579f2af-b976-45fe-ac83-a23c0676eaf2","Type":"ContainerStarted","Data":"2b1d2dacfc3540477ee8773050d867c6f62a9abe71558fd8af27b62fcf664129"} Mar 14 08:47:16 crc kubenswrapper[4886]: I0314 08:47:16.774280 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2ghpz" podStartSLOduration=2.74802283 podStartE2EDuration="1m8.774260464s" podCreationTimestamp="2026-03-14 08:46:08 +0000 UTC" firstStartedPulling="2026-03-14 08:46:09.915286668 +0000 UTC m=+1105.163738305" lastFinishedPulling="2026-03-14 08:47:15.941524302 +0000 UTC m=+1171.189975939" observedRunningTime="2026-03-14 08:47:16.766474171 +0000 UTC m=+1172.014925818" watchObservedRunningTime="2026-03-14 08:47:16.774260464 +0000 UTC m=+1172.022712101" Mar 14 08:47:26 crc kubenswrapper[4886]: I0314 08:47:26.065690 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:47:26 crc kubenswrapper[4886]: I0314 08:47:26.066225 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:47:26 crc kubenswrapper[4886]: I0314 08:47:26.066270 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 08:47:26 crc kubenswrapper[4886]: I0314 08:47:26.066688 4886 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7738a099ca236f81766457fa9d5c5fb3f046ded018935c8fbb545666a40042f4"} pod="openshift-machine-config-operator/machine-config-daemon-ddctv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 08:47:26 crc kubenswrapper[4886]: I0314 08:47:26.066747 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" containerID="cri-o://7738a099ca236f81766457fa9d5c5fb3f046ded018935c8fbb545666a40042f4" gracePeriod=600 Mar 14 08:47:26 crc kubenswrapper[4886]: I0314 08:47:26.835243 4886 generic.go:334] "Generic (PLEG): container finished" podID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerID="7738a099ca236f81766457fa9d5c5fb3f046ded018935c8fbb545666a40042f4" exitCode=0 Mar 14 08:47:26 crc kubenswrapper[4886]: I0314 08:47:26.835293 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerDied","Data":"7738a099ca236f81766457fa9d5c5fb3f046ded018935c8fbb545666a40042f4"} Mar 14 08:47:26 crc kubenswrapper[4886]: I0314 08:47:26.835575 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerStarted","Data":"55e54dc7fbdd549134d3b20dfd9642dda565b3dd8cfe4e3b853534c01d92f8db"} Mar 14 08:47:26 crc kubenswrapper[4886]: I0314 08:47:26.835598 4886 scope.go:117] "RemoveContainer" containerID="c3f165f2b40174eab0175613b347b88b554cd9e063558e15142f42dfea385fce" Mar 14 08:47:35 crc kubenswrapper[4886]: I0314 08:47:35.844599 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6vf97"] Mar 14 08:47:35 crc 
kubenswrapper[4886]: I0314 08:47:35.846065 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6vf97" Mar 14 08:47:35 crc kubenswrapper[4886]: I0314 08:47:35.848848 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 14 08:47:35 crc kubenswrapper[4886]: I0314 08:47:35.849007 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 14 08:47:35 crc kubenswrapper[4886]: I0314 08:47:35.849135 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 14 08:47:35 crc kubenswrapper[4886]: I0314 08:47:35.849802 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-xck2b" Mar 14 08:47:35 crc kubenswrapper[4886]: I0314 08:47:35.873545 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6vf97"] Mar 14 08:47:35 crc kubenswrapper[4886]: I0314 08:47:35.884695 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bda7203-17fd-4894-bb89-b42a36d31466-config\") pod \"dnsmasq-dns-675f4bcbfc-6vf97\" (UID: \"4bda7203-17fd-4894-bb89-b42a36d31466\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6vf97" Mar 14 08:47:35 crc kubenswrapper[4886]: I0314 08:47:35.884787 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqb94\" (UniqueName: \"kubernetes.io/projected/4bda7203-17fd-4894-bb89-b42a36d31466-kube-api-access-cqb94\") pod \"dnsmasq-dns-675f4bcbfc-6vf97\" (UID: \"4bda7203-17fd-4894-bb89-b42a36d31466\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6vf97" Mar 14 08:47:35 crc kubenswrapper[4886]: I0314 08:47:35.985530 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqb94\" (UniqueName: 
\"kubernetes.io/projected/4bda7203-17fd-4894-bb89-b42a36d31466-kube-api-access-cqb94\") pod \"dnsmasq-dns-675f4bcbfc-6vf97\" (UID: \"4bda7203-17fd-4894-bb89-b42a36d31466\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6vf97" Mar 14 08:47:35 crc kubenswrapper[4886]: I0314 08:47:35.985621 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bda7203-17fd-4894-bb89-b42a36d31466-config\") pod \"dnsmasq-dns-675f4bcbfc-6vf97\" (UID: \"4bda7203-17fd-4894-bb89-b42a36d31466\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6vf97" Mar 14 08:47:35 crc kubenswrapper[4886]: I0314 08:47:35.986459 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bda7203-17fd-4894-bb89-b42a36d31466-config\") pod \"dnsmasq-dns-675f4bcbfc-6vf97\" (UID: \"4bda7203-17fd-4894-bb89-b42a36d31466\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6vf97" Mar 14 08:47:36 crc kubenswrapper[4886]: I0314 08:47:36.013176 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7gxsp"] Mar 14 08:47:36 crc kubenswrapper[4886]: I0314 08:47:36.014278 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-7gxsp" Mar 14 08:47:36 crc kubenswrapper[4886]: I0314 08:47:36.019920 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 14 08:47:36 crc kubenswrapper[4886]: I0314 08:47:36.043661 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7gxsp"] Mar 14 08:47:36 crc kubenswrapper[4886]: I0314 08:47:36.047282 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqb94\" (UniqueName: \"kubernetes.io/projected/4bda7203-17fd-4894-bb89-b42a36d31466-kube-api-access-cqb94\") pod \"dnsmasq-dns-675f4bcbfc-6vf97\" (UID: \"4bda7203-17fd-4894-bb89-b42a36d31466\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6vf97" Mar 14 08:47:36 crc kubenswrapper[4886]: I0314 08:47:36.086358 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a06abf4-5dbb-4bf8-930e-c99893c25410-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-7gxsp\" (UID: \"0a06abf4-5dbb-4bf8-930e-c99893c25410\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7gxsp" Mar 14 08:47:36 crc kubenswrapper[4886]: I0314 08:47:36.086437 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5cvq\" (UniqueName: \"kubernetes.io/projected/0a06abf4-5dbb-4bf8-930e-c99893c25410-kube-api-access-l5cvq\") pod \"dnsmasq-dns-78dd6ddcc-7gxsp\" (UID: \"0a06abf4-5dbb-4bf8-930e-c99893c25410\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7gxsp" Mar 14 08:47:36 crc kubenswrapper[4886]: I0314 08:47:36.086503 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a06abf4-5dbb-4bf8-930e-c99893c25410-config\") pod \"dnsmasq-dns-78dd6ddcc-7gxsp\" (UID: \"0a06abf4-5dbb-4bf8-930e-c99893c25410\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7gxsp" Mar 14 08:47:36 
crc kubenswrapper[4886]: I0314 08:47:36.164752 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6vf97" Mar 14 08:47:36 crc kubenswrapper[4886]: I0314 08:47:36.191032 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a06abf4-5dbb-4bf8-930e-c99893c25410-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-7gxsp\" (UID: \"0a06abf4-5dbb-4bf8-930e-c99893c25410\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7gxsp" Mar 14 08:47:36 crc kubenswrapper[4886]: I0314 08:47:36.191136 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5cvq\" (UniqueName: \"kubernetes.io/projected/0a06abf4-5dbb-4bf8-930e-c99893c25410-kube-api-access-l5cvq\") pod \"dnsmasq-dns-78dd6ddcc-7gxsp\" (UID: \"0a06abf4-5dbb-4bf8-930e-c99893c25410\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7gxsp" Mar 14 08:47:36 crc kubenswrapper[4886]: I0314 08:47:36.191173 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a06abf4-5dbb-4bf8-930e-c99893c25410-config\") pod \"dnsmasq-dns-78dd6ddcc-7gxsp\" (UID: \"0a06abf4-5dbb-4bf8-930e-c99893c25410\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7gxsp" Mar 14 08:47:36 crc kubenswrapper[4886]: I0314 08:47:36.191892 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a06abf4-5dbb-4bf8-930e-c99893c25410-config\") pod \"dnsmasq-dns-78dd6ddcc-7gxsp\" (UID: \"0a06abf4-5dbb-4bf8-930e-c99893c25410\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7gxsp" Mar 14 08:47:36 crc kubenswrapper[4886]: I0314 08:47:36.192451 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a06abf4-5dbb-4bf8-930e-c99893c25410-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-7gxsp\" (UID: \"0a06abf4-5dbb-4bf8-930e-c99893c25410\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-7gxsp" Mar 14 08:47:36 crc kubenswrapper[4886]: I0314 08:47:36.210477 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5cvq\" (UniqueName: \"kubernetes.io/projected/0a06abf4-5dbb-4bf8-930e-c99893c25410-kube-api-access-l5cvq\") pod \"dnsmasq-dns-78dd6ddcc-7gxsp\" (UID: \"0a06abf4-5dbb-4bf8-930e-c99893c25410\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7gxsp" Mar 14 08:47:36 crc kubenswrapper[4886]: I0314 08:47:36.358513 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-7gxsp" Mar 14 08:47:36 crc kubenswrapper[4886]: I0314 08:47:36.452885 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6vf97"] Mar 14 08:47:36 crc kubenswrapper[4886]: I0314 08:47:36.767163 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7gxsp"] Mar 14 08:47:36 crc kubenswrapper[4886]: W0314 08:47:36.775664 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a06abf4_5dbb_4bf8_930e_c99893c25410.slice/crio-05802828aba08874b2293edc1cc2ad12ad24eb99622cb77709c7b74d7a73bede WatchSource:0}: Error finding container 05802828aba08874b2293edc1cc2ad12ad24eb99622cb77709c7b74d7a73bede: Status 404 returned error can't find the container with id 05802828aba08874b2293edc1cc2ad12ad24eb99622cb77709c7b74d7a73bede Mar 14 08:47:36 crc kubenswrapper[4886]: I0314 08:47:36.938897 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-7gxsp" event={"ID":"0a06abf4-5dbb-4bf8-930e-c99893c25410","Type":"ContainerStarted","Data":"05802828aba08874b2293edc1cc2ad12ad24eb99622cb77709c7b74d7a73bede"} Mar 14 08:47:36 crc kubenswrapper[4886]: I0314 08:47:36.940535 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-6vf97" 
event={"ID":"4bda7203-17fd-4894-bb89-b42a36d31466","Type":"ContainerStarted","Data":"78393b174f524a547b1dd4a6a49224f62b5d5842d3ae1c07de0871fff41caef7"} Mar 14 08:47:37 crc kubenswrapper[4886]: I0314 08:47:37.974168 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6vf97"] Mar 14 08:47:37 crc kubenswrapper[4886]: I0314 08:47:37.994738 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-wfhhk"] Mar 14 08:47:37 crc kubenswrapper[4886]: I0314 08:47:37.995928 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-wfhhk" Mar 14 08:47:38 crc kubenswrapper[4886]: I0314 08:47:38.009598 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-wfhhk"] Mar 14 08:47:38 crc kubenswrapper[4886]: I0314 08:47:38.133273 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pj78\" (UniqueName: \"kubernetes.io/projected/32d7ee30-d1ca-45c0-a3e7-429463827560-kube-api-access-2pj78\") pod \"dnsmasq-dns-666b6646f7-wfhhk\" (UID: \"32d7ee30-d1ca-45c0-a3e7-429463827560\") " pod="openstack/dnsmasq-dns-666b6646f7-wfhhk" Mar 14 08:47:38 crc kubenswrapper[4886]: I0314 08:47:38.133349 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32d7ee30-d1ca-45c0-a3e7-429463827560-dns-svc\") pod \"dnsmasq-dns-666b6646f7-wfhhk\" (UID: \"32d7ee30-d1ca-45c0-a3e7-429463827560\") " pod="openstack/dnsmasq-dns-666b6646f7-wfhhk" Mar 14 08:47:38 crc kubenswrapper[4886]: I0314 08:47:38.133389 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32d7ee30-d1ca-45c0-a3e7-429463827560-config\") pod \"dnsmasq-dns-666b6646f7-wfhhk\" (UID: \"32d7ee30-d1ca-45c0-a3e7-429463827560\") " 
pod="openstack/dnsmasq-dns-666b6646f7-wfhhk" Mar 14 08:47:38 crc kubenswrapper[4886]: I0314 08:47:38.237139 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pj78\" (UniqueName: \"kubernetes.io/projected/32d7ee30-d1ca-45c0-a3e7-429463827560-kube-api-access-2pj78\") pod \"dnsmasq-dns-666b6646f7-wfhhk\" (UID: \"32d7ee30-d1ca-45c0-a3e7-429463827560\") " pod="openstack/dnsmasq-dns-666b6646f7-wfhhk" Mar 14 08:47:38 crc kubenswrapper[4886]: I0314 08:47:38.237238 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32d7ee30-d1ca-45c0-a3e7-429463827560-dns-svc\") pod \"dnsmasq-dns-666b6646f7-wfhhk\" (UID: \"32d7ee30-d1ca-45c0-a3e7-429463827560\") " pod="openstack/dnsmasq-dns-666b6646f7-wfhhk" Mar 14 08:47:38 crc kubenswrapper[4886]: I0314 08:47:38.237284 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32d7ee30-d1ca-45c0-a3e7-429463827560-config\") pod \"dnsmasq-dns-666b6646f7-wfhhk\" (UID: \"32d7ee30-d1ca-45c0-a3e7-429463827560\") " pod="openstack/dnsmasq-dns-666b6646f7-wfhhk" Mar 14 08:47:38 crc kubenswrapper[4886]: I0314 08:47:38.238197 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32d7ee30-d1ca-45c0-a3e7-429463827560-config\") pod \"dnsmasq-dns-666b6646f7-wfhhk\" (UID: \"32d7ee30-d1ca-45c0-a3e7-429463827560\") " pod="openstack/dnsmasq-dns-666b6646f7-wfhhk" Mar 14 08:47:38 crc kubenswrapper[4886]: I0314 08:47:38.238876 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32d7ee30-d1ca-45c0-a3e7-429463827560-dns-svc\") pod \"dnsmasq-dns-666b6646f7-wfhhk\" (UID: \"32d7ee30-d1ca-45c0-a3e7-429463827560\") " pod="openstack/dnsmasq-dns-666b6646f7-wfhhk" Mar 14 08:47:38 crc kubenswrapper[4886]: I0314 08:47:38.273889 4886 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pj78\" (UniqueName: \"kubernetes.io/projected/32d7ee30-d1ca-45c0-a3e7-429463827560-kube-api-access-2pj78\") pod \"dnsmasq-dns-666b6646f7-wfhhk\" (UID: \"32d7ee30-d1ca-45c0-a3e7-429463827560\") " pod="openstack/dnsmasq-dns-666b6646f7-wfhhk" Mar 14 08:47:38 crc kubenswrapper[4886]: I0314 08:47:38.334946 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-wfhhk" Mar 14 08:47:38 crc kubenswrapper[4886]: I0314 08:47:38.662854 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7gxsp"] Mar 14 08:47:38 crc kubenswrapper[4886]: I0314 08:47:38.726111 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8wm2x"] Mar 14 08:47:38 crc kubenswrapper[4886]: I0314 08:47:38.729987 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8wm2x" Mar 14 08:47:38 crc kubenswrapper[4886]: I0314 08:47:38.744657 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8wm2x"] Mar 14 08:47:38 crc kubenswrapper[4886]: I0314 08:47:38.748849 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjlwl\" (UniqueName: \"kubernetes.io/projected/103e92b6-e1e8-4a10-8cbd-76d94038132d-kube-api-access-fjlwl\") pod \"dnsmasq-dns-57d769cc4f-8wm2x\" (UID: \"103e92b6-e1e8-4a10-8cbd-76d94038132d\") " pod="openstack/dnsmasq-dns-57d769cc4f-8wm2x" Mar 14 08:47:38 crc kubenswrapper[4886]: I0314 08:47:38.748982 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/103e92b6-e1e8-4a10-8cbd-76d94038132d-config\") pod \"dnsmasq-dns-57d769cc4f-8wm2x\" (UID: \"103e92b6-e1e8-4a10-8cbd-76d94038132d\") " pod="openstack/dnsmasq-dns-57d769cc4f-8wm2x" 
Mar 14 08:47:38 crc kubenswrapper[4886]: I0314 08:47:38.749074 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/103e92b6-e1e8-4a10-8cbd-76d94038132d-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-8wm2x\" (UID: \"103e92b6-e1e8-4a10-8cbd-76d94038132d\") " pod="openstack/dnsmasq-dns-57d769cc4f-8wm2x" Mar 14 08:47:38 crc kubenswrapper[4886]: I0314 08:47:38.851665 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjlwl\" (UniqueName: \"kubernetes.io/projected/103e92b6-e1e8-4a10-8cbd-76d94038132d-kube-api-access-fjlwl\") pod \"dnsmasq-dns-57d769cc4f-8wm2x\" (UID: \"103e92b6-e1e8-4a10-8cbd-76d94038132d\") " pod="openstack/dnsmasq-dns-57d769cc4f-8wm2x" Mar 14 08:47:38 crc kubenswrapper[4886]: I0314 08:47:38.851707 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/103e92b6-e1e8-4a10-8cbd-76d94038132d-config\") pod \"dnsmasq-dns-57d769cc4f-8wm2x\" (UID: \"103e92b6-e1e8-4a10-8cbd-76d94038132d\") " pod="openstack/dnsmasq-dns-57d769cc4f-8wm2x" Mar 14 08:47:38 crc kubenswrapper[4886]: I0314 08:47:38.851745 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/103e92b6-e1e8-4a10-8cbd-76d94038132d-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-8wm2x\" (UID: \"103e92b6-e1e8-4a10-8cbd-76d94038132d\") " pod="openstack/dnsmasq-dns-57d769cc4f-8wm2x" Mar 14 08:47:38 crc kubenswrapper[4886]: I0314 08:47:38.852606 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/103e92b6-e1e8-4a10-8cbd-76d94038132d-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-8wm2x\" (UID: \"103e92b6-e1e8-4a10-8cbd-76d94038132d\") " pod="openstack/dnsmasq-dns-57d769cc4f-8wm2x" Mar 14 08:47:38 crc kubenswrapper[4886]: I0314 08:47:38.856571 4886 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/103e92b6-e1e8-4a10-8cbd-76d94038132d-config\") pod \"dnsmasq-dns-57d769cc4f-8wm2x\" (UID: \"103e92b6-e1e8-4a10-8cbd-76d94038132d\") " pod="openstack/dnsmasq-dns-57d769cc4f-8wm2x" Mar 14 08:47:38 crc kubenswrapper[4886]: I0314 08:47:38.877552 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjlwl\" (UniqueName: \"kubernetes.io/projected/103e92b6-e1e8-4a10-8cbd-76d94038132d-kube-api-access-fjlwl\") pod \"dnsmasq-dns-57d769cc4f-8wm2x\" (UID: \"103e92b6-e1e8-4a10-8cbd-76d94038132d\") " pod="openstack/dnsmasq-dns-57d769cc4f-8wm2x" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.067288 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8wm2x" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.127482 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.128953 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.133211 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.133433 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.133531 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.135386 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-qt8jt" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.135763 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.135848 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.137015 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.146475 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-wfhhk"] Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.153082 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.166307 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c08d9078-9b3a-492a-92db-3096453d49f8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 
08:47:39.166392 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c08d9078-9b3a-492a-92db-3096453d49f8-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.166414 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c08d9078-9b3a-492a-92db-3096453d49f8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.166455 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.166607 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c08d9078-9b3a-492a-92db-3096453d49f8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.166730 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c08d9078-9b3a-492a-92db-3096453d49f8-config-data\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.166765 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c08d9078-9b3a-492a-92db-3096453d49f8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: W0314 08:47:39.166767 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32d7ee30_d1ca_45c0_a3e7_429463827560.slice/crio-a980fc4846653ea2e0cdd5afdb770eff49e4fb8dbcbf2b60081cd5f50e106f97 WatchSource:0}: Error finding container a980fc4846653ea2e0cdd5afdb770eff49e4fb8dbcbf2b60081cd5f50e106f97: Status 404 returned error can't find the container with id a980fc4846653ea2e0cdd5afdb770eff49e4fb8dbcbf2b60081cd5f50e106f97 Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.166802 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c08d9078-9b3a-492a-92db-3096453d49f8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.166862 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c08d9078-9b3a-492a-92db-3096453d49f8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.166880 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plz57\" (UniqueName: \"kubernetes.io/projected/c08d9078-9b3a-492a-92db-3096453d49f8-kube-api-access-plz57\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 
08:47:39.166918 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c08d9078-9b3a-492a-92db-3096453d49f8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.269161 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c08d9078-9b3a-492a-92db-3096453d49f8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.269213 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plz57\" (UniqueName: \"kubernetes.io/projected/c08d9078-9b3a-492a-92db-3096453d49f8-kube-api-access-plz57\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.269254 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c08d9078-9b3a-492a-92db-3096453d49f8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.269313 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c08d9078-9b3a-492a-92db-3096453d49f8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.269342 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" 
(UniqueName: \"kubernetes.io/projected/c08d9078-9b3a-492a-92db-3096453d49f8-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.269362 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c08d9078-9b3a-492a-92db-3096453d49f8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.269393 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.269423 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c08d9078-9b3a-492a-92db-3096453d49f8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.269455 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c08d9078-9b3a-492a-92db-3096453d49f8-config-data\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.269479 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c08d9078-9b3a-492a-92db-3096453d49f8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" 
Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.269508 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c08d9078-9b3a-492a-92db-3096453d49f8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.271068 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c08d9078-9b3a-492a-92db-3096453d49f8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.271401 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.271554 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c08d9078-9b3a-492a-92db-3096453d49f8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.271965 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c08d9078-9b3a-492a-92db-3096453d49f8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.271987 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/c08d9078-9b3a-492a-92db-3096453d49f8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.272465 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c08d9078-9b3a-492a-92db-3096453d49f8-config-data\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.277993 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c08d9078-9b3a-492a-92db-3096453d49f8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.278102 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c08d9078-9b3a-492a-92db-3096453d49f8-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.295586 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plz57\" (UniqueName: \"kubernetes.io/projected/c08d9078-9b3a-492a-92db-3096453d49f8-kube-api-access-plz57\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.308553 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c08d9078-9b3a-492a-92db-3096453d49f8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " 
pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.318498 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.325919 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c08d9078-9b3a-492a-92db-3096453d49f8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") " pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.505859 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.637555 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8wm2x"] Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.860726 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.862686 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.865285 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-n4g2l" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.865554 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.865990 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.866010 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.866497 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.869577 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.871284 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.872654 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.994187 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/68bf3729-3dcf-4881-814b-b6af3060336e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.994496 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-b6hdj\" (UniqueName: \"kubernetes.io/projected/68bf3729-3dcf-4881-814b-b6af3060336e-kube-api-access-b6hdj\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.994524 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.994562 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/68bf3729-3dcf-4881-814b-b6af3060336e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.994577 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/68bf3729-3dcf-4881-814b-b6af3060336e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.994600 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/68bf3729-3dcf-4881-814b-b6af3060336e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.994620 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/68bf3729-3dcf-4881-814b-b6af3060336e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.994684 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/68bf3729-3dcf-4881-814b-b6af3060336e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.994703 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/68bf3729-3dcf-4881-814b-b6af3060336e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.994741 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/68bf3729-3dcf-4881-814b-b6af3060336e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:39 crc kubenswrapper[4886]: I0314 08:47:39.994757 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/68bf3729-3dcf-4881-814b-b6af3060336e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:40 crc kubenswrapper[4886]: I0314 08:47:40.001372 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-wfhhk" 
event={"ID":"32d7ee30-d1ca-45c0-a3e7-429463827560","Type":"ContainerStarted","Data":"a980fc4846653ea2e0cdd5afdb770eff49e4fb8dbcbf2b60081cd5f50e106f97"} Mar 14 08:47:40 crc kubenswrapper[4886]: I0314 08:47:40.003362 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8wm2x" event={"ID":"103e92b6-e1e8-4a10-8cbd-76d94038132d","Type":"ContainerStarted","Data":"47ae459730dcebcbc22e34f6f97288415c2cf85e365e13692771930e36537556"} Mar 14 08:47:40 crc kubenswrapper[4886]: I0314 08:47:40.079245 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 08:47:40 crc kubenswrapper[4886]: I0314 08:47:40.095859 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/68bf3729-3dcf-4881-814b-b6af3060336e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:40 crc kubenswrapper[4886]: I0314 08:47:40.095943 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/68bf3729-3dcf-4881-814b-b6af3060336e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:40 crc kubenswrapper[4886]: I0314 08:47:40.095984 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/68bf3729-3dcf-4881-814b-b6af3060336e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:40 crc kubenswrapper[4886]: I0314 08:47:40.096004 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6hdj\" (UniqueName: 
\"kubernetes.io/projected/68bf3729-3dcf-4881-814b-b6af3060336e-kube-api-access-b6hdj\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:40 crc kubenswrapper[4886]: I0314 08:47:40.096035 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:40 crc kubenswrapper[4886]: I0314 08:47:40.096077 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/68bf3729-3dcf-4881-814b-b6af3060336e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:40 crc kubenswrapper[4886]: I0314 08:47:40.096102 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/68bf3729-3dcf-4881-814b-b6af3060336e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:40 crc kubenswrapper[4886]: I0314 08:47:40.096145 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/68bf3729-3dcf-4881-814b-b6af3060336e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:40 crc kubenswrapper[4886]: I0314 08:47:40.096173 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/68bf3729-3dcf-4881-814b-b6af3060336e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:40 crc kubenswrapper[4886]: I0314 08:47:40.096238 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/68bf3729-3dcf-4881-814b-b6af3060336e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:40 crc kubenswrapper[4886]: I0314 08:47:40.096263 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/68bf3729-3dcf-4881-814b-b6af3060336e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:40 crc kubenswrapper[4886]: I0314 08:47:40.097259 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/68bf3729-3dcf-4881-814b-b6af3060336e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:40 crc kubenswrapper[4886]: I0314 08:47:40.099175 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/68bf3729-3dcf-4881-814b-b6af3060336e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:40 crc kubenswrapper[4886]: I0314 08:47:40.099499 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/68bf3729-3dcf-4881-814b-b6af3060336e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:40 crc kubenswrapper[4886]: I0314 
08:47:40.104354 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:40 crc kubenswrapper[4886]: I0314 08:47:40.106874 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/68bf3729-3dcf-4881-814b-b6af3060336e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:40 crc kubenswrapper[4886]: I0314 08:47:40.107631 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/68bf3729-3dcf-4881-814b-b6af3060336e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:40 crc kubenswrapper[4886]: I0314 08:47:40.107747 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/68bf3729-3dcf-4881-814b-b6af3060336e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:40 crc kubenswrapper[4886]: I0314 08:47:40.109066 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/68bf3729-3dcf-4881-814b-b6af3060336e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:40 crc kubenswrapper[4886]: I0314 08:47:40.117456 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" 
(UniqueName: \"kubernetes.io/projected/68bf3729-3dcf-4881-814b-b6af3060336e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:40 crc kubenswrapper[4886]: I0314 08:47:40.118161 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/68bf3729-3dcf-4881-814b-b6af3060336e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:40 crc kubenswrapper[4886]: I0314 08:47:40.120556 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6hdj\" (UniqueName: \"kubernetes.io/projected/68bf3729-3dcf-4881-814b-b6af3060336e-kube-api-access-b6hdj\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:40 crc kubenswrapper[4886]: I0314 08:47:40.129095 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:40 crc kubenswrapper[4886]: I0314 08:47:40.195920 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:47:40 crc kubenswrapper[4886]: I0314 08:47:40.728313 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.020990 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.023047 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.033925 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.035688 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-g6sn4" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.036212 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.036681 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.037297 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.043852 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.080157 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c08d9078-9b3a-492a-92db-3096453d49f8","Type":"ContainerStarted","Data":"1c6276ee82d2427410054dd2bd2baf87d6a096eb614c1c7e7715844ad1c519f3"} Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.082566 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"68bf3729-3dcf-4881-814b-b6af3060336e","Type":"ContainerStarted","Data":"ccd29521a8afa6a332dbc78c9335df5371474150d3df80c1c2177a8d8f9800ec"} Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.113440 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b\") 
" pod="openstack/openstack-galera-0" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.113481 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b-config-data-default\") pod \"openstack-galera-0\" (UID: \"0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b\") " pod="openstack/openstack-galera-0" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.113508 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b\") " pod="openstack/openstack-galera-0" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.113535 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b-kolla-config\") pod \"openstack-galera-0\" (UID: \"0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b\") " pod="openstack/openstack-galera-0" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.114046 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b\") " pod="openstack/openstack-galera-0" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.114183 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b\") " 
pod="openstack/openstack-galera-0" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.114251 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nxt7\" (UniqueName: \"kubernetes.io/projected/0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b-kube-api-access-6nxt7\") pod \"openstack-galera-0\" (UID: \"0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b\") " pod="openstack/openstack-galera-0" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.114487 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b\") " pod="openstack/openstack-galera-0" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.216097 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b\") " pod="openstack/openstack-galera-0" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.216198 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b\") " pod="openstack/openstack-galera-0" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.216218 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b-config-data-default\") pod \"openstack-galera-0\" (UID: \"0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b\") " pod="openstack/openstack-galera-0" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.216241 4886 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b\") " pod="openstack/openstack-galera-0" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.216264 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b-kolla-config\") pod \"openstack-galera-0\" (UID: \"0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b\") " pod="openstack/openstack-galera-0" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.216311 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b\") " pod="openstack/openstack-galera-0" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.216330 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b\") " pod="openstack/openstack-galera-0" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.216350 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nxt7\" (UniqueName: \"kubernetes.io/projected/0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b-kube-api-access-6nxt7\") pod \"openstack-galera-0\" (UID: \"0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b\") " pod="openstack/openstack-galera-0" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.216952 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b\") " pod="openstack/openstack-galera-0" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.217173 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-galera-0" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.217220 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b-kolla-config\") pod \"openstack-galera-0\" (UID: \"0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b\") " pod="openstack/openstack-galera-0" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.217477 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b-config-data-default\") pod \"openstack-galera-0\" (UID: \"0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b\") " pod="openstack/openstack-galera-0" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.218291 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b\") " pod="openstack/openstack-galera-0" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.241337 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: 
\"0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b\") " pod="openstack/openstack-galera-0" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.243454 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b\") " pod="openstack/openstack-galera-0" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.243839 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nxt7\" (UniqueName: \"kubernetes.io/projected/0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b-kube-api-access-6nxt7\") pod \"openstack-galera-0\" (UID: \"0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b\") " pod="openstack/openstack-galera-0" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.257136 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b\") " pod="openstack/openstack-galera-0" Mar 14 08:47:41 crc kubenswrapper[4886]: I0314 08:47:41.418573 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.531217 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.535107 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.544937 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-7rzl6" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.545076 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.545258 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.545384 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.561098 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.594857 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.596323 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.599037 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.599415 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.609357 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-zmlmz" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.611386 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.653215 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnj7p\" (UniqueName: \"kubernetes.io/projected/211eef94-8537-4eaa-aae0-58b9697c7fac-kube-api-access-qnj7p\") pod \"openstack-cell1-galera-0\" (UID: \"211eef94-8537-4eaa-aae0-58b9697c7fac\") " pod="openstack/openstack-cell1-galera-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.653266 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/211eef94-8537-4eaa-aae0-58b9697c7fac-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"211eef94-8537-4eaa-aae0-58b9697c7fac\") " pod="openstack/openstack-cell1-galera-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.653298 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/211eef94-8537-4eaa-aae0-58b9697c7fac-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"211eef94-8537-4eaa-aae0-58b9697c7fac\") " pod="openstack/openstack-cell1-galera-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.653322 4886 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/211eef94-8537-4eaa-aae0-58b9697c7fac-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"211eef94-8537-4eaa-aae0-58b9697c7fac\") " pod="openstack/openstack-cell1-galera-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.653339 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/211eef94-8537-4eaa-aae0-58b9697c7fac-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"211eef94-8537-4eaa-aae0-58b9697c7fac\") " pod="openstack/openstack-cell1-galera-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.653382 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/211eef94-8537-4eaa-aae0-58b9697c7fac-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"211eef94-8537-4eaa-aae0-58b9697c7fac\") " pod="openstack/openstack-cell1-galera-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.653417 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/211eef94-8537-4eaa-aae0-58b9697c7fac-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"211eef94-8537-4eaa-aae0-58b9697c7fac\") " pod="openstack/openstack-cell1-galera-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.653435 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"211eef94-8537-4eaa-aae0-58b9697c7fac\") " pod="openstack/openstack-cell1-galera-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.754588 4886 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6frt2\" (UniqueName: \"kubernetes.io/projected/17dec532-a4ac-466d-9b81-29a3e27c33bb-kube-api-access-6frt2\") pod \"memcached-0\" (UID: \"17dec532-a4ac-466d-9b81-29a3e27c33bb\") " pod="openstack/memcached-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.754650 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/211eef94-8537-4eaa-aae0-58b9697c7fac-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"211eef94-8537-4eaa-aae0-58b9697c7fac\") " pod="openstack/openstack-cell1-galera-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.754671 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"211eef94-8537-4eaa-aae0-58b9697c7fac\") " pod="openstack/openstack-cell1-galera-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.754694 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/17dec532-a4ac-466d-9b81-29a3e27c33bb-kolla-config\") pod \"memcached-0\" (UID: \"17dec532-a4ac-466d-9b81-29a3e27c33bb\") " pod="openstack/memcached-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.754783 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnj7p\" (UniqueName: \"kubernetes.io/projected/211eef94-8537-4eaa-aae0-58b9697c7fac-kube-api-access-qnj7p\") pod \"openstack-cell1-galera-0\" (UID: \"211eef94-8537-4eaa-aae0-58b9697c7fac\") " pod="openstack/openstack-cell1-galera-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.754819 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/211eef94-8537-4eaa-aae0-58b9697c7fac-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"211eef94-8537-4eaa-aae0-58b9697c7fac\") " pod="openstack/openstack-cell1-galera-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.754847 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/211eef94-8537-4eaa-aae0-58b9697c7fac-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"211eef94-8537-4eaa-aae0-58b9697c7fac\") " pod="openstack/openstack-cell1-galera-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.754869 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/17dec532-a4ac-466d-9b81-29a3e27c33bb-memcached-tls-certs\") pod \"memcached-0\" (UID: \"17dec532-a4ac-466d-9b81-29a3e27c33bb\") " pod="openstack/memcached-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.754891 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/211eef94-8537-4eaa-aae0-58b9697c7fac-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"211eef94-8537-4eaa-aae0-58b9697c7fac\") " pod="openstack/openstack-cell1-galera-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.754933 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/211eef94-8537-4eaa-aae0-58b9697c7fac-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"211eef94-8537-4eaa-aae0-58b9697c7fac\") " pod="openstack/openstack-cell1-galera-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.754961 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/17dec532-a4ac-466d-9b81-29a3e27c33bb-config-data\") pod \"memcached-0\" (UID: \"17dec532-a4ac-466d-9b81-29a3e27c33bb\") " pod="openstack/memcached-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.755061 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17dec532-a4ac-466d-9b81-29a3e27c33bb-combined-ca-bundle\") pod \"memcached-0\" (UID: \"17dec532-a4ac-466d-9b81-29a3e27c33bb\") " pod="openstack/memcached-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.755098 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/211eef94-8537-4eaa-aae0-58b9697c7fac-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"211eef94-8537-4eaa-aae0-58b9697c7fac\") " pod="openstack/openstack-cell1-galera-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.756752 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/211eef94-8537-4eaa-aae0-58b9697c7fac-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"211eef94-8537-4eaa-aae0-58b9697c7fac\") " pod="openstack/openstack-cell1-galera-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.757813 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/211eef94-8537-4eaa-aae0-58b9697c7fac-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"211eef94-8537-4eaa-aae0-58b9697c7fac\") " pod="openstack/openstack-cell1-galera-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.758142 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: 
\"211eef94-8537-4eaa-aae0-58b9697c7fac\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-cell1-galera-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.759000 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/211eef94-8537-4eaa-aae0-58b9697c7fac-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"211eef94-8537-4eaa-aae0-58b9697c7fac\") " pod="openstack/openstack-cell1-galera-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.759161 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/211eef94-8537-4eaa-aae0-58b9697c7fac-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"211eef94-8537-4eaa-aae0-58b9697c7fac\") " pod="openstack/openstack-cell1-galera-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.761353 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/211eef94-8537-4eaa-aae0-58b9697c7fac-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"211eef94-8537-4eaa-aae0-58b9697c7fac\") " pod="openstack/openstack-cell1-galera-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.762179 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/211eef94-8537-4eaa-aae0-58b9697c7fac-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"211eef94-8537-4eaa-aae0-58b9697c7fac\") " pod="openstack/openstack-cell1-galera-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.783744 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnj7p\" (UniqueName: \"kubernetes.io/projected/211eef94-8537-4eaa-aae0-58b9697c7fac-kube-api-access-qnj7p\") pod \"openstack-cell1-galera-0\" (UID: \"211eef94-8537-4eaa-aae0-58b9697c7fac\") " 
pod="openstack/openstack-cell1-galera-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.783904 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"211eef94-8537-4eaa-aae0-58b9697c7fac\") " pod="openstack/openstack-cell1-galera-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.856685 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/17dec532-a4ac-466d-9b81-29a3e27c33bb-memcached-tls-certs\") pod \"memcached-0\" (UID: \"17dec532-a4ac-466d-9b81-29a3e27c33bb\") " pod="openstack/memcached-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.856795 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17dec532-a4ac-466d-9b81-29a3e27c33bb-config-data\") pod \"memcached-0\" (UID: \"17dec532-a4ac-466d-9b81-29a3e27c33bb\") " pod="openstack/memcached-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.856820 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17dec532-a4ac-466d-9b81-29a3e27c33bb-combined-ca-bundle\") pod \"memcached-0\" (UID: \"17dec532-a4ac-466d-9b81-29a3e27c33bb\") " pod="openstack/memcached-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.856921 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6frt2\" (UniqueName: \"kubernetes.io/projected/17dec532-a4ac-466d-9b81-29a3e27c33bb-kube-api-access-6frt2\") pod \"memcached-0\" (UID: \"17dec532-a4ac-466d-9b81-29a3e27c33bb\") " pod="openstack/memcached-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.856997 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/17dec532-a4ac-466d-9b81-29a3e27c33bb-kolla-config\") pod \"memcached-0\" (UID: \"17dec532-a4ac-466d-9b81-29a3e27c33bb\") " pod="openstack/memcached-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.857911 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17dec532-a4ac-466d-9b81-29a3e27c33bb-config-data\") pod \"memcached-0\" (UID: \"17dec532-a4ac-466d-9b81-29a3e27c33bb\") " pod="openstack/memcached-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.858468 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/17dec532-a4ac-466d-9b81-29a3e27c33bb-kolla-config\") pod \"memcached-0\" (UID: \"17dec532-a4ac-466d-9b81-29a3e27c33bb\") " pod="openstack/memcached-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.860700 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17dec532-a4ac-466d-9b81-29a3e27c33bb-combined-ca-bundle\") pod \"memcached-0\" (UID: \"17dec532-a4ac-466d-9b81-29a3e27c33bb\") " pod="openstack/memcached-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.864678 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.870024 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/17dec532-a4ac-466d-9b81-29a3e27c33bb-memcached-tls-certs\") pod \"memcached-0\" (UID: \"17dec532-a4ac-466d-9b81-29a3e27c33bb\") " pod="openstack/memcached-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.873167 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6frt2\" (UniqueName: \"kubernetes.io/projected/17dec532-a4ac-466d-9b81-29a3e27c33bb-kube-api-access-6frt2\") pod \"memcached-0\" (UID: \"17dec532-a4ac-466d-9b81-29a3e27c33bb\") " pod="openstack/memcached-0" Mar 14 08:47:42 crc kubenswrapper[4886]: I0314 08:47:42.916277 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 14 08:47:44 crc kubenswrapper[4886]: I0314 08:47:44.920245 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 08:47:44 crc kubenswrapper[4886]: I0314 08:47:44.921599 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 08:47:44 crc kubenswrapper[4886]: I0314 08:47:44.925464 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-jwb8j" Mar 14 08:47:44 crc kubenswrapper[4886]: I0314 08:47:44.932102 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 08:47:45 crc kubenswrapper[4886]: I0314 08:47:45.101207 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjjqg\" (UniqueName: \"kubernetes.io/projected/9dafb8d0-a7d6-4426-a52d-19408d20b8b3-kube-api-access-kjjqg\") pod \"kube-state-metrics-0\" (UID: \"9dafb8d0-a7d6-4426-a52d-19408d20b8b3\") " pod="openstack/kube-state-metrics-0" Mar 14 08:47:45 crc kubenswrapper[4886]: I0314 08:47:45.202340 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjjqg\" (UniqueName: \"kubernetes.io/projected/9dafb8d0-a7d6-4426-a52d-19408d20b8b3-kube-api-access-kjjqg\") pod \"kube-state-metrics-0\" (UID: \"9dafb8d0-a7d6-4426-a52d-19408d20b8b3\") " pod="openstack/kube-state-metrics-0" Mar 14 08:47:45 crc kubenswrapper[4886]: I0314 08:47:45.225608 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjjqg\" (UniqueName: \"kubernetes.io/projected/9dafb8d0-a7d6-4426-a52d-19408d20b8b3-kube-api-access-kjjqg\") pod \"kube-state-metrics-0\" (UID: \"9dafb8d0-a7d6-4426-a52d-19408d20b8b3\") " pod="openstack/kube-state-metrics-0" Mar 14 08:47:45 crc kubenswrapper[4886]: I0314 08:47:45.243267 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.257482 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.259464 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.261887 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.262910 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.263080 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.263262 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.263396 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.263515 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-nz9cv" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.263618 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.271926 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.291830 4886 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.421621 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/28d6b363-8881-407e-b8e4-9fd7863b881c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.421665 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.421695 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/28d6b363-8881-407e-b8e4-9fd7863b881c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.421718 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/28d6b363-8881-407e-b8e4-9fd7863b881c-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.421867 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" 
(UniqueName: \"kubernetes.io/secret/28d6b363-8881-407e-b8e4-9fd7863b881c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.422062 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/28d6b363-8881-407e-b8e4-9fd7863b881c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.422181 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/28d6b363-8881-407e-b8e4-9fd7863b881c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.422215 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqxxb\" (UniqueName: \"kubernetes.io/projected/28d6b363-8881-407e-b8e4-9fd7863b881c-kube-api-access-kqxxb\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.422354 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/28d6b363-8881-407e-b8e4-9fd7863b881c-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.422390 4886 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/28d6b363-8881-407e-b8e4-9fd7863b881c-config\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.523646 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/28d6b363-8881-407e-b8e4-9fd7863b881c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.523718 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/28d6b363-8881-407e-b8e4-9fd7863b881c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.523744 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqxxb\" (UniqueName: \"kubernetes.io/projected/28d6b363-8881-407e-b8e4-9fd7863b881c-kube-api-access-kqxxb\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.523793 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/28d6b363-8881-407e-b8e4-9fd7863b881c-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: 
I0314 08:47:46.523835 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/28d6b363-8881-407e-b8e4-9fd7863b881c-config\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.523859 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/28d6b363-8881-407e-b8e4-9fd7863b881c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.523878 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.523924 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/28d6b363-8881-407e-b8e4-9fd7863b881c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.523964 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/28d6b363-8881-407e-b8e4-9fd7863b881c-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " 
pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.524006 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/28d6b363-8881-407e-b8e4-9fd7863b881c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.524881 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/28d6b363-8881-407e-b8e4-9fd7863b881c-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.525239 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/28d6b363-8881-407e-b8e4-9fd7863b881c-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.525723 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/28d6b363-8881-407e-b8e4-9fd7863b881c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.528228 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/28d6b363-8881-407e-b8e4-9fd7863b881c-thanos-prometheus-http-client-file\") pod 
\"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.528405 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/28d6b363-8881-407e-b8e4-9fd7863b881c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.529060 4886 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.529087 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/915b9721c137ba3e3acc5e7d0fcf048ab5161bf9eea8563b49a63a650ee09ff7/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.530205 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/28d6b363-8881-407e-b8e4-9fd7863b881c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.530420 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/28d6b363-8881-407e-b8e4-9fd7863b881c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " 
pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.533949 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/28d6b363-8881-407e-b8e4-9fd7863b881c-config\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.540915 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqxxb\" (UniqueName: \"kubernetes.io/projected/28d6b363-8881-407e-b8e4-9fd7863b881c-kube-api-access-kqxxb\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.581504 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\") pod \"prometheus-metric-storage-0\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:46 crc kubenswrapper[4886]: I0314 08:47:46.585561 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.111850 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9k99l"] Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.113450 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9k99l" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.116799 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.117026 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-xb2b8" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.117330 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.122791 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-slbpm"] Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.124989 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-slbpm" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.138469 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9k99l"] Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.169557 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dee9c638-1703-4b56-b366-13c6746d035c-combined-ca-bundle\") pod \"ovn-controller-9k99l\" (UID: \"dee9c638-1703-4b56-b366-13c6746d035c\") " pod="openstack/ovn-controller-9k99l" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.169630 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/dee9c638-1703-4b56-b366-13c6746d035c-ovn-controller-tls-certs\") pod \"ovn-controller-9k99l\" (UID: \"dee9c638-1703-4b56-b366-13c6746d035c\") " pod="openstack/ovn-controller-9k99l" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.169659 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmxq8\" (UniqueName: \"kubernetes.io/projected/fa0129a2-aafc-4df4-9376-217bc5b6ee9c-kube-api-access-jmxq8\") pod \"ovn-controller-ovs-slbpm\" (UID: \"fa0129a2-aafc-4df4-9376-217bc5b6ee9c\") " pod="openstack/ovn-controller-ovs-slbpm" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.169684 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dee9c638-1703-4b56-b366-13c6746d035c-var-run\") pod \"ovn-controller-9k99l\" (UID: \"dee9c638-1703-4b56-b366-13c6746d035c\") " pod="openstack/ovn-controller-9k99l" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.169731 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/fa0129a2-aafc-4df4-9376-217bc5b6ee9c-var-lib\") pod \"ovn-controller-ovs-slbpm\" (UID: \"fa0129a2-aafc-4df4-9376-217bc5b6ee9c\") " pod="openstack/ovn-controller-ovs-slbpm" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.169767 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa0129a2-aafc-4df4-9376-217bc5b6ee9c-scripts\") pod \"ovn-controller-ovs-slbpm\" (UID: \"fa0129a2-aafc-4df4-9376-217bc5b6ee9c\") " pod="openstack/ovn-controller-ovs-slbpm" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.169796 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fa0129a2-aafc-4df4-9376-217bc5b6ee9c-var-run\") pod \"ovn-controller-ovs-slbpm\" (UID: \"fa0129a2-aafc-4df4-9376-217bc5b6ee9c\") " pod="openstack/ovn-controller-ovs-slbpm" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.169825 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dee9c638-1703-4b56-b366-13c6746d035c-scripts\") pod \"ovn-controller-9k99l\" (UID: \"dee9c638-1703-4b56-b366-13c6746d035c\") " pod="openstack/ovn-controller-9k99l" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.169857 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dee9c638-1703-4b56-b366-13c6746d035c-var-run-ovn\") pod \"ovn-controller-9k99l\" (UID: \"dee9c638-1703-4b56-b366-13c6746d035c\") " pod="openstack/ovn-controller-9k99l" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.169888 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/fa0129a2-aafc-4df4-9376-217bc5b6ee9c-var-log\") pod \"ovn-controller-ovs-slbpm\" (UID: \"fa0129a2-aafc-4df4-9376-217bc5b6ee9c\") " pod="openstack/ovn-controller-ovs-slbpm" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.169963 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtd8q\" (UniqueName: \"kubernetes.io/projected/dee9c638-1703-4b56-b366-13c6746d035c-kube-api-access-rtd8q\") pod \"ovn-controller-9k99l\" (UID: \"dee9c638-1703-4b56-b366-13c6746d035c\") " pod="openstack/ovn-controller-9k99l" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.170015 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/fa0129a2-aafc-4df4-9376-217bc5b6ee9c-etc-ovs\") pod \"ovn-controller-ovs-slbpm\" (UID: \"fa0129a2-aafc-4df4-9376-217bc5b6ee9c\") " pod="openstack/ovn-controller-ovs-slbpm" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.170078 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/dee9c638-1703-4b56-b366-13c6746d035c-var-log-ovn\") pod \"ovn-controller-9k99l\" (UID: \"dee9c638-1703-4b56-b366-13c6746d035c\") " pod="openstack/ovn-controller-9k99l" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.182543 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-slbpm"] Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.271567 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dee9c638-1703-4b56-b366-13c6746d035c-var-run\") pod \"ovn-controller-9k99l\" (UID: \"dee9c638-1703-4b56-b366-13c6746d035c\") " pod="openstack/ovn-controller-9k99l" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.271630 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/fa0129a2-aafc-4df4-9376-217bc5b6ee9c-var-lib\") pod \"ovn-controller-ovs-slbpm\" (UID: \"fa0129a2-aafc-4df4-9376-217bc5b6ee9c\") " pod="openstack/ovn-controller-ovs-slbpm" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.271655 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa0129a2-aafc-4df4-9376-217bc5b6ee9c-scripts\") pod \"ovn-controller-ovs-slbpm\" (UID: \"fa0129a2-aafc-4df4-9376-217bc5b6ee9c\") " pod="openstack/ovn-controller-ovs-slbpm" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.271672 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fa0129a2-aafc-4df4-9376-217bc5b6ee9c-var-run\") pod \"ovn-controller-ovs-slbpm\" (UID: \"fa0129a2-aafc-4df4-9376-217bc5b6ee9c\") " pod="openstack/ovn-controller-ovs-slbpm" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.271698 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/dee9c638-1703-4b56-b366-13c6746d035c-scripts\") pod \"ovn-controller-9k99l\" (UID: \"dee9c638-1703-4b56-b366-13c6746d035c\") " pod="openstack/ovn-controller-9k99l" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.271716 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dee9c638-1703-4b56-b366-13c6746d035c-var-run-ovn\") pod \"ovn-controller-9k99l\" (UID: \"dee9c638-1703-4b56-b366-13c6746d035c\") " pod="openstack/ovn-controller-9k99l" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.271735 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/fa0129a2-aafc-4df4-9376-217bc5b6ee9c-var-log\") pod \"ovn-controller-ovs-slbpm\" (UID: \"fa0129a2-aafc-4df4-9376-217bc5b6ee9c\") " pod="openstack/ovn-controller-ovs-slbpm" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.271780 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtd8q\" (UniqueName: \"kubernetes.io/projected/dee9c638-1703-4b56-b366-13c6746d035c-kube-api-access-rtd8q\") pod \"ovn-controller-9k99l\" (UID: \"dee9c638-1703-4b56-b366-13c6746d035c\") " pod="openstack/ovn-controller-9k99l" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.271806 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/fa0129a2-aafc-4df4-9376-217bc5b6ee9c-etc-ovs\") pod \"ovn-controller-ovs-slbpm\" (UID: \"fa0129a2-aafc-4df4-9376-217bc5b6ee9c\") " pod="openstack/ovn-controller-ovs-slbpm" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.271835 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dee9c638-1703-4b56-b366-13c6746d035c-var-log-ovn\") pod \"ovn-controller-9k99l\" (UID: 
\"dee9c638-1703-4b56-b366-13c6746d035c\") " pod="openstack/ovn-controller-9k99l" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.271866 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dee9c638-1703-4b56-b366-13c6746d035c-combined-ca-bundle\") pod \"ovn-controller-9k99l\" (UID: \"dee9c638-1703-4b56-b366-13c6746d035c\") " pod="openstack/ovn-controller-9k99l" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.271890 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/dee9c638-1703-4b56-b366-13c6746d035c-ovn-controller-tls-certs\") pod \"ovn-controller-9k99l\" (UID: \"dee9c638-1703-4b56-b366-13c6746d035c\") " pod="openstack/ovn-controller-9k99l" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.271910 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmxq8\" (UniqueName: \"kubernetes.io/projected/fa0129a2-aafc-4df4-9376-217bc5b6ee9c-kube-api-access-jmxq8\") pod \"ovn-controller-ovs-slbpm\" (UID: \"fa0129a2-aafc-4df4-9376-217bc5b6ee9c\") " pod="openstack/ovn-controller-ovs-slbpm" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.272712 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dee9c638-1703-4b56-b366-13c6746d035c-var-run\") pod \"ovn-controller-9k99l\" (UID: \"dee9c638-1703-4b56-b366-13c6746d035c\") " pod="openstack/ovn-controller-9k99l" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.272831 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/fa0129a2-aafc-4df4-9376-217bc5b6ee9c-var-lib\") pod \"ovn-controller-ovs-slbpm\" (UID: \"fa0129a2-aafc-4df4-9376-217bc5b6ee9c\") " pod="openstack/ovn-controller-ovs-slbpm" Mar 14 08:47:48 crc kubenswrapper[4886]: 
I0314 08:47:48.273629 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fa0129a2-aafc-4df4-9376-217bc5b6ee9c-var-run\") pod \"ovn-controller-ovs-slbpm\" (UID: \"fa0129a2-aafc-4df4-9376-217bc5b6ee9c\") " pod="openstack/ovn-controller-ovs-slbpm" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.274575 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa0129a2-aafc-4df4-9376-217bc5b6ee9c-scripts\") pod \"ovn-controller-ovs-slbpm\" (UID: \"fa0129a2-aafc-4df4-9376-217bc5b6ee9c\") " pod="openstack/ovn-controller-ovs-slbpm" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.274710 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/fa0129a2-aafc-4df4-9376-217bc5b6ee9c-etc-ovs\") pod \"ovn-controller-ovs-slbpm\" (UID: \"fa0129a2-aafc-4df4-9376-217bc5b6ee9c\") " pod="openstack/ovn-controller-ovs-slbpm" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.274822 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dee9c638-1703-4b56-b366-13c6746d035c-var-log-ovn\") pod \"ovn-controller-9k99l\" (UID: \"dee9c638-1703-4b56-b366-13c6746d035c\") " pod="openstack/ovn-controller-9k99l" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.276008 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dee9c638-1703-4b56-b366-13c6746d035c-scripts\") pod \"ovn-controller-9k99l\" (UID: \"dee9c638-1703-4b56-b366-13c6746d035c\") " pod="openstack/ovn-controller-9k99l" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.276152 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dee9c638-1703-4b56-b366-13c6746d035c-var-run-ovn\") pod 
\"ovn-controller-9k99l\" (UID: \"dee9c638-1703-4b56-b366-13c6746d035c\") " pod="openstack/ovn-controller-9k99l" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.276259 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/fa0129a2-aafc-4df4-9376-217bc5b6ee9c-var-log\") pod \"ovn-controller-ovs-slbpm\" (UID: \"fa0129a2-aafc-4df4-9376-217bc5b6ee9c\") " pod="openstack/ovn-controller-ovs-slbpm" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.279922 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dee9c638-1703-4b56-b366-13c6746d035c-combined-ca-bundle\") pod \"ovn-controller-9k99l\" (UID: \"dee9c638-1703-4b56-b366-13c6746d035c\") " pod="openstack/ovn-controller-9k99l" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.280320 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/dee9c638-1703-4b56-b366-13c6746d035c-ovn-controller-tls-certs\") pod \"ovn-controller-9k99l\" (UID: \"dee9c638-1703-4b56-b366-13c6746d035c\") " pod="openstack/ovn-controller-9k99l" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.301085 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmxq8\" (UniqueName: \"kubernetes.io/projected/fa0129a2-aafc-4df4-9376-217bc5b6ee9c-kube-api-access-jmxq8\") pod \"ovn-controller-ovs-slbpm\" (UID: \"fa0129a2-aafc-4df4-9376-217bc5b6ee9c\") " pod="openstack/ovn-controller-ovs-slbpm" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.318464 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtd8q\" (UniqueName: \"kubernetes.io/projected/dee9c638-1703-4b56-b366-13c6746d035c-kube-api-access-rtd8q\") pod \"ovn-controller-9k99l\" (UID: \"dee9c638-1703-4b56-b366-13c6746d035c\") " pod="openstack/ovn-controller-9k99l" Mar 14 
08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.485584 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9k99l" Mar 14 08:47:48 crc kubenswrapper[4886]: I0314 08:47:48.501848 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-slbpm" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.010735 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.013781 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.018344 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.019007 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-n22c9" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.019158 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.019287 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.019374 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.027577 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.185486 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/014d60c5-6fb9-4259-8f9c-3ff44ff6781c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" 
(UID: \"014d60c5-6fb9-4259-8f9c-3ff44ff6781c\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.185530 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"014d60c5-6fb9-4259-8f9c-3ff44ff6781c\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.185582 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014d60c5-6fb9-4259-8f9c-3ff44ff6781c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"014d60c5-6fb9-4259-8f9c-3ff44ff6781c\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.185759 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/014d60c5-6fb9-4259-8f9c-3ff44ff6781c-config\") pod \"ovsdbserver-nb-0\" (UID: \"014d60c5-6fb9-4259-8f9c-3ff44ff6781c\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.185912 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/014d60c5-6fb9-4259-8f9c-3ff44ff6781c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"014d60c5-6fb9-4259-8f9c-3ff44ff6781c\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.185970 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzkmw\" (UniqueName: \"kubernetes.io/projected/014d60c5-6fb9-4259-8f9c-3ff44ff6781c-kube-api-access-hzkmw\") pod \"ovsdbserver-nb-0\" (UID: \"014d60c5-6fb9-4259-8f9c-3ff44ff6781c\") " pod="openstack/ovsdbserver-nb-0" 
Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.186005 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/014d60c5-6fb9-4259-8f9c-3ff44ff6781c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"014d60c5-6fb9-4259-8f9c-3ff44ff6781c\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.186037 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/014d60c5-6fb9-4259-8f9c-3ff44ff6781c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"014d60c5-6fb9-4259-8f9c-3ff44ff6781c\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.287331 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/014d60c5-6fb9-4259-8f9c-3ff44ff6781c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"014d60c5-6fb9-4259-8f9c-3ff44ff6781c\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.287376 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"014d60c5-6fb9-4259-8f9c-3ff44ff6781c\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.287411 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014d60c5-6fb9-4259-8f9c-3ff44ff6781c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"014d60c5-6fb9-4259-8f9c-3ff44ff6781c\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.287454 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/014d60c5-6fb9-4259-8f9c-3ff44ff6781c-config\") pod \"ovsdbserver-nb-0\" (UID: \"014d60c5-6fb9-4259-8f9c-3ff44ff6781c\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.287518 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/014d60c5-6fb9-4259-8f9c-3ff44ff6781c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"014d60c5-6fb9-4259-8f9c-3ff44ff6781c\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.287549 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzkmw\" (UniqueName: \"kubernetes.io/projected/014d60c5-6fb9-4259-8f9c-3ff44ff6781c-kube-api-access-hzkmw\") pod \"ovsdbserver-nb-0\" (UID: \"014d60c5-6fb9-4259-8f9c-3ff44ff6781c\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.287574 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/014d60c5-6fb9-4259-8f9c-3ff44ff6781c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"014d60c5-6fb9-4259-8f9c-3ff44ff6781c\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.287601 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/014d60c5-6fb9-4259-8f9c-3ff44ff6781c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"014d60c5-6fb9-4259-8f9c-3ff44ff6781c\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.288174 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/014d60c5-6fb9-4259-8f9c-3ff44ff6781c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"014d60c5-6fb9-4259-8f9c-3ff44ff6781c\") " 
pod="openstack/ovsdbserver-nb-0" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.288239 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"014d60c5-6fb9-4259-8f9c-3ff44ff6781c\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.289694 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/014d60c5-6fb9-4259-8f9c-3ff44ff6781c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"014d60c5-6fb9-4259-8f9c-3ff44ff6781c\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.290636 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/014d60c5-6fb9-4259-8f9c-3ff44ff6781c-config\") pod \"ovsdbserver-nb-0\" (UID: \"014d60c5-6fb9-4259-8f9c-3ff44ff6781c\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.291927 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014d60c5-6fb9-4259-8f9c-3ff44ff6781c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"014d60c5-6fb9-4259-8f9c-3ff44ff6781c\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.292083 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/014d60c5-6fb9-4259-8f9c-3ff44ff6781c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"014d60c5-6fb9-4259-8f9c-3ff44ff6781c\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.302554 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/014d60c5-6fb9-4259-8f9c-3ff44ff6781c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"014d60c5-6fb9-4259-8f9c-3ff44ff6781c\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.311005 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzkmw\" (UniqueName: \"kubernetes.io/projected/014d60c5-6fb9-4259-8f9c-3ff44ff6781c-kube-api-access-hzkmw\") pod \"ovsdbserver-nb-0\" (UID: \"014d60c5-6fb9-4259-8f9c-3ff44ff6781c\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.322479 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"014d60c5-6fb9-4259-8f9c-3ff44ff6781c\") " pod="openstack/ovsdbserver-nb-0" Mar 14 08:47:49 crc kubenswrapper[4886]: I0314 08:47:49.333634 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.280420 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.283916 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.292379 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.292628 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-fqgbz" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.292814 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.293878 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.301993 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.464044 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f12a35c8-acb9-4410-9ebe-112b7c51885e\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.464098 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f12a35c8-acb9-4410-9ebe-112b7c51885e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f12a35c8-acb9-4410-9ebe-112b7c51885e\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.464131 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f12a35c8-acb9-4410-9ebe-112b7c51885e-config\") pod \"ovsdbserver-sb-0\" (UID: \"f12a35c8-acb9-4410-9ebe-112b7c51885e\") " 
pod="openstack/ovsdbserver-sb-0" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.464155 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chdj4\" (UniqueName: \"kubernetes.io/projected/f12a35c8-acb9-4410-9ebe-112b7c51885e-kube-api-access-chdj4\") pod \"ovsdbserver-sb-0\" (UID: \"f12a35c8-acb9-4410-9ebe-112b7c51885e\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.464177 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f12a35c8-acb9-4410-9ebe-112b7c51885e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f12a35c8-acb9-4410-9ebe-112b7c51885e\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.464207 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f12a35c8-acb9-4410-9ebe-112b7c51885e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f12a35c8-acb9-4410-9ebe-112b7c51885e\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.464289 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f12a35c8-acb9-4410-9ebe-112b7c51885e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f12a35c8-acb9-4410-9ebe-112b7c51885e\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.464318 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f12a35c8-acb9-4410-9ebe-112b7c51885e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f12a35c8-acb9-4410-9ebe-112b7c51885e\") " pod="openstack/ovsdbserver-sb-0" 
Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.565799 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f12a35c8-acb9-4410-9ebe-112b7c51885e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f12a35c8-acb9-4410-9ebe-112b7c51885e\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.565877 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f12a35c8-acb9-4410-9ebe-112b7c51885e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f12a35c8-acb9-4410-9ebe-112b7c51885e\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.565917 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f12a35c8-acb9-4410-9ebe-112b7c51885e\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.565958 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f12a35c8-acb9-4410-9ebe-112b7c51885e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f12a35c8-acb9-4410-9ebe-112b7c51885e\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.565981 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f12a35c8-acb9-4410-9ebe-112b7c51885e-config\") pod \"ovsdbserver-sb-0\" (UID: \"f12a35c8-acb9-4410-9ebe-112b7c51885e\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.566032 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chdj4\" (UniqueName: 
\"kubernetes.io/projected/f12a35c8-acb9-4410-9ebe-112b7c51885e-kube-api-access-chdj4\") pod \"ovsdbserver-sb-0\" (UID: \"f12a35c8-acb9-4410-9ebe-112b7c51885e\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.566055 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f12a35c8-acb9-4410-9ebe-112b7c51885e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f12a35c8-acb9-4410-9ebe-112b7c51885e\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.566095 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f12a35c8-acb9-4410-9ebe-112b7c51885e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f12a35c8-acb9-4410-9ebe-112b7c51885e\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.566702 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f12a35c8-acb9-4410-9ebe-112b7c51885e\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-sb-0" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.568018 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f12a35c8-acb9-4410-9ebe-112b7c51885e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f12a35c8-acb9-4410-9ebe-112b7c51885e\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.568398 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f12a35c8-acb9-4410-9ebe-112b7c51885e-config\") pod \"ovsdbserver-sb-0\" (UID: \"f12a35c8-acb9-4410-9ebe-112b7c51885e\") " 
pod="openstack/ovsdbserver-sb-0" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.568445 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f12a35c8-acb9-4410-9ebe-112b7c51885e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f12a35c8-acb9-4410-9ebe-112b7c51885e\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.576882 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f12a35c8-acb9-4410-9ebe-112b7c51885e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f12a35c8-acb9-4410-9ebe-112b7c51885e\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.576893 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f12a35c8-acb9-4410-9ebe-112b7c51885e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f12a35c8-acb9-4410-9ebe-112b7c51885e\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.581470 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f12a35c8-acb9-4410-9ebe-112b7c51885e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f12a35c8-acb9-4410-9ebe-112b7c51885e\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.584180 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chdj4\" (UniqueName: \"kubernetes.io/projected/f12a35c8-acb9-4410-9ebe-112b7c51885e-kube-api-access-chdj4\") pod \"ovsdbserver-sb-0\" (UID: \"f12a35c8-acb9-4410-9ebe-112b7c51885e\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.589042 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f12a35c8-acb9-4410-9ebe-112b7c51885e\") " pod="openstack/ovsdbserver-sb-0" Mar 14 08:47:52 crc kubenswrapper[4886]: I0314 08:47:52.609748 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 14 08:48:00 crc kubenswrapper[4886]: I0314 08:48:00.140275 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557968-7qmjv"] Mar 14 08:48:00 crc kubenswrapper[4886]: I0314 08:48:00.141699 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557968-7qmjv" Mar 14 08:48:00 crc kubenswrapper[4886]: I0314 08:48:00.147825 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 08:48:00 crc kubenswrapper[4886]: I0314 08:48:00.148245 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:48:00 crc kubenswrapper[4886]: I0314 08:48:00.148449 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:48:00 crc kubenswrapper[4886]: I0314 08:48:00.162412 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557968-7qmjv"] Mar 14 08:48:00 crc kubenswrapper[4886]: I0314 08:48:00.311039 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d52ht\" (UniqueName: \"kubernetes.io/projected/710fbba0-b6e6-4d2e-a3be-f66a11491a0f-kube-api-access-d52ht\") pod \"auto-csr-approver-29557968-7qmjv\" (UID: \"710fbba0-b6e6-4d2e-a3be-f66a11491a0f\") " pod="openshift-infra/auto-csr-approver-29557968-7qmjv" Mar 14 08:48:00 crc kubenswrapper[4886]: I0314 08:48:00.413401 4886 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-d52ht\" (UniqueName: \"kubernetes.io/projected/710fbba0-b6e6-4d2e-a3be-f66a11491a0f-kube-api-access-d52ht\") pod \"auto-csr-approver-29557968-7qmjv\" (UID: \"710fbba0-b6e6-4d2e-a3be-f66a11491a0f\") " pod="openshift-infra/auto-csr-approver-29557968-7qmjv" Mar 14 08:48:00 crc kubenswrapper[4886]: I0314 08:48:00.439241 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d52ht\" (UniqueName: \"kubernetes.io/projected/710fbba0-b6e6-4d2e-a3be-f66a11491a0f-kube-api-access-d52ht\") pod \"auto-csr-approver-29557968-7qmjv\" (UID: \"710fbba0-b6e6-4d2e-a3be-f66a11491a0f\") " pod="openshift-infra/auto-csr-approver-29557968-7qmjv" Mar 14 08:48:00 crc kubenswrapper[4886]: I0314 08:48:00.466476 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557968-7qmjv" Mar 14 08:48:07 crc kubenswrapper[4886]: I0314 08:48:07.403355 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 14 08:48:07 crc kubenswrapper[4886]: E0314 08:48:07.859459 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 14 08:48:07 crc kubenswrapper[4886]: E0314 08:48:07.859612 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l5cvq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-7gxsp_openstack(0a06abf4-5dbb-4bf8-930e-c99893c25410): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 08:48:07 crc kubenswrapper[4886]: E0314 08:48:07.861378 4886 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-7gxsp" podUID="0a06abf4-5dbb-4bf8-930e-c99893c25410" Mar 14 08:48:07 crc kubenswrapper[4886]: E0314 08:48:07.908833 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 14 08:48:07 crc kubenswrapper[4886]: E0314 08:48:07.909007 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cqb94,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-6vf97_openstack(4bda7203-17fd-4894-bb89-b42a36d31466): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 08:48:07 crc kubenswrapper[4886]: E0314 08:48:07.910604 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-6vf97" podUID="4bda7203-17fd-4894-bb89-b42a36d31466" Mar 14 08:48:07 crc kubenswrapper[4886]: I0314 08:48:07.959950 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 08:48:07 crc kubenswrapper[4886]: E0314 08:48:07.989301 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 14 08:48:07 crc kubenswrapper[4886]: E0314 08:48:07.989545 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fjlwl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-8wm2x_openstack(103e92b6-e1e8-4a10-8cbd-76d94038132d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 08:48:07 crc kubenswrapper[4886]: E0314 08:48:07.990833 4886 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-8wm2x" podUID="103e92b6-e1e8-4a10-8cbd-76d94038132d" Mar 14 08:48:08 crc kubenswrapper[4886]: E0314 08:48:08.008354 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 14 08:48:08 crc kubenswrapper[4886]: E0314 08:48:08.008565 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2pj78,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-wfhhk_openstack(32d7ee30-d1ca-45c0-a3e7-429463827560): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 08:48:08 crc kubenswrapper[4886]: E0314 08:48:08.009795 4886 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-wfhhk" podUID="32d7ee30-d1ca-45c0-a3e7-429463827560" Mar 14 08:48:08 crc kubenswrapper[4886]: I0314 08:48:08.298446 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 08:48:08 crc kubenswrapper[4886]: I0314 08:48:08.379008 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9dafb8d0-a7d6-4426-a52d-19408d20b8b3","Type":"ContainerStarted","Data":"ba0eefbc0f806efb0d8b92d434cfea8e1e12f14165379848bb0594a2a72868e6"} Mar 14 08:48:08 crc kubenswrapper[4886]: I0314 08:48:08.380424 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"211eef94-8537-4eaa-aae0-58b9697c7fac","Type":"ContainerStarted","Data":"e6260c2140e589d9b795536a9a48a7ecd03b454be866464c61e5dae4f4429a81"} Mar 14 08:48:08 crc kubenswrapper[4886]: E0314 08:48:08.382330 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-8wm2x" podUID="103e92b6-e1e8-4a10-8cbd-76d94038132d" Mar 14 08:48:08 crc kubenswrapper[4886]: E0314 08:48:08.382433 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-wfhhk" podUID="32d7ee30-d1ca-45c0-a3e7-429463827560" Mar 14 08:48:08 crc kubenswrapper[4886]: I0314 08:48:08.550763 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9k99l"] Mar 14 08:48:08 crc 
kubenswrapper[4886]: I0314 08:48:08.560667 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 14 08:48:08 crc kubenswrapper[4886]: W0314 08:48:08.561078 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddee9c638_1703_4b56_b366_13c6746d035c.slice/crio-7c86c7b8753d90d2244c881d74279d8231b6e182104fe419f94c1ee2404968c3 WatchSource:0}: Error finding container 7c86c7b8753d90d2244c881d74279d8231b6e182104fe419f94c1ee2404968c3: Status 404 returned error can't find the container with id 7c86c7b8753d90d2244c881d74279d8231b6e182104fe419f94c1ee2404968c3 Mar 14 08:48:08 crc kubenswrapper[4886]: I0314 08:48:08.578469 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 14 08:48:08 crc kubenswrapper[4886]: I0314 08:48:08.586041 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 14 08:48:08 crc kubenswrapper[4886]: W0314 08:48:08.592537 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28d6b363_8881_407e_b8e4_9fd7863b881c.slice/crio-c927307ec66c3e694d4180edc82b0b3f71713f23eacb7292f45314299d13c6a8 WatchSource:0}: Error finding container c927307ec66c3e694d4180edc82b0b3f71713f23eacb7292f45314299d13c6a8: Status 404 returned error can't find the container with id c927307ec66c3e694d4180edc82b0b3f71713f23eacb7292f45314299d13c6a8 Mar 14 08:48:08 crc kubenswrapper[4886]: W0314 08:48:08.596253 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17dec532_a4ac_466d_9b81_29a3e27c33bb.slice/crio-ab4ee24406449c5a43a4504930eac86f305f2ea37e459ade3f3fff1316101580 WatchSource:0}: Error finding container ab4ee24406449c5a43a4504930eac86f305f2ea37e459ade3f3fff1316101580: Status 404 returned error can't find the container with id 
ab4ee24406449c5a43a4504930eac86f305f2ea37e459ade3f3fff1316101580 Mar 14 08:48:08 crc kubenswrapper[4886]: I0314 08:48:08.971763 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557968-7qmjv"] Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.094719 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.200978 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-slbpm"] Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.225936 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6vf97" Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.232929 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-7gxsp" Mar 14 08:48:09 crc kubenswrapper[4886]: W0314 08:48:09.325394 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa0129a2_aafc_4df4_9376_217bc5b6ee9c.slice/crio-af76dc4695d750b374448793e04dfdb05ef04589b4e2992907e6ba7fd83c796b WatchSource:0}: Error finding container af76dc4695d750b374448793e04dfdb05ef04589b4e2992907e6ba7fd83c796b: Status 404 returned error can't find the container with id af76dc4695d750b374448793e04dfdb05ef04589b4e2992907e6ba7fd83c796b Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.418428 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqb94\" (UniqueName: \"kubernetes.io/projected/4bda7203-17fd-4894-bb89-b42a36d31466-kube-api-access-cqb94\") pod \"4bda7203-17fd-4894-bb89-b42a36d31466\" (UID: \"4bda7203-17fd-4894-bb89-b42a36d31466\") " Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.418488 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/0a06abf4-5dbb-4bf8-930e-c99893c25410-dns-svc\") pod \"0a06abf4-5dbb-4bf8-930e-c99893c25410\" (UID: \"0a06abf4-5dbb-4bf8-930e-c99893c25410\") " Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.418526 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bda7203-17fd-4894-bb89-b42a36d31466-config\") pod \"4bda7203-17fd-4894-bb89-b42a36d31466\" (UID: \"4bda7203-17fd-4894-bb89-b42a36d31466\") " Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.418605 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a06abf4-5dbb-4bf8-930e-c99893c25410-config\") pod \"0a06abf4-5dbb-4bf8-930e-c99893c25410\" (UID: \"0a06abf4-5dbb-4bf8-930e-c99893c25410\") " Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.418640 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5cvq\" (UniqueName: \"kubernetes.io/projected/0a06abf4-5dbb-4bf8-930e-c99893c25410-kube-api-access-l5cvq\") pod \"0a06abf4-5dbb-4bf8-930e-c99893c25410\" (UID: \"0a06abf4-5dbb-4bf8-930e-c99893c25410\") " Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.419546 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a06abf4-5dbb-4bf8-930e-c99893c25410-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0a06abf4-5dbb-4bf8-930e-c99893c25410" (UID: "0a06abf4-5dbb-4bf8-930e-c99893c25410"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.423267 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bda7203-17fd-4894-bb89-b42a36d31466-config" (OuterVolumeSpecName: "config") pod "4bda7203-17fd-4894-bb89-b42a36d31466" (UID: "4bda7203-17fd-4894-bb89-b42a36d31466"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.423599 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a06abf4-5dbb-4bf8-930e-c99893c25410-config" (OuterVolumeSpecName: "config") pod "0a06abf4-5dbb-4bf8-930e-c99893c25410" (UID: "0a06abf4-5dbb-4bf8-930e-c99893c25410"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.430361 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bda7203-17fd-4894-bb89-b42a36d31466-kube-api-access-cqb94" (OuterVolumeSpecName: "kube-api-access-cqb94") pod "4bda7203-17fd-4894-bb89-b42a36d31466" (UID: "4bda7203-17fd-4894-bb89-b42a36d31466"). InnerVolumeSpecName "kube-api-access-cqb94". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.443022 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a06abf4-5dbb-4bf8-930e-c99893c25410-kube-api-access-l5cvq" (OuterVolumeSpecName: "kube-api-access-l5cvq") pod "0a06abf4-5dbb-4bf8-930e-c99893c25410" (UID: "0a06abf4-5dbb-4bf8-930e-c99893c25410"). InnerVolumeSpecName "kube-api-access-l5cvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.445550 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6vf97" Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.454183 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"28d6b363-8881-407e-b8e4-9fd7863b881c","Type":"ContainerStarted","Data":"c927307ec66c3e694d4180edc82b0b3f71713f23eacb7292f45314299d13c6a8"} Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.454529 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-6vf97" event={"ID":"4bda7203-17fd-4894-bb89-b42a36d31466","Type":"ContainerDied","Data":"78393b174f524a547b1dd4a6a49224f62b5d5842d3ae1c07de0871fff41caef7"} Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.498637 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-7gxsp" event={"ID":"0a06abf4-5dbb-4bf8-930e-c99893c25410","Type":"ContainerDied","Data":"05802828aba08874b2293edc1cc2ad12ad24eb99622cb77709c7b74d7a73bede"} Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.498718 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-7gxsp" Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.504405 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557968-7qmjv" event={"ID":"710fbba0-b6e6-4d2e-a3be-f66a11491a0f","Type":"ContainerStarted","Data":"14d74df5d86f66528ca67ade689f8d555cd027b49cea294ed21045433e1eca68"} Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.509299 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"17dec532-a4ac-466d-9b81-29a3e27c33bb","Type":"ContainerStarted","Data":"ab4ee24406449c5a43a4504930eac86f305f2ea37e459ade3f3fff1316101580"} Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.516655 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c08d9078-9b3a-492a-92db-3096453d49f8","Type":"ContainerStarted","Data":"548315cdb623465804b4f9fe0d140267259fbb4ce5ce07b497fa3080f29863c4"} Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.526699 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqb94\" (UniqueName: \"kubernetes.io/projected/4bda7203-17fd-4894-bb89-b42a36d31466-kube-api-access-cqb94\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.526726 4886 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a06abf4-5dbb-4bf8-930e-c99893c25410-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.526749 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bda7203-17fd-4894-bb89-b42a36d31466-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.526759 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0a06abf4-5dbb-4bf8-930e-c99893c25410-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.526770 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5cvq\" (UniqueName: \"kubernetes.io/projected/0a06abf4-5dbb-4bf8-930e-c99893c25410-kube-api-access-l5cvq\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.543429 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b","Type":"ContainerStarted","Data":"69c94bab84755fffe1db342f43a54aed0d5057d2cf597ca15a6b94c71b3eaf9d"} Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.587939 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f12a35c8-acb9-4410-9ebe-112b7c51885e","Type":"ContainerStarted","Data":"e8b72de925db59a9b23d1afabe8144d6aa6008e90359576ba8bb628260e093e8"} Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.590318 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6vf97"] Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.621647 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9k99l" event={"ID":"dee9c638-1703-4b56-b366-13c6746d035c","Type":"ContainerStarted","Data":"7c86c7b8753d90d2244c881d74279d8231b6e182104fe419f94c1ee2404968c3"} Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.622830 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6vf97"] Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.650196 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-slbpm" event={"ID":"fa0129a2-aafc-4df4-9376-217bc5b6ee9c","Type":"ContainerStarted","Data":"af76dc4695d750b374448793e04dfdb05ef04589b4e2992907e6ba7fd83c796b"} Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 
08:48:09.708254 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7gxsp"] Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.717499 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7gxsp"] Mar 14 08:48:09 crc kubenswrapper[4886]: I0314 08:48:09.932459 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 14 08:48:10 crc kubenswrapper[4886]: W0314 08:48:10.122359 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod014d60c5_6fb9_4259_8f9c_3ff44ff6781c.slice/crio-dbc61a5b29ccd9357542aea2d4b26e58dfe531bda60bb6c5ad9d986171037794 WatchSource:0}: Error finding container dbc61a5b29ccd9357542aea2d4b26e58dfe531bda60bb6c5ad9d986171037794: Status 404 returned error can't find the container with id dbc61a5b29ccd9357542aea2d4b26e58dfe531bda60bb6c5ad9d986171037794 Mar 14 08:48:10 crc kubenswrapper[4886]: I0314 08:48:10.660073 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"014d60c5-6fb9-4259-8f9c-3ff44ff6781c","Type":"ContainerStarted","Data":"dbc61a5b29ccd9357542aea2d4b26e58dfe531bda60bb6c5ad9d986171037794"} Mar 14 08:48:10 crc kubenswrapper[4886]: I0314 08:48:10.662344 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"68bf3729-3dcf-4881-814b-b6af3060336e","Type":"ContainerStarted","Data":"70f613391bff57f10a7e19d0aa063591744b8d64a3857dfeaca9f66352cdf9da"} Mar 14 08:48:11 crc kubenswrapper[4886]: I0314 08:48:11.430547 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a06abf4-5dbb-4bf8-930e-c99893c25410" path="/var/lib/kubelet/pods/0a06abf4-5dbb-4bf8-930e-c99893c25410/volumes" Mar 14 08:48:11 crc kubenswrapper[4886]: I0314 08:48:11.431304 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4bda7203-17fd-4894-bb89-b42a36d31466" path="/var/lib/kubelet/pods/4bda7203-17fd-4894-bb89-b42a36d31466/volumes" Mar 14 08:48:16 crc kubenswrapper[4886]: I0314 08:48:16.710276 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b","Type":"ContainerStarted","Data":"e36be5ba3e300d10557550ef64688c83e65d6a797ca4fbaff1c61c80874b2716"} Mar 14 08:48:16 crc kubenswrapper[4886]: I0314 08:48:16.714575 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9dafb8d0-a7d6-4426-a52d-19408d20b8b3","Type":"ContainerStarted","Data":"e8e37c5e84bb5d7d7cbdf31e1805acd3a8c8d583fd75a6ea79b611d8fc6fcafd"} Mar 14 08:48:16 crc kubenswrapper[4886]: I0314 08:48:16.714696 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 14 08:48:16 crc kubenswrapper[4886]: I0314 08:48:16.716103 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f12a35c8-acb9-4410-9ebe-112b7c51885e","Type":"ContainerStarted","Data":"af273e1921ce38167d19df533b9d3e1172f80d2acb79d34911f396d86960b03a"} Mar 14 08:48:16 crc kubenswrapper[4886]: I0314 08:48:16.718187 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"014d60c5-6fb9-4259-8f9c-3ff44ff6781c","Type":"ContainerStarted","Data":"ad9255cd148bcab71196461e0d383836bd3954c00d544306db8bf6968d2eb2f6"} Mar 14 08:48:16 crc kubenswrapper[4886]: I0314 08:48:16.720101 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"211eef94-8537-4eaa-aae0-58b9697c7fac","Type":"ContainerStarted","Data":"0edaff6ffcc4b2814c6b099097a0e5769be691ad63606888ab3e3be291f7b7f9"} Mar 14 08:48:16 crc kubenswrapper[4886]: I0314 08:48:16.722298 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9k99l" 
event={"ID":"dee9c638-1703-4b56-b366-13c6746d035c","Type":"ContainerStarted","Data":"e9f4cffc2fb682631dfaeee8717e1b1a475eacaa1e0535fb8af9c7273e1be53a"} Mar 14 08:48:16 crc kubenswrapper[4886]: I0314 08:48:16.722426 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-9k99l" Mar 14 08:48:16 crc kubenswrapper[4886]: I0314 08:48:16.724596 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-slbpm" event={"ID":"fa0129a2-aafc-4df4-9376-217bc5b6ee9c","Type":"ContainerStarted","Data":"4d64ecdf54411cd39552fd200dbb9be5d0f13e741b8945e46a1a0f226a4d8704"} Mar 14 08:48:16 crc kubenswrapper[4886]: I0314 08:48:16.745239 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557968-7qmjv" event={"ID":"710fbba0-b6e6-4d2e-a3be-f66a11491a0f","Type":"ContainerStarted","Data":"c9b740d8c7aa7c16457126801f3f65871f7a4b14a3002288645d97734857c477"} Mar 14 08:48:16 crc kubenswrapper[4886]: I0314 08:48:16.752426 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"17dec532-a4ac-466d-9b81-29a3e27c33bb","Type":"ContainerStarted","Data":"b6455689e068e59494bd64ce51432dabd56cf602e0f7ea72d880175331c0a815"} Mar 14 08:48:16 crc kubenswrapper[4886]: I0314 08:48:16.752712 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 14 08:48:16 crc kubenswrapper[4886]: I0314 08:48:16.764297 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=24.961440587 podStartE2EDuration="32.764281103s" podCreationTimestamp="2026-03-14 08:47:44 +0000 UTC" firstStartedPulling="2026-03-14 08:48:08.316169035 +0000 UTC m=+1223.564620672" lastFinishedPulling="2026-03-14 08:48:16.119009541 +0000 UTC m=+1231.367461188" observedRunningTime="2026-03-14 08:48:16.759370842 +0000 UTC m=+1232.007822479" watchObservedRunningTime="2026-03-14 
08:48:16.764281103 +0000 UTC m=+1232.012732740" Mar 14 08:48:16 crc kubenswrapper[4886]: I0314 08:48:16.783029 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9k99l" podStartSLOduration=21.293750506 podStartE2EDuration="28.78301196s" podCreationTimestamp="2026-03-14 08:47:48 +0000 UTC" firstStartedPulling="2026-03-14 08:48:08.563216507 +0000 UTC m=+1223.811668144" lastFinishedPulling="2026-03-14 08:48:16.052477961 +0000 UTC m=+1231.300929598" observedRunningTime="2026-03-14 08:48:16.775747972 +0000 UTC m=+1232.024199619" watchObservedRunningTime="2026-03-14 08:48:16.78301196 +0000 UTC m=+1232.031463597" Mar 14 08:48:16 crc kubenswrapper[4886]: I0314 08:48:16.843178 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=27.805538327 podStartE2EDuration="34.843145126s" podCreationTimestamp="2026-03-14 08:47:42 +0000 UTC" firstStartedPulling="2026-03-14 08:48:08.610309789 +0000 UTC m=+1223.858761416" lastFinishedPulling="2026-03-14 08:48:15.647916578 +0000 UTC m=+1230.896368215" observedRunningTime="2026-03-14 08:48:16.842337143 +0000 UTC m=+1232.090788780" watchObservedRunningTime="2026-03-14 08:48:16.843145126 +0000 UTC m=+1232.091596763" Mar 14 08:48:16 crc kubenswrapper[4886]: I0314 08:48:16.869837 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557968-7qmjv" podStartSLOduration=15.240198944 podStartE2EDuration="16.86979841s" podCreationTimestamp="2026-03-14 08:48:00 +0000 UTC" firstStartedPulling="2026-03-14 08:48:09.123923762 +0000 UTC m=+1224.372375399" lastFinishedPulling="2026-03-14 08:48:10.753523228 +0000 UTC m=+1226.001974865" observedRunningTime="2026-03-14 08:48:16.863417578 +0000 UTC m=+1232.111869225" watchObservedRunningTime="2026-03-14 08:48:16.86979841 +0000 UTC m=+1232.118250047" Mar 14 08:48:17 crc kubenswrapper[4886]: I0314 08:48:17.760482 4886 generic.go:334] 
"Generic (PLEG): container finished" podID="710fbba0-b6e6-4d2e-a3be-f66a11491a0f" containerID="c9b740d8c7aa7c16457126801f3f65871f7a4b14a3002288645d97734857c477" exitCode=0 Mar 14 08:48:17 crc kubenswrapper[4886]: I0314 08:48:17.760577 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557968-7qmjv" event={"ID":"710fbba0-b6e6-4d2e-a3be-f66a11491a0f","Type":"ContainerDied","Data":"c9b740d8c7aa7c16457126801f3f65871f7a4b14a3002288645d97734857c477"} Mar 14 08:48:17 crc kubenswrapper[4886]: I0314 08:48:17.763113 4886 generic.go:334] "Generic (PLEG): container finished" podID="fa0129a2-aafc-4df4-9376-217bc5b6ee9c" containerID="4d64ecdf54411cd39552fd200dbb9be5d0f13e741b8945e46a1a0f226a4d8704" exitCode=0 Mar 14 08:48:17 crc kubenswrapper[4886]: I0314 08:48:17.763192 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-slbpm" event={"ID":"fa0129a2-aafc-4df4-9376-217bc5b6ee9c","Type":"ContainerDied","Data":"4d64ecdf54411cd39552fd200dbb9be5d0f13e741b8945e46a1a0f226a4d8704"} Mar 14 08:48:22 crc kubenswrapper[4886]: I0314 08:48:22.731685 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557968-7qmjv" Mar 14 08:48:22 crc kubenswrapper[4886]: I0314 08:48:22.821807 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"28d6b363-8881-407e-b8e4-9fd7863b881c","Type":"ContainerStarted","Data":"29c601a3c13dda1e64fd2afeda8a906573f80921a568fd811185212a87cc6028"} Mar 14 08:48:22 crc kubenswrapper[4886]: I0314 08:48:22.831058 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557968-7qmjv" event={"ID":"710fbba0-b6e6-4d2e-a3be-f66a11491a0f","Type":"ContainerDied","Data":"14d74df5d86f66528ca67ade689f8d555cd027b49cea294ed21045433e1eca68"} Mar 14 08:48:22 crc kubenswrapper[4886]: I0314 08:48:22.831194 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14d74df5d86f66528ca67ade689f8d555cd027b49cea294ed21045433e1eca68" Mar 14 08:48:22 crc kubenswrapper[4886]: I0314 08:48:22.831144 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557968-7qmjv" Mar 14 08:48:22 crc kubenswrapper[4886]: I0314 08:48:22.857633 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d52ht\" (UniqueName: \"kubernetes.io/projected/710fbba0-b6e6-4d2e-a3be-f66a11491a0f-kube-api-access-d52ht\") pod \"710fbba0-b6e6-4d2e-a3be-f66a11491a0f\" (UID: \"710fbba0-b6e6-4d2e-a3be-f66a11491a0f\") " Mar 14 08:48:22 crc kubenswrapper[4886]: I0314 08:48:22.876628 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/710fbba0-b6e6-4d2e-a3be-f66a11491a0f-kube-api-access-d52ht" (OuterVolumeSpecName: "kube-api-access-d52ht") pod "710fbba0-b6e6-4d2e-a3be-f66a11491a0f" (UID: "710fbba0-b6e6-4d2e-a3be-f66a11491a0f"). InnerVolumeSpecName "kube-api-access-d52ht". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:48:22 crc kubenswrapper[4886]: I0314 08:48:22.919342 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 14 08:48:22 crc kubenswrapper[4886]: I0314 08:48:22.960176 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d52ht\" (UniqueName: \"kubernetes.io/projected/710fbba0-b6e6-4d2e-a3be-f66a11491a0f-kube-api-access-d52ht\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:23 crc kubenswrapper[4886]: I0314 08:48:23.812001 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557962-8wx4j"] Mar 14 08:48:23 crc kubenswrapper[4886]: I0314 08:48:23.821501 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557962-8wx4j"] Mar 14 08:48:24 crc kubenswrapper[4886]: I0314 08:48:24.845680 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"014d60c5-6fb9-4259-8f9c-3ff44ff6781c","Type":"ContainerStarted","Data":"02ee942e2757a1995eb4c671cab20d43b501021098071e36cb36ef192ebcdcf3"} Mar 14 08:48:24 crc kubenswrapper[4886]: I0314 08:48:24.847354 4886 generic.go:334] "Generic (PLEG): container finished" podID="103e92b6-e1e8-4a10-8cbd-76d94038132d" containerID="dd1aa0db48c1967bae37abc1505fe11a7b7d1a6adcb20727fe527ba84331cd60" exitCode=0 Mar 14 08:48:24 crc kubenswrapper[4886]: I0314 08:48:24.847422 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8wm2x" event={"ID":"103e92b6-e1e8-4a10-8cbd-76d94038132d","Type":"ContainerDied","Data":"dd1aa0db48c1967bae37abc1505fe11a7b7d1a6adcb20727fe527ba84331cd60"} Mar 14 08:48:24 crc kubenswrapper[4886]: I0314 08:48:24.849406 4886 generic.go:334] "Generic (PLEG): container finished" podID="0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b" containerID="e36be5ba3e300d10557550ef64688c83e65d6a797ca4fbaff1c61c80874b2716" exitCode=0 Mar 14 08:48:24 crc 
kubenswrapper[4886]: I0314 08:48:24.849494 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b","Type":"ContainerDied","Data":"e36be5ba3e300d10557550ef64688c83e65d6a797ca4fbaff1c61c80874b2716"} Mar 14 08:48:24 crc kubenswrapper[4886]: I0314 08:48:24.851555 4886 generic.go:334] "Generic (PLEG): container finished" podID="211eef94-8537-4eaa-aae0-58b9697c7fac" containerID="0edaff6ffcc4b2814c6b099097a0e5769be691ad63606888ab3e3be291f7b7f9" exitCode=0 Mar 14 08:48:24 crc kubenswrapper[4886]: I0314 08:48:24.851640 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"211eef94-8537-4eaa-aae0-58b9697c7fac","Type":"ContainerDied","Data":"0edaff6ffcc4b2814c6b099097a0e5769be691ad63606888ab3e3be291f7b7f9"} Mar 14 08:48:24 crc kubenswrapper[4886]: I0314 08:48:24.856519 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f12a35c8-acb9-4410-9ebe-112b7c51885e","Type":"ContainerStarted","Data":"3ec8cfbf3089ea8c642549a754008dac32eb929d108ff7471978a21afa7f51b6"} Mar 14 08:48:24 crc kubenswrapper[4886]: I0314 08:48:24.861891 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-slbpm" event={"ID":"fa0129a2-aafc-4df4-9376-217bc5b6ee9c","Type":"ContainerStarted","Data":"9ab9d0b1fc318a654565fefa5f65b6e6be9e392a5dcfff1b4ad8fa7158983675"} Mar 14 08:48:24 crc kubenswrapper[4886]: I0314 08:48:24.861955 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-slbpm" event={"ID":"fa0129a2-aafc-4df4-9376-217bc5b6ee9c","Type":"ContainerStarted","Data":"e3e63a909f7acf12da904e704985b55a3bf90887a23326f99b982350e64bddba"} Mar 14 08:48:24 crc kubenswrapper[4886]: I0314 08:48:24.862861 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-slbpm" Mar 14 08:48:24 crc kubenswrapper[4886]: I0314 
08:48:24.863049 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-slbpm" Mar 14 08:48:24 crc kubenswrapper[4886]: I0314 08:48:24.865384 4886 generic.go:334] "Generic (PLEG): container finished" podID="32d7ee30-d1ca-45c0-a3e7-429463827560" containerID="69c6da4326d82b91b3435d65a86dea2af424d4465906195dd2032d147f731472" exitCode=0 Mar 14 08:48:24 crc kubenswrapper[4886]: I0314 08:48:24.865508 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-wfhhk" event={"ID":"32d7ee30-d1ca-45c0-a3e7-429463827560","Type":"ContainerDied","Data":"69c6da4326d82b91b3435d65a86dea2af424d4465906195dd2032d147f731472"} Mar 14 08:48:24 crc kubenswrapper[4886]: I0314 08:48:24.879145 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=24.324264622 podStartE2EDuration="37.879099869s" podCreationTimestamp="2026-03-14 08:47:47 +0000 UTC" firstStartedPulling="2026-03-14 08:48:10.132295245 +0000 UTC m=+1225.380746882" lastFinishedPulling="2026-03-14 08:48:23.687130492 +0000 UTC m=+1238.935582129" observedRunningTime="2026-03-14 08:48:24.878553264 +0000 UTC m=+1240.127004911" watchObservedRunningTime="2026-03-14 08:48:24.879099869 +0000 UTC m=+1240.127551506" Mar 14 08:48:24 crc kubenswrapper[4886]: I0314 08:48:24.984071 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=19.420188717 podStartE2EDuration="33.984042905s" podCreationTimestamp="2026-03-14 08:47:51 +0000 UTC" firstStartedPulling="2026-03-14 08:48:09.125516467 +0000 UTC m=+1224.373968104" lastFinishedPulling="2026-03-14 08:48:23.689370655 +0000 UTC m=+1238.937822292" observedRunningTime="2026-03-14 08:48:24.93814535 +0000 UTC m=+1240.186596987" watchObservedRunningTime="2026-03-14 08:48:24.984042905 +0000 UTC m=+1240.232494542" Mar 14 08:48:24 crc kubenswrapper[4886]: I0314 08:48:24.989895 4886 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-slbpm" podStartSLOduration=30.270251075 podStartE2EDuration="36.989883517s" podCreationTimestamp="2026-03-14 08:47:48 +0000 UTC" firstStartedPulling="2026-03-14 08:48:09.328740491 +0000 UTC m=+1224.577192128" lastFinishedPulling="2026-03-14 08:48:16.048372923 +0000 UTC m=+1231.296824570" observedRunningTime="2026-03-14 08:48:24.975757975 +0000 UTC m=+1240.224209612" watchObservedRunningTime="2026-03-14 08:48:24.989883517 +0000 UTC m=+1240.238335144" Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.256878 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.334140 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.376062 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-wfhhk"] Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.413097 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-df4tr"] Mar 14 08:48:25 crc kubenswrapper[4886]: E0314 08:48:25.422270 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="710fbba0-b6e6-4d2e-a3be-f66a11491a0f" containerName="oc" Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.422301 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="710fbba0-b6e6-4d2e-a3be-f66a11491a0f" containerName="oc" Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.422553 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="710fbba0-b6e6-4d2e-a3be-f66a11491a0f" containerName="oc" Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.425734 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-df4tr" Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.445056 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4099d30b-7d06-4895-bd7b-8851e9ac38f4" path="/var/lib/kubelet/pods/4099d30b-7d06-4895-bd7b-8851e9ac38f4/volumes" Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.457475 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-df4tr"] Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.458899 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.543767 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2566e042-16c5-4c44-951f-447be9c66f4e-config\") pod \"dnsmasq-dns-7cb5889db5-df4tr\" (UID: \"2566e042-16c5-4c44-951f-447be9c66f4e\") " pod="openstack/dnsmasq-dns-7cb5889db5-df4tr" Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.544468 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q2nh\" (UniqueName: \"kubernetes.io/projected/2566e042-16c5-4c44-951f-447be9c66f4e-kube-api-access-5q2nh\") pod \"dnsmasq-dns-7cb5889db5-df4tr\" (UID: \"2566e042-16c5-4c44-951f-447be9c66f4e\") " pod="openstack/dnsmasq-dns-7cb5889db5-df4tr" Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.544950 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2566e042-16c5-4c44-951f-447be9c66f4e-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-df4tr\" (UID: \"2566e042-16c5-4c44-951f-447be9c66f4e\") " pod="openstack/dnsmasq-dns-7cb5889db5-df4tr" Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.610018 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.649554 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2566e042-16c5-4c44-951f-447be9c66f4e-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-df4tr\" (UID: \"2566e042-16c5-4c44-951f-447be9c66f4e\") " pod="openstack/dnsmasq-dns-7cb5889db5-df4tr" Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.649628 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2566e042-16c5-4c44-951f-447be9c66f4e-config\") pod \"dnsmasq-dns-7cb5889db5-df4tr\" (UID: \"2566e042-16c5-4c44-951f-447be9c66f4e\") " pod="openstack/dnsmasq-dns-7cb5889db5-df4tr" Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.649676 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q2nh\" (UniqueName: \"kubernetes.io/projected/2566e042-16c5-4c44-951f-447be9c66f4e-kube-api-access-5q2nh\") pod \"dnsmasq-dns-7cb5889db5-df4tr\" (UID: \"2566e042-16c5-4c44-951f-447be9c66f4e\") " pod="openstack/dnsmasq-dns-7cb5889db5-df4tr" Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.650941 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2566e042-16c5-4c44-951f-447be9c66f4e-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-df4tr\" (UID: \"2566e042-16c5-4c44-951f-447be9c66f4e\") " pod="openstack/dnsmasq-dns-7cb5889db5-df4tr" Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.651505 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2566e042-16c5-4c44-951f-447be9c66f4e-config\") pod \"dnsmasq-dns-7cb5889db5-df4tr\" (UID: \"2566e042-16c5-4c44-951f-447be9c66f4e\") " pod="openstack/dnsmasq-dns-7cb5889db5-df4tr" Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.662972 
4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.715966 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q2nh\" (UniqueName: \"kubernetes.io/projected/2566e042-16c5-4c44-951f-447be9c66f4e-kube-api-access-5q2nh\") pod \"dnsmasq-dns-7cb5889db5-df4tr\" (UID: \"2566e042-16c5-4c44-951f-447be9c66f4e\") " pod="openstack/dnsmasq-dns-7cb5889db5-df4tr" Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.756846 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-df4tr" Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.878419 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b","Type":"ContainerStarted","Data":"d2b24dc73f60d32b5fc4cf85085b3d20181b587c8a725a399a3a958b064de60e"} Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.897073 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"211eef94-8537-4eaa-aae0-58b9697c7fac","Type":"ContainerStarted","Data":"0aec108ec3dedfefd62e647fc67642c617daea93c4e1966e1900f133878e265c"} Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.905750 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-wfhhk" event={"ID":"32d7ee30-d1ca-45c0-a3e7-429463827560","Type":"ContainerStarted","Data":"1f4ca0df929152e44ea4fa5f367f9290d38ae7af09ea82780f1f47c4a65da403"} Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.905929 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-wfhhk" podUID="32d7ee30-d1ca-45c0-a3e7-429463827560" containerName="dnsmasq-dns" containerID="cri-o://1f4ca0df929152e44ea4fa5f367f9290d38ae7af09ea82780f1f47c4a65da403" gracePeriod=10 Mar 14 08:48:25 crc 
kubenswrapper[4886]: I0314 08:48:25.906224 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-wfhhk" Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.929519 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8wm2x" event={"ID":"103e92b6-e1e8-4a10-8cbd-76d94038132d","Type":"ContainerStarted","Data":"a14ffcacca9ba15650742b1d694ac0481f6a44f776f64be79e7fb8d2f268e7fa"} Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.930152 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.930226 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.930242 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-8wm2x" Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.933173 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=39.476645331 podStartE2EDuration="46.933148705s" podCreationTimestamp="2026-03-14 08:47:39 +0000 UTC" firstStartedPulling="2026-03-14 08:48:08.592881718 +0000 UTC m=+1223.841333345" lastFinishedPulling="2026-03-14 08:48:16.049385082 +0000 UTC m=+1231.297836719" observedRunningTime="2026-03-14 08:48:25.930426179 +0000 UTC m=+1241.178877816" watchObservedRunningTime="2026-03-14 08:48:25.933148705 +0000 UTC m=+1241.181600352" Mar 14 08:48:25 crc kubenswrapper[4886]: I0314 08:48:25.969190 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-8wm2x" podStartSLOduration=3.892599293 podStartE2EDuration="47.969167786s" podCreationTimestamp="2026-03-14 08:47:38 +0000 UTC" firstStartedPulling="2026-03-14 08:47:39.617746559 +0000 UTC m=+1194.866198196" 
lastFinishedPulling="2026-03-14 08:48:23.694315052 +0000 UTC m=+1238.942766689" observedRunningTime="2026-03-14 08:48:25.966640795 +0000 UTC m=+1241.215092432" watchObservedRunningTime="2026-03-14 08:48:25.969167786 +0000 UTC m=+1241.217619423" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.001633 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.012639 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=36.919609298 podStartE2EDuration="45.012619253s" podCreationTimestamp="2026-03-14 08:47:41 +0000 UTC" firstStartedPulling="2026-03-14 08:48:07.959678402 +0000 UTC m=+1223.208130039" lastFinishedPulling="2026-03-14 08:48:16.052688367 +0000 UTC m=+1231.301139994" observedRunningTime="2026-03-14 08:48:26.011513142 +0000 UTC m=+1241.259964779" watchObservedRunningTime="2026-03-14 08:48:26.012619253 +0000 UTC m=+1241.261070890" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.051887 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.076193 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-wfhhk" podStartSLOduration=4.566726286 podStartE2EDuration="49.076174599s" podCreationTimestamp="2026-03-14 08:47:37 +0000 UTC" firstStartedPulling="2026-03-14 08:47:39.180858058 +0000 UTC m=+1194.429309695" lastFinishedPulling="2026-03-14 08:48:23.690306371 +0000 UTC m=+1238.938758008" observedRunningTime="2026-03-14 08:48:26.072028704 +0000 UTC m=+1241.320480341" watchObservedRunningTime="2026-03-14 08:48:26.076174599 +0000 UTC m=+1241.324626226" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.263810 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8wm2x"] 
Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.316744 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-tttcp"] Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.318073 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d65f699f-tttcp" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.327838 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.336595 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-tttcp"] Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.390254 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtcg5\" (UniqueName: \"kubernetes.io/projected/4ece572b-edc2-46be-adff-dfaed70fadf5-kube-api-access-xtcg5\") pod \"dnsmasq-dns-57d65f699f-tttcp\" (UID: \"4ece572b-edc2-46be-adff-dfaed70fadf5\") " pod="openstack/dnsmasq-dns-57d65f699f-tttcp" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.390342 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ece572b-edc2-46be-adff-dfaed70fadf5-config\") pod \"dnsmasq-dns-57d65f699f-tttcp\" (UID: \"4ece572b-edc2-46be-adff-dfaed70fadf5\") " pod="openstack/dnsmasq-dns-57d65f699f-tttcp" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.390401 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ece572b-edc2-46be-adff-dfaed70fadf5-dns-svc\") pod \"dnsmasq-dns-57d65f699f-tttcp\" (UID: \"4ece572b-edc2-46be-adff-dfaed70fadf5\") " pod="openstack/dnsmasq-dns-57d65f699f-tttcp" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.390501 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ece572b-edc2-46be-adff-dfaed70fadf5-ovsdbserver-nb\") pod \"dnsmasq-dns-57d65f699f-tttcp\" (UID: \"4ece572b-edc2-46be-adff-dfaed70fadf5\") " pod="openstack/dnsmasq-dns-57d65f699f-tttcp" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.462357 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-xt4bm"] Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.476699 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-xt4bm" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.482979 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.484904 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xt4bm"] Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.491956 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91b6fc57-0463-4654-8595-09cc5f7f0088-config\") pod \"ovn-controller-metrics-xt4bm\" (UID: \"91b6fc57-0463-4654-8595-09cc5f7f0088\") " pod="openstack/ovn-controller-metrics-xt4bm" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.492053 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/91b6fc57-0463-4654-8595-09cc5f7f0088-ovn-rundir\") pod \"ovn-controller-metrics-xt4bm\" (UID: \"91b6fc57-0463-4654-8595-09cc5f7f0088\") " pod="openstack/ovn-controller-metrics-xt4bm" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.492071 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/91b6fc57-0463-4654-8595-09cc5f7f0088-combined-ca-bundle\") pod \"ovn-controller-metrics-xt4bm\" (UID: \"91b6fc57-0463-4654-8595-09cc5f7f0088\") " pod="openstack/ovn-controller-metrics-xt4bm" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.492099 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtcg5\" (UniqueName: \"kubernetes.io/projected/4ece572b-edc2-46be-adff-dfaed70fadf5-kube-api-access-xtcg5\") pod \"dnsmasq-dns-57d65f699f-tttcp\" (UID: \"4ece572b-edc2-46be-adff-dfaed70fadf5\") " pod="openstack/dnsmasq-dns-57d65f699f-tttcp" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.492136 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ece572b-edc2-46be-adff-dfaed70fadf5-config\") pod \"dnsmasq-dns-57d65f699f-tttcp\" (UID: \"4ece572b-edc2-46be-adff-dfaed70fadf5\") " pod="openstack/dnsmasq-dns-57d65f699f-tttcp" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.492168 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ece572b-edc2-46be-adff-dfaed70fadf5-dns-svc\") pod \"dnsmasq-dns-57d65f699f-tttcp\" (UID: \"4ece572b-edc2-46be-adff-dfaed70fadf5\") " pod="openstack/dnsmasq-dns-57d65f699f-tttcp" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.492213 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/91b6fc57-0463-4654-8595-09cc5f7f0088-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xt4bm\" (UID: \"91b6fc57-0463-4654-8595-09cc5f7f0088\") " pod="openstack/ovn-controller-metrics-xt4bm" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.492242 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/4ece572b-edc2-46be-adff-dfaed70fadf5-ovsdbserver-nb\") pod \"dnsmasq-dns-57d65f699f-tttcp\" (UID: \"4ece572b-edc2-46be-adff-dfaed70fadf5\") " pod="openstack/dnsmasq-dns-57d65f699f-tttcp" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.492290 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxvhv\" (UniqueName: \"kubernetes.io/projected/91b6fc57-0463-4654-8595-09cc5f7f0088-kube-api-access-cxvhv\") pod \"ovn-controller-metrics-xt4bm\" (UID: \"91b6fc57-0463-4654-8595-09cc5f7f0088\") " pod="openstack/ovn-controller-metrics-xt4bm" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.492319 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/91b6fc57-0463-4654-8595-09cc5f7f0088-ovs-rundir\") pod \"ovn-controller-metrics-xt4bm\" (UID: \"91b6fc57-0463-4654-8595-09cc5f7f0088\") " pod="openstack/ovn-controller-metrics-xt4bm" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.497252 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ece572b-edc2-46be-adff-dfaed70fadf5-config\") pod \"dnsmasq-dns-57d65f699f-tttcp\" (UID: \"4ece572b-edc2-46be-adff-dfaed70fadf5\") " pod="openstack/dnsmasq-dns-57d65f699f-tttcp" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.498888 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ece572b-edc2-46be-adff-dfaed70fadf5-dns-svc\") pod \"dnsmasq-dns-57d65f699f-tttcp\" (UID: \"4ece572b-edc2-46be-adff-dfaed70fadf5\") " pod="openstack/dnsmasq-dns-57d65f699f-tttcp" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.504478 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/4ece572b-edc2-46be-adff-dfaed70fadf5-ovsdbserver-nb\") pod \"dnsmasq-dns-57d65f699f-tttcp\" (UID: \"4ece572b-edc2-46be-adff-dfaed70fadf5\") " pod="openstack/dnsmasq-dns-57d65f699f-tttcp" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.505515 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-df4tr"] Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.524634 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtcg5\" (UniqueName: \"kubernetes.io/projected/4ece572b-edc2-46be-adff-dfaed70fadf5-kube-api-access-xtcg5\") pod \"dnsmasq-dns-57d65f699f-tttcp\" (UID: \"4ece572b-edc2-46be-adff-dfaed70fadf5\") " pod="openstack/dnsmasq-dns-57d65f699f-tttcp" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.529248 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.560659 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.560809 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.564084 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-df4tr"] Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.564707 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-hjfc6" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.564910 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.565024 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.580475 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.582155 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.587620 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.592588 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-kvf2g" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.592750 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.592851 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.596590 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/56f90ac1-34d1-4101-b5cc-37e76200d22a-ovn-rundir\") pod \"ovn-northd-0\" 
(UID: \"56f90ac1-34d1-4101-b5cc-37e76200d22a\") " pod="openstack/ovn-northd-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.596635 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56f90ac1-34d1-4101-b5cc-37e76200d22a-config\") pod \"ovn-northd-0\" (UID: \"56f90ac1-34d1-4101-b5cc-37e76200d22a\") " pod="openstack/ovn-northd-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.596677 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxvhv\" (UniqueName: \"kubernetes.io/projected/91b6fc57-0463-4654-8595-09cc5f7f0088-kube-api-access-cxvhv\") pod \"ovn-controller-metrics-xt4bm\" (UID: \"91b6fc57-0463-4654-8595-09cc5f7f0088\") " pod="openstack/ovn-controller-metrics-xt4bm" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.596705 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/91b6fc57-0463-4654-8595-09cc5f7f0088-ovs-rundir\") pod \"ovn-controller-metrics-xt4bm\" (UID: \"91b6fc57-0463-4654-8595-09cc5f7f0088\") " pod="openstack/ovn-controller-metrics-xt4bm" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.596825 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56f90ac1-34d1-4101-b5cc-37e76200d22a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"56f90ac1-34d1-4101-b5cc-37e76200d22a\") " pod="openstack/ovn-northd-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.596867 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56f90ac1-34d1-4101-b5cc-37e76200d22a-scripts\") pod \"ovn-northd-0\" (UID: \"56f90ac1-34d1-4101-b5cc-37e76200d22a\") " pod="openstack/ovn-northd-0" Mar 14 08:48:26 crc 
kubenswrapper[4886]: I0314 08:48:26.596899 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/56f90ac1-34d1-4101-b5cc-37e76200d22a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"56f90ac1-34d1-4101-b5cc-37e76200d22a\") " pod="openstack/ovn-northd-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.596929 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91b6fc57-0463-4654-8595-09cc5f7f0088-config\") pod \"ovn-controller-metrics-xt4bm\" (UID: \"91b6fc57-0463-4654-8595-09cc5f7f0088\") " pod="openstack/ovn-controller-metrics-xt4bm" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.596976 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/56f90ac1-34d1-4101-b5cc-37e76200d22a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"56f90ac1-34d1-4101-b5cc-37e76200d22a\") " pod="openstack/ovn-northd-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.597009 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qcg7\" (UniqueName: \"kubernetes.io/projected/56f90ac1-34d1-4101-b5cc-37e76200d22a-kube-api-access-9qcg7\") pod \"ovn-northd-0\" (UID: \"56f90ac1-34d1-4101-b5cc-37e76200d22a\") " pod="openstack/ovn-northd-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.597065 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/91b6fc57-0463-4654-8595-09cc5f7f0088-ovn-rundir\") pod \"ovn-controller-metrics-xt4bm\" (UID: \"91b6fc57-0463-4654-8595-09cc5f7f0088\") " pod="openstack/ovn-controller-metrics-xt4bm" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.597092 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91b6fc57-0463-4654-8595-09cc5f7f0088-combined-ca-bundle\") pod \"ovn-controller-metrics-xt4bm\" (UID: \"91b6fc57-0463-4654-8595-09cc5f7f0088\") " pod="openstack/ovn-controller-metrics-xt4bm" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.597493 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/91b6fc57-0463-4654-8595-09cc5f7f0088-ovs-rundir\") pod \"ovn-controller-metrics-xt4bm\" (UID: \"91b6fc57-0463-4654-8595-09cc5f7f0088\") " pod="openstack/ovn-controller-metrics-xt4bm" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.599763 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91b6fc57-0463-4654-8595-09cc5f7f0088-config\") pod \"ovn-controller-metrics-xt4bm\" (UID: \"91b6fc57-0463-4654-8595-09cc5f7f0088\") " pod="openstack/ovn-controller-metrics-xt4bm" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.599817 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/91b6fc57-0463-4654-8595-09cc5f7f0088-ovn-rundir\") pod \"ovn-controller-metrics-xt4bm\" (UID: \"91b6fc57-0463-4654-8595-09cc5f7f0088\") " pod="openstack/ovn-controller-metrics-xt4bm" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.600040 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/91b6fc57-0463-4654-8595-09cc5f7f0088-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xt4bm\" (UID: \"91b6fc57-0463-4654-8595-09cc5f7f0088\") " pod="openstack/ovn-controller-metrics-xt4bm" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.601707 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/91b6fc57-0463-4654-8595-09cc5f7f0088-combined-ca-bundle\") pod \"ovn-controller-metrics-xt4bm\" (UID: \"91b6fc57-0463-4654-8595-09cc5f7f0088\") " pod="openstack/ovn-controller-metrics-xt4bm" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.605542 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.607483 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.610029 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/91b6fc57-0463-4654-8595-09cc5f7f0088-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xt4bm\" (UID: \"91b6fc57-0463-4654-8595-09cc5f7f0088\") " pod="openstack/ovn-controller-metrics-xt4bm" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.617285 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-qhcn4"] Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.619450 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-qhcn4" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.625799 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-qhcn4"] Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.626498 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.643844 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxvhv\" (UniqueName: \"kubernetes.io/projected/91b6fc57-0463-4654-8595-09cc5f7f0088-kube-api-access-cxvhv\") pod \"ovn-controller-metrics-xt4bm\" (UID: \"91b6fc57-0463-4654-8595-09cc5f7f0088\") " pod="openstack/ovn-controller-metrics-xt4bm" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.646008 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d65f699f-tttcp" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.712562 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc98g\" (UniqueName: \"kubernetes.io/projected/b13c5527-179a-440c-bca1-379cab773854-kube-api-access-tc98g\") pod \"swift-storage-0\" (UID: \"b13c5527-179a-440c-bca1-379cab773854\") " pod="openstack/swift-storage-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.712626 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"b13c5527-179a-440c-bca1-379cab773854\") " pod="openstack/swift-storage-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.712703 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b13c5527-179a-440c-bca1-379cab773854-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"b13c5527-179a-440c-bca1-379cab773854\") " pod="openstack/swift-storage-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.712751 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/56f90ac1-34d1-4101-b5cc-37e76200d22a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"56f90ac1-34d1-4101-b5cc-37e76200d22a\") " pod="openstack/ovn-northd-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.712776 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56f90ac1-34d1-4101-b5cc-37e76200d22a-config\") pod \"ovn-northd-0\" (UID: \"56f90ac1-34d1-4101-b5cc-37e76200d22a\") " pod="openstack/ovn-northd-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.712810 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b13c5527-179a-440c-bca1-379cab773854-lock\") pod \"swift-storage-0\" (UID: \"b13c5527-179a-440c-bca1-379cab773854\") " pod="openstack/swift-storage-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.712852 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56f90ac1-34d1-4101-b5cc-37e76200d22a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"56f90ac1-34d1-4101-b5cc-37e76200d22a\") " pod="openstack/ovn-northd-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.712881 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56f90ac1-34d1-4101-b5cc-37e76200d22a-scripts\") pod \"ovn-northd-0\" (UID: \"56f90ac1-34d1-4101-b5cc-37e76200d22a\") " pod="openstack/ovn-northd-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 
08:48:26.712916 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b13c5527-179a-440c-bca1-379cab773854-etc-swift\") pod \"swift-storage-0\" (UID: \"b13c5527-179a-440c-bca1-379cab773854\") " pod="openstack/swift-storage-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.712936 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b13c5527-179a-440c-bca1-379cab773854-cache\") pod \"swift-storage-0\" (UID: \"b13c5527-179a-440c-bca1-379cab773854\") " pod="openstack/swift-storage-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.712963 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/56f90ac1-34d1-4101-b5cc-37e76200d22a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"56f90ac1-34d1-4101-b5cc-37e76200d22a\") " pod="openstack/ovn-northd-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.713013 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/56f90ac1-34d1-4101-b5cc-37e76200d22a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"56f90ac1-34d1-4101-b5cc-37e76200d22a\") " pod="openstack/ovn-northd-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.713041 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qcg7\" (UniqueName: \"kubernetes.io/projected/56f90ac1-34d1-4101-b5cc-37e76200d22a-kube-api-access-9qcg7\") pod \"ovn-northd-0\" (UID: \"56f90ac1-34d1-4101-b5cc-37e76200d22a\") " pod="openstack/ovn-northd-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.715855 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/56f90ac1-34d1-4101-b5cc-37e76200d22a-scripts\") pod \"ovn-northd-0\" (UID: \"56f90ac1-34d1-4101-b5cc-37e76200d22a\") " pod="openstack/ovn-northd-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.716014 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/56f90ac1-34d1-4101-b5cc-37e76200d22a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"56f90ac1-34d1-4101-b5cc-37e76200d22a\") " pod="openstack/ovn-northd-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.716709 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56f90ac1-34d1-4101-b5cc-37e76200d22a-config\") pod \"ovn-northd-0\" (UID: \"56f90ac1-34d1-4101-b5cc-37e76200d22a\") " pod="openstack/ovn-northd-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.720888 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/56f90ac1-34d1-4101-b5cc-37e76200d22a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"56f90ac1-34d1-4101-b5cc-37e76200d22a\") " pod="openstack/ovn-northd-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.720995 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56f90ac1-34d1-4101-b5cc-37e76200d22a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"56f90ac1-34d1-4101-b5cc-37e76200d22a\") " pod="openstack/ovn-northd-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.727094 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/56f90ac1-34d1-4101-b5cc-37e76200d22a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"56f90ac1-34d1-4101-b5cc-37e76200d22a\") " pod="openstack/ovn-northd-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.739958 4886 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qcg7\" (UniqueName: \"kubernetes.io/projected/56f90ac1-34d1-4101-b5cc-37e76200d22a-kube-api-access-9qcg7\") pod \"ovn-northd-0\" (UID: \"56f90ac1-34d1-4101-b5cc-37e76200d22a\") " pod="openstack/ovn-northd-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.813162 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-xt4bm" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.814097 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b13c5527-179a-440c-bca1-379cab773854-lock\") pod \"swift-storage-0\" (UID: \"b13c5527-179a-440c-bca1-379cab773854\") " pod="openstack/swift-storage-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.814158 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1701819-f673-4757-b2f4-6a3dd4da8601-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-qhcn4\" (UID: \"a1701819-f673-4757-b2f4-6a3dd4da8601\") " pod="openstack/dnsmasq-dns-b8fbc5445-qhcn4" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.814184 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1701819-f673-4757-b2f4-6a3dd4da8601-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-qhcn4\" (UID: \"a1701819-f673-4757-b2f4-6a3dd4da8601\") " pod="openstack/dnsmasq-dns-b8fbc5445-qhcn4" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.814235 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-789k4\" (UniqueName: \"kubernetes.io/projected/a1701819-f673-4757-b2f4-6a3dd4da8601-kube-api-access-789k4\") pod \"dnsmasq-dns-b8fbc5445-qhcn4\" (UID: 
\"a1701819-f673-4757-b2f4-6a3dd4da8601\") " pod="openstack/dnsmasq-dns-b8fbc5445-qhcn4" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.814282 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b13c5527-179a-440c-bca1-379cab773854-etc-swift\") pod \"swift-storage-0\" (UID: \"b13c5527-179a-440c-bca1-379cab773854\") " pod="openstack/swift-storage-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.814306 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b13c5527-179a-440c-bca1-379cab773854-cache\") pod \"swift-storage-0\" (UID: \"b13c5527-179a-440c-bca1-379cab773854\") " pod="openstack/swift-storage-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.814370 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc98g\" (UniqueName: \"kubernetes.io/projected/b13c5527-179a-440c-bca1-379cab773854-kube-api-access-tc98g\") pod \"swift-storage-0\" (UID: \"b13c5527-179a-440c-bca1-379cab773854\") " pod="openstack/swift-storage-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.814395 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"b13c5527-179a-440c-bca1-379cab773854\") " pod="openstack/swift-storage-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.815364 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"b13c5527-179a-440c-bca1-379cab773854\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/swift-storage-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.815451 4886 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b13c5527-179a-440c-bca1-379cab773854-cache\") pod \"swift-storage-0\" (UID: \"b13c5527-179a-440c-bca1-379cab773854\") " pod="openstack/swift-storage-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.815462 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b13c5527-179a-440c-bca1-379cab773854-lock\") pod \"swift-storage-0\" (UID: \"b13c5527-179a-440c-bca1-379cab773854\") " pod="openstack/swift-storage-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.815507 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1701819-f673-4757-b2f4-6a3dd4da8601-config\") pod \"dnsmasq-dns-b8fbc5445-qhcn4\" (UID: \"a1701819-f673-4757-b2f4-6a3dd4da8601\") " pod="openstack/dnsmasq-dns-b8fbc5445-qhcn4" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.815536 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b13c5527-179a-440c-bca1-379cab773854-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"b13c5527-179a-440c-bca1-379cab773854\") " pod="openstack/swift-storage-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.815618 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1701819-f673-4757-b2f4-6a3dd4da8601-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-qhcn4\" (UID: \"a1701819-f673-4757-b2f4-6a3dd4da8601\") " pod="openstack/dnsmasq-dns-b8fbc5445-qhcn4" Mar 14 08:48:26 crc kubenswrapper[4886]: E0314 08:48:26.815538 4886 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 14 08:48:26 crc kubenswrapper[4886]: E0314 08:48:26.815708 4886 projected.go:194] Error preparing data for 
projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 14 08:48:26 crc kubenswrapper[4886]: E0314 08:48:26.815790 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b13c5527-179a-440c-bca1-379cab773854-etc-swift podName:b13c5527-179a-440c-bca1-379cab773854 nodeName:}" failed. No retries permitted until 2026-03-14 08:48:27.315771128 +0000 UTC m=+1242.564222765 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b13c5527-179a-440c-bca1-379cab773854-etc-swift") pod "swift-storage-0" (UID: "b13c5527-179a-440c-bca1-379cab773854") : configmap "swift-ring-files" not found Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.821406 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b13c5527-179a-440c-bca1-379cab773854-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"b13c5527-179a-440c-bca1-379cab773854\") " pod="openstack/swift-storage-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.841430 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc98g\" (UniqueName: \"kubernetes.io/projected/b13c5527-179a-440c-bca1-379cab773854-kube-api-access-tc98g\") pod \"swift-storage-0\" (UID: \"b13c5527-179a-440c-bca1-379cab773854\") " pod="openstack/swift-storage-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.881877 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"b13c5527-179a-440c-bca1-379cab773854\") " pod="openstack/swift-storage-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.919197 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/a1701819-f673-4757-b2f4-6a3dd4da8601-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-qhcn4\" (UID: \"a1701819-f673-4757-b2f4-6a3dd4da8601\") " pod="openstack/dnsmasq-dns-b8fbc5445-qhcn4" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.919239 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1701819-f673-4757-b2f4-6a3dd4da8601-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-qhcn4\" (UID: \"a1701819-f673-4757-b2f4-6a3dd4da8601\") " pod="openstack/dnsmasq-dns-b8fbc5445-qhcn4" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.919283 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-789k4\" (UniqueName: \"kubernetes.io/projected/a1701819-f673-4757-b2f4-6a3dd4da8601-kube-api-access-789k4\") pod \"dnsmasq-dns-b8fbc5445-qhcn4\" (UID: \"a1701819-f673-4757-b2f4-6a3dd4da8601\") " pod="openstack/dnsmasq-dns-b8fbc5445-qhcn4" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.919439 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1701819-f673-4757-b2f4-6a3dd4da8601-config\") pod \"dnsmasq-dns-b8fbc5445-qhcn4\" (UID: \"a1701819-f673-4757-b2f4-6a3dd4da8601\") " pod="openstack/dnsmasq-dns-b8fbc5445-qhcn4" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.919480 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1701819-f673-4757-b2f4-6a3dd4da8601-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-qhcn4\" (UID: \"a1701819-f673-4757-b2f4-6a3dd4da8601\") " pod="openstack/dnsmasq-dns-b8fbc5445-qhcn4" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.920249 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1701819-f673-4757-b2f4-6a3dd4da8601-ovsdbserver-sb\") pod 
\"dnsmasq-dns-b8fbc5445-qhcn4\" (UID: \"a1701819-f673-4757-b2f4-6a3dd4da8601\") " pod="openstack/dnsmasq-dns-b8fbc5445-qhcn4" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.920334 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1701819-f673-4757-b2f4-6a3dd4da8601-config\") pod \"dnsmasq-dns-b8fbc5445-qhcn4\" (UID: \"a1701819-f673-4757-b2f4-6a3dd4da8601\") " pod="openstack/dnsmasq-dns-b8fbc5445-qhcn4" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.920453 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1701819-f673-4757-b2f4-6a3dd4da8601-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-qhcn4\" (UID: \"a1701819-f673-4757-b2f4-6a3dd4da8601\") " pod="openstack/dnsmasq-dns-b8fbc5445-qhcn4" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.920980 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1701819-f673-4757-b2f4-6a3dd4da8601-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-qhcn4\" (UID: \"a1701819-f673-4757-b2f4-6a3dd4da8601\") " pod="openstack/dnsmasq-dns-b8fbc5445-qhcn4" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.942607 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-wfhhk" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.943481 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-789k4\" (UniqueName: \"kubernetes.io/projected/a1701819-f673-4757-b2f4-6a3dd4da8601-kube-api-access-789k4\") pod \"dnsmasq-dns-b8fbc5445-qhcn4\" (UID: \"a1701819-f673-4757-b2f4-6a3dd4da8601\") " pod="openstack/dnsmasq-dns-b8fbc5445-qhcn4" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.960305 4886 generic.go:334] "Generic (PLEG): container finished" podID="2566e042-16c5-4c44-951f-447be9c66f4e" containerID="b7d54eeae704f314df30733be5759c60f40f8ae85c29b4003b4cfcfe326f64c0" exitCode=0 Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.960396 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-df4tr" event={"ID":"2566e042-16c5-4c44-951f-447be9c66f4e","Type":"ContainerDied","Data":"b7d54eeae704f314df30733be5759c60f40f8ae85c29b4003b4cfcfe326f64c0"} Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.960442 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-df4tr" event={"ID":"2566e042-16c5-4c44-951f-447be9c66f4e","Type":"ContainerStarted","Data":"94eae3de4b8052716a92f908232eeef2b77fbeaf24ef9fa89d5aaabfd3a06035"} Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.968436 4886 generic.go:334] "Generic (PLEG): container finished" podID="32d7ee30-d1ca-45c0-a3e7-429463827560" containerID="1f4ca0df929152e44ea4fa5f367f9290d38ae7af09ea82780f1f47c4a65da403" exitCode=0 Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.969590 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-wfhhk" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.969746 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-wfhhk" event={"ID":"32d7ee30-d1ca-45c0-a3e7-429463827560","Type":"ContainerDied","Data":"1f4ca0df929152e44ea4fa5f367f9290d38ae7af09ea82780f1f47c4a65da403"} Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.969774 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-wfhhk" event={"ID":"32d7ee30-d1ca-45c0-a3e7-429463827560","Type":"ContainerDied","Data":"a980fc4846653ea2e0cdd5afdb770eff49e4fb8dbcbf2b60081cd5f50e106f97"} Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.969795 4886 scope.go:117] "RemoveContainer" containerID="1f4ca0df929152e44ea4fa5f367f9290d38ae7af09ea82780f1f47c4a65da403" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.971999 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 14 08:48:26 crc kubenswrapper[4886]: I0314 08:48:26.978407 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-qhcn4" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.034101 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-tmfmz"] Mar 14 08:48:27 crc kubenswrapper[4886]: E0314 08:48:27.034718 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32d7ee30-d1ca-45c0-a3e7-429463827560" containerName="init" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.034742 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="32d7ee30-d1ca-45c0-a3e7-429463827560" containerName="init" Mar 14 08:48:27 crc kubenswrapper[4886]: E0314 08:48:27.034768 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32d7ee30-d1ca-45c0-a3e7-429463827560" containerName="dnsmasq-dns" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.034777 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="32d7ee30-d1ca-45c0-a3e7-429463827560" containerName="dnsmasq-dns" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.035022 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="32d7ee30-d1ca-45c0-a3e7-429463827560" containerName="dnsmasq-dns" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.035981 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-tmfmz" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.043879 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.044060 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.044226 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.126446 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pj78\" (UniqueName: \"kubernetes.io/projected/32d7ee30-d1ca-45c0-a3e7-429463827560-kube-api-access-2pj78\") pod \"32d7ee30-d1ca-45c0-a3e7-429463827560\" (UID: \"32d7ee30-d1ca-45c0-a3e7-429463827560\") " Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.126538 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32d7ee30-d1ca-45c0-a3e7-429463827560-dns-svc\") pod \"32d7ee30-d1ca-45c0-a3e7-429463827560\" (UID: \"32d7ee30-d1ca-45c0-a3e7-429463827560\") " Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.126574 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32d7ee30-d1ca-45c0-a3e7-429463827560-config\") pod \"32d7ee30-d1ca-45c0-a3e7-429463827560\" (UID: \"32d7ee30-d1ca-45c0-a3e7-429463827560\") " Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.126885 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3ef47526-1832-41c2-9359-e407c4add215-etc-swift\") pod \"swift-ring-rebalance-tmfmz\" (UID: \"3ef47526-1832-41c2-9359-e407c4add215\") " 
pod="openstack/swift-ring-rebalance-tmfmz" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.126934 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3ef47526-1832-41c2-9359-e407c4add215-dispersionconf\") pod \"swift-ring-rebalance-tmfmz\" (UID: \"3ef47526-1832-41c2-9359-e407c4add215\") " pod="openstack/swift-ring-rebalance-tmfmz" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.127045 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3ef47526-1832-41c2-9359-e407c4add215-ring-data-devices\") pod \"swift-ring-rebalance-tmfmz\" (UID: \"3ef47526-1832-41c2-9359-e407c4add215\") " pod="openstack/swift-ring-rebalance-tmfmz" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.127068 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3ef47526-1832-41c2-9359-e407c4add215-swiftconf\") pod \"swift-ring-rebalance-tmfmz\" (UID: \"3ef47526-1832-41c2-9359-e407c4add215\") " pod="openstack/swift-ring-rebalance-tmfmz" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.127132 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef47526-1832-41c2-9359-e407c4add215-combined-ca-bundle\") pod \"swift-ring-rebalance-tmfmz\" (UID: \"3ef47526-1832-41c2-9359-e407c4add215\") " pod="openstack/swift-ring-rebalance-tmfmz" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.127170 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwcp2\" (UniqueName: \"kubernetes.io/projected/3ef47526-1832-41c2-9359-e407c4add215-kube-api-access-bwcp2\") pod \"swift-ring-rebalance-tmfmz\" (UID: 
\"3ef47526-1832-41c2-9359-e407c4add215\") " pod="openstack/swift-ring-rebalance-tmfmz" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.127260 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ef47526-1832-41c2-9359-e407c4add215-scripts\") pod \"swift-ring-rebalance-tmfmz\" (UID: \"3ef47526-1832-41c2-9359-e407c4add215\") " pod="openstack/swift-ring-rebalance-tmfmz" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.132287 4886 scope.go:117] "RemoveContainer" containerID="69c6da4326d82b91b3435d65a86dea2af424d4465906195dd2032d147f731472" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.136248 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32d7ee30-d1ca-45c0-a3e7-429463827560-kube-api-access-2pj78" (OuterVolumeSpecName: "kube-api-access-2pj78") pod "32d7ee30-d1ca-45c0-a3e7-429463827560" (UID: "32d7ee30-d1ca-45c0-a3e7-429463827560"). InnerVolumeSpecName "kube-api-access-2pj78". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.137665 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-tmfmz"] Mar 14 08:48:27 crc kubenswrapper[4886]: E0314 08:48:27.138335 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-bwcp2 ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-tmfmz" podUID="3ef47526-1832-41c2-9359-e407c4add215" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.171188 4886 scope.go:117] "RemoveContainer" containerID="1f4ca0df929152e44ea4fa5f367f9290d38ae7af09ea82780f1f47c4a65da403" Mar 14 08:48:27 crc kubenswrapper[4886]: E0314 08:48:27.171965 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f4ca0df929152e44ea4fa5f367f9290d38ae7af09ea82780f1f47c4a65da403\": container with ID starting with 1f4ca0df929152e44ea4fa5f367f9290d38ae7af09ea82780f1f47c4a65da403 not found: ID does not exist" containerID="1f4ca0df929152e44ea4fa5f367f9290d38ae7af09ea82780f1f47c4a65da403" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.172072 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f4ca0df929152e44ea4fa5f367f9290d38ae7af09ea82780f1f47c4a65da403"} err="failed to get container status \"1f4ca0df929152e44ea4fa5f367f9290d38ae7af09ea82780f1f47c4a65da403\": rpc error: code = NotFound desc = could not find container \"1f4ca0df929152e44ea4fa5f367f9290d38ae7af09ea82780f1f47c4a65da403\": container with ID starting with 1f4ca0df929152e44ea4fa5f367f9290d38ae7af09ea82780f1f47c4a65da403 not found: ID does not exist" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.172141 4886 scope.go:117] "RemoveContainer" 
containerID="69c6da4326d82b91b3435d65a86dea2af424d4465906195dd2032d147f731472" Mar 14 08:48:27 crc kubenswrapper[4886]: E0314 08:48:27.173465 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69c6da4326d82b91b3435d65a86dea2af424d4465906195dd2032d147f731472\": container with ID starting with 69c6da4326d82b91b3435d65a86dea2af424d4465906195dd2032d147f731472 not found: ID does not exist" containerID="69c6da4326d82b91b3435d65a86dea2af424d4465906195dd2032d147f731472" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.173532 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69c6da4326d82b91b3435d65a86dea2af424d4465906195dd2032d147f731472"} err="failed to get container status \"69c6da4326d82b91b3435d65a86dea2af424d4465906195dd2032d147f731472\": rpc error: code = NotFound desc = could not find container \"69c6da4326d82b91b3435d65a86dea2af424d4465906195dd2032d147f731472\": container with ID starting with 69c6da4326d82b91b3435d65a86dea2af424d4465906195dd2032d147f731472 not found: ID does not exist" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.195041 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32d7ee30-d1ca-45c0-a3e7-429463827560-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "32d7ee30-d1ca-45c0-a3e7-429463827560" (UID: "32d7ee30-d1ca-45c0-a3e7-429463827560"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.211617 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32d7ee30-d1ca-45c0-a3e7-429463827560-config" (OuterVolumeSpecName: "config") pod "32d7ee30-d1ca-45c0-a3e7-429463827560" (UID: "32d7ee30-d1ca-45c0-a3e7-429463827560"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.214321 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-f4mxt"] Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.215402 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-f4mxt" Mar 14 08:48:27 crc kubenswrapper[4886]: W0314 08:48:27.223798 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ece572b_edc2_46be_adff_dfaed70fadf5.slice/crio-8acebf0ac5c4e8b5dede9e2cbcbbd9f121720f4a01169f3eef5803553c6a8ac0 WatchSource:0}: Error finding container 8acebf0ac5c4e8b5dede9e2cbcbbd9f121720f4a01169f3eef5803553c6a8ac0: Status 404 returned error can't find the container with id 8acebf0ac5c4e8b5dede9e2cbcbbd9f121720f4a01169f3eef5803553c6a8ac0 Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.229539 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwcp2\" (UniqueName: \"kubernetes.io/projected/3ef47526-1832-41c2-9359-e407c4add215-kube-api-access-bwcp2\") pod \"swift-ring-rebalance-tmfmz\" (UID: \"3ef47526-1832-41c2-9359-e407c4add215\") " pod="openstack/swift-ring-rebalance-tmfmz" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.229611 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ef47526-1832-41c2-9359-e407c4add215-scripts\") pod \"swift-ring-rebalance-tmfmz\" (UID: \"3ef47526-1832-41c2-9359-e407c4add215\") " pod="openstack/swift-ring-rebalance-tmfmz" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.229655 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3ef47526-1832-41c2-9359-e407c4add215-etc-swift\") pod \"swift-ring-rebalance-tmfmz\" (UID: 
\"3ef47526-1832-41c2-9359-e407c4add215\") " pod="openstack/swift-ring-rebalance-tmfmz" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.229678 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3ef47526-1832-41c2-9359-e407c4add215-dispersionconf\") pod \"swift-ring-rebalance-tmfmz\" (UID: \"3ef47526-1832-41c2-9359-e407c4add215\") " pod="openstack/swift-ring-rebalance-tmfmz" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.229740 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3ef47526-1832-41c2-9359-e407c4add215-ring-data-devices\") pod \"swift-ring-rebalance-tmfmz\" (UID: \"3ef47526-1832-41c2-9359-e407c4add215\") " pod="openstack/swift-ring-rebalance-tmfmz" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.229758 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3ef47526-1832-41c2-9359-e407c4add215-swiftconf\") pod \"swift-ring-rebalance-tmfmz\" (UID: \"3ef47526-1832-41c2-9359-e407c4add215\") " pod="openstack/swift-ring-rebalance-tmfmz" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.229788 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef47526-1832-41c2-9359-e407c4add215-combined-ca-bundle\") pod \"swift-ring-rebalance-tmfmz\" (UID: \"3ef47526-1832-41c2-9359-e407c4add215\") " pod="openstack/swift-ring-rebalance-tmfmz" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.229831 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pj78\" (UniqueName: \"kubernetes.io/projected/32d7ee30-d1ca-45c0-a3e7-429463827560-kube-api-access-2pj78\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.229843 4886 reconciler_common.go:293] 
"Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32d7ee30-d1ca-45c0-a3e7-429463827560-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.229852 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32d7ee30-d1ca-45c0-a3e7-429463827560-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.230770 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3ef47526-1832-41c2-9359-e407c4add215-etc-swift\") pod \"swift-ring-rebalance-tmfmz\" (UID: \"3ef47526-1832-41c2-9359-e407c4add215\") " pod="openstack/swift-ring-rebalance-tmfmz" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.231424 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3ef47526-1832-41c2-9359-e407c4add215-ring-data-devices\") pod \"swift-ring-rebalance-tmfmz\" (UID: \"3ef47526-1832-41c2-9359-e407c4add215\") " pod="openstack/swift-ring-rebalance-tmfmz" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.231723 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ef47526-1832-41c2-9359-e407c4add215-scripts\") pod \"swift-ring-rebalance-tmfmz\" (UID: \"3ef47526-1832-41c2-9359-e407c4add215\") " pod="openstack/swift-ring-rebalance-tmfmz" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.232175 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-f4mxt"] Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.237606 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef47526-1832-41c2-9359-e407c4add215-combined-ca-bundle\") pod \"swift-ring-rebalance-tmfmz\" (UID: 
\"3ef47526-1832-41c2-9359-e407c4add215\") " pod="openstack/swift-ring-rebalance-tmfmz" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.266383 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3ef47526-1832-41c2-9359-e407c4add215-swiftconf\") pod \"swift-ring-rebalance-tmfmz\" (UID: \"3ef47526-1832-41c2-9359-e407c4add215\") " pod="openstack/swift-ring-rebalance-tmfmz" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.266622 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3ef47526-1832-41c2-9359-e407c4add215-dispersionconf\") pod \"swift-ring-rebalance-tmfmz\" (UID: \"3ef47526-1832-41c2-9359-e407c4add215\") " pod="openstack/swift-ring-rebalance-tmfmz" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.272878 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-tmfmz"] Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.275804 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwcp2\" (UniqueName: \"kubernetes.io/projected/3ef47526-1832-41c2-9359-e407c4add215-kube-api-access-bwcp2\") pod \"swift-ring-rebalance-tmfmz\" (UID: \"3ef47526-1832-41c2-9359-e407c4add215\") " pod="openstack/swift-ring-rebalance-tmfmz" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.333543 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g79bm\" (UniqueName: \"kubernetes.io/projected/8658a67d-fe04-40af-a495-bb3d50c9a9db-kube-api-access-g79bm\") pod \"swift-ring-rebalance-f4mxt\" (UID: \"8658a67d-fe04-40af-a495-bb3d50c9a9db\") " pod="openstack/swift-ring-rebalance-f4mxt" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.333633 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/b13c5527-179a-440c-bca1-379cab773854-etc-swift\") pod \"swift-storage-0\" (UID: \"b13c5527-179a-440c-bca1-379cab773854\") " pod="openstack/swift-storage-0" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.333668 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8658a67d-fe04-40af-a495-bb3d50c9a9db-ring-data-devices\") pod \"swift-ring-rebalance-f4mxt\" (UID: \"8658a67d-fe04-40af-a495-bb3d50c9a9db\") " pod="openstack/swift-ring-rebalance-f4mxt" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.333686 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8658a67d-fe04-40af-a495-bb3d50c9a9db-combined-ca-bundle\") pod \"swift-ring-rebalance-f4mxt\" (UID: \"8658a67d-fe04-40af-a495-bb3d50c9a9db\") " pod="openstack/swift-ring-rebalance-f4mxt" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.333709 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8658a67d-fe04-40af-a495-bb3d50c9a9db-scripts\") pod \"swift-ring-rebalance-f4mxt\" (UID: \"8658a67d-fe04-40af-a495-bb3d50c9a9db\") " pod="openstack/swift-ring-rebalance-f4mxt" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.333736 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8658a67d-fe04-40af-a495-bb3d50c9a9db-etc-swift\") pod \"swift-ring-rebalance-f4mxt\" (UID: \"8658a67d-fe04-40af-a495-bb3d50c9a9db\") " pod="openstack/swift-ring-rebalance-f4mxt" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.333933 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/8658a67d-fe04-40af-a495-bb3d50c9a9db-dispersionconf\") pod \"swift-ring-rebalance-f4mxt\" (UID: \"8658a67d-fe04-40af-a495-bb3d50c9a9db\") " pod="openstack/swift-ring-rebalance-f4mxt" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.333955 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8658a67d-fe04-40af-a495-bb3d50c9a9db-swiftconf\") pod \"swift-ring-rebalance-f4mxt\" (UID: \"8658a67d-fe04-40af-a495-bb3d50c9a9db\") " pod="openstack/swift-ring-rebalance-f4mxt" Mar 14 08:48:27 crc kubenswrapper[4886]: E0314 08:48:27.334498 4886 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 14 08:48:27 crc kubenswrapper[4886]: E0314 08:48:27.334527 4886 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 14 08:48:27 crc kubenswrapper[4886]: E0314 08:48:27.334580 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b13c5527-179a-440c-bca1-379cab773854-etc-swift podName:b13c5527-179a-440c-bca1-379cab773854 nodeName:}" failed. No retries permitted until 2026-03-14 08:48:28.334557502 +0000 UTC m=+1243.583009139 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b13c5527-179a-440c-bca1-379cab773854-etc-swift") pod "swift-storage-0" (UID: "b13c5527-179a-440c-bca1-379cab773854") : configmap "swift-ring-files" not found Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.359432 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-tttcp"] Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.404588 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-wfhhk"] Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.411195 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-wfhhk"] Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.419399 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xt4bm"] Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.436000 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8658a67d-fe04-40af-a495-bb3d50c9a9db-scripts\") pod \"swift-ring-rebalance-f4mxt\" (UID: \"8658a67d-fe04-40af-a495-bb3d50c9a9db\") " pod="openstack/swift-ring-rebalance-f4mxt" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.436357 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8658a67d-fe04-40af-a495-bb3d50c9a9db-etc-swift\") pod \"swift-ring-rebalance-f4mxt\" (UID: \"8658a67d-fe04-40af-a495-bb3d50c9a9db\") " pod="openstack/swift-ring-rebalance-f4mxt" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.436421 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32d7ee30-d1ca-45c0-a3e7-429463827560" path="/var/lib/kubelet/pods/32d7ee30-d1ca-45c0-a3e7-429463827560/volumes" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.436567 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8658a67d-fe04-40af-a495-bb3d50c9a9db-dispersionconf\") pod \"swift-ring-rebalance-f4mxt\" (UID: \"8658a67d-fe04-40af-a495-bb3d50c9a9db\") " pod="openstack/swift-ring-rebalance-f4mxt" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.436644 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8658a67d-fe04-40af-a495-bb3d50c9a9db-swiftconf\") pod \"swift-ring-rebalance-f4mxt\" (UID: \"8658a67d-fe04-40af-a495-bb3d50c9a9db\") " pod="openstack/swift-ring-rebalance-f4mxt" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.436814 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g79bm\" (UniqueName: \"kubernetes.io/projected/8658a67d-fe04-40af-a495-bb3d50c9a9db-kube-api-access-g79bm\") pod \"swift-ring-rebalance-f4mxt\" (UID: \"8658a67d-fe04-40af-a495-bb3d50c9a9db\") " pod="openstack/swift-ring-rebalance-f4mxt" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.436892 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8658a67d-fe04-40af-a495-bb3d50c9a9db-ring-data-devices\") pod \"swift-ring-rebalance-f4mxt\" (UID: \"8658a67d-fe04-40af-a495-bb3d50c9a9db\") " pod="openstack/swift-ring-rebalance-f4mxt" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.436910 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8658a67d-fe04-40af-a495-bb3d50c9a9db-combined-ca-bundle\") pod \"swift-ring-rebalance-f4mxt\" (UID: \"8658a67d-fe04-40af-a495-bb3d50c9a9db\") " pod="openstack/swift-ring-rebalance-f4mxt" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.436880 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/8658a67d-fe04-40af-a495-bb3d50c9a9db-scripts\") pod \"swift-ring-rebalance-f4mxt\" (UID: \"8658a67d-fe04-40af-a495-bb3d50c9a9db\") " pod="openstack/swift-ring-rebalance-f4mxt" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.438743 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8658a67d-fe04-40af-a495-bb3d50c9a9db-ring-data-devices\") pod \"swift-ring-rebalance-f4mxt\" (UID: \"8658a67d-fe04-40af-a495-bb3d50c9a9db\") " pod="openstack/swift-ring-rebalance-f4mxt" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.439524 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8658a67d-fe04-40af-a495-bb3d50c9a9db-etc-swift\") pod \"swift-ring-rebalance-f4mxt\" (UID: \"8658a67d-fe04-40af-a495-bb3d50c9a9db\") " pod="openstack/swift-ring-rebalance-f4mxt" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.442664 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8658a67d-fe04-40af-a495-bb3d50c9a9db-combined-ca-bundle\") pod \"swift-ring-rebalance-f4mxt\" (UID: \"8658a67d-fe04-40af-a495-bb3d50c9a9db\") " pod="openstack/swift-ring-rebalance-f4mxt" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.443222 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8658a67d-fe04-40af-a495-bb3d50c9a9db-swiftconf\") pod \"swift-ring-rebalance-f4mxt\" (UID: \"8658a67d-fe04-40af-a495-bb3d50c9a9db\") " pod="openstack/swift-ring-rebalance-f4mxt" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.443475 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8658a67d-fe04-40af-a495-bb3d50c9a9db-dispersionconf\") pod \"swift-ring-rebalance-f4mxt\" (UID: 
\"8658a67d-fe04-40af-a495-bb3d50c9a9db\") " pod="openstack/swift-ring-rebalance-f4mxt" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.469876 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g79bm\" (UniqueName: \"kubernetes.io/projected/8658a67d-fe04-40af-a495-bb3d50c9a9db-kube-api-access-g79bm\") pod \"swift-ring-rebalance-f4mxt\" (UID: \"8658a67d-fe04-40af-a495-bb3d50c9a9db\") " pod="openstack/swift-ring-rebalance-f4mxt" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.510805 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-df4tr" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.538341 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5q2nh\" (UniqueName: \"kubernetes.io/projected/2566e042-16c5-4c44-951f-447be9c66f4e-kube-api-access-5q2nh\") pod \"2566e042-16c5-4c44-951f-447be9c66f4e\" (UID: \"2566e042-16c5-4c44-951f-447be9c66f4e\") " Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.538404 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2566e042-16c5-4c44-951f-447be9c66f4e-config\") pod \"2566e042-16c5-4c44-951f-447be9c66f4e\" (UID: \"2566e042-16c5-4c44-951f-447be9c66f4e\") " Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.538553 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2566e042-16c5-4c44-951f-447be9c66f4e-dns-svc\") pod \"2566e042-16c5-4c44-951f-447be9c66f4e\" (UID: \"2566e042-16c5-4c44-951f-447be9c66f4e\") " Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.543303 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2566e042-16c5-4c44-951f-447be9c66f4e-kube-api-access-5q2nh" (OuterVolumeSpecName: "kube-api-access-5q2nh") pod 
"2566e042-16c5-4c44-951f-447be9c66f4e" (UID: "2566e042-16c5-4c44-951f-447be9c66f4e"). InnerVolumeSpecName "kube-api-access-5q2nh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.565197 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2566e042-16c5-4c44-951f-447be9c66f4e-config" (OuterVolumeSpecName: "config") pod "2566e042-16c5-4c44-951f-447be9c66f4e" (UID: "2566e042-16c5-4c44-951f-447be9c66f4e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.567893 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2566e042-16c5-4c44-951f-447be9c66f4e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2566e042-16c5-4c44-951f-447be9c66f4e" (UID: "2566e042-16c5-4c44-951f-447be9c66f4e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.571051 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-f4mxt" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.641323 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5q2nh\" (UniqueName: \"kubernetes.io/projected/2566e042-16c5-4c44-951f-447be9c66f4e-kube-api-access-5q2nh\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.641359 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2566e042-16c5-4c44-951f-447be9c66f4e-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.641369 4886 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2566e042-16c5-4c44-951f-447be9c66f4e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.793287 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-qhcn4"] Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.884818 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.983454 4886 generic.go:334] "Generic (PLEG): container finished" podID="4ece572b-edc2-46be-adff-dfaed70fadf5" containerID="c6ea87b541903a94e42f2b927667de86dfdecb56a2c19654a7c040f7340e75d4" exitCode=0 Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.983569 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d65f699f-tttcp" event={"ID":"4ece572b-edc2-46be-adff-dfaed70fadf5","Type":"ContainerDied","Data":"c6ea87b541903a94e42f2b927667de86dfdecb56a2c19654a7c040f7340e75d4"} Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.983648 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d65f699f-tttcp" 
event={"ID":"4ece572b-edc2-46be-adff-dfaed70fadf5","Type":"ContainerStarted","Data":"8acebf0ac5c4e8b5dede9e2cbcbbd9f121720f4a01169f3eef5803553c6a8ac0"} Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.985300 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"56f90ac1-34d1-4101-b5cc-37e76200d22a","Type":"ContainerStarted","Data":"73d76ffe0b63775ca01edef65240d300c6c2b4791105d01023a8d131aa6a9c08"} Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.990911 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-qhcn4" event={"ID":"a1701819-f673-4757-b2f4-6a3dd4da8601","Type":"ContainerStarted","Data":"d17ace1c14e706c11391930423edd6a53bc64690c25b53eaa651ae1b4c28d9c2"} Mar 14 08:48:27 crc kubenswrapper[4886]: I0314 08:48:27.995959 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xt4bm" event={"ID":"91b6fc57-0463-4654-8595-09cc5f7f0088","Type":"ContainerStarted","Data":"bdfe72ad5f902e129bfd80598e8ee6e7afd833dfe8c7f81174b08432cf07b4d9"} Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.003967 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-df4tr" event={"ID":"2566e042-16c5-4c44-951f-447be9c66f4e","Type":"ContainerDied","Data":"94eae3de4b8052716a92f908232eeef2b77fbeaf24ef9fa89d5aaabfd3a06035"} Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.004061 4886 scope.go:117] "RemoveContainer" containerID="b7d54eeae704f314df30733be5759c60f40f8ae85c29b4003b4cfcfe326f64c0" Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.004111 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-tmfmz" Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.004133 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-df4tr" Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.005137 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-8wm2x" podUID="103e92b6-e1e8-4a10-8cbd-76d94038132d" containerName="dnsmasq-dns" containerID="cri-o://a14ffcacca9ba15650742b1d694ac0481f6a44f776f64be79e7fb8d2f268e7fa" gracePeriod=10 Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.056571 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-f4mxt"] Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.110492 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-tmfmz" Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.157807 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ef47526-1832-41c2-9359-e407c4add215-scripts\") pod \"3ef47526-1832-41c2-9359-e407c4add215\" (UID: \"3ef47526-1832-41c2-9359-e407c4add215\") " Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.157856 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3ef47526-1832-41c2-9359-e407c4add215-swiftconf\") pod \"3ef47526-1832-41c2-9359-e407c4add215\" (UID: \"3ef47526-1832-41c2-9359-e407c4add215\") " Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.157891 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3ef47526-1832-41c2-9359-e407c4add215-ring-data-devices\") pod \"3ef47526-1832-41c2-9359-e407c4add215\" (UID: \"3ef47526-1832-41c2-9359-e407c4add215\") " Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.157981 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/3ef47526-1832-41c2-9359-e407c4add215-dispersionconf\") pod \"3ef47526-1832-41c2-9359-e407c4add215\" (UID: \"3ef47526-1832-41c2-9359-e407c4add215\") " Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.158440 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ef47526-1832-41c2-9359-e407c4add215-scripts" (OuterVolumeSpecName: "scripts") pod "3ef47526-1832-41c2-9359-e407c4add215" (UID: "3ef47526-1832-41c2-9359-e407c4add215"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.158540 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ef47526-1832-41c2-9359-e407c4add215-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "3ef47526-1832-41c2-9359-e407c4add215" (UID: "3ef47526-1832-41c2-9359-e407c4add215"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.158726 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef47526-1832-41c2-9359-e407c4add215-combined-ca-bundle\") pod \"3ef47526-1832-41c2-9359-e407c4add215\" (UID: \"3ef47526-1832-41c2-9359-e407c4add215\") " Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.158766 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwcp2\" (UniqueName: \"kubernetes.io/projected/3ef47526-1832-41c2-9359-e407c4add215-kube-api-access-bwcp2\") pod \"3ef47526-1832-41c2-9359-e407c4add215\" (UID: \"3ef47526-1832-41c2-9359-e407c4add215\") " Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.158825 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/3ef47526-1832-41c2-9359-e407c4add215-etc-swift\") pod \"3ef47526-1832-41c2-9359-e407c4add215\" (UID: \"3ef47526-1832-41c2-9359-e407c4add215\") " Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.159292 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ef47526-1832-41c2-9359-e407c4add215-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.159325 4886 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3ef47526-1832-41c2-9359-e407c4add215-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.159867 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ef47526-1832-41c2-9359-e407c4add215-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3ef47526-1832-41c2-9359-e407c4add215" (UID: "3ef47526-1832-41c2-9359-e407c4add215"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.162962 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ef47526-1832-41c2-9359-e407c4add215-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "3ef47526-1832-41c2-9359-e407c4add215" (UID: "3ef47526-1832-41c2-9359-e407c4add215"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.164951 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ef47526-1832-41c2-9359-e407c4add215-kube-api-access-bwcp2" (OuterVolumeSpecName: "kube-api-access-bwcp2") pod "3ef47526-1832-41c2-9359-e407c4add215" (UID: "3ef47526-1832-41c2-9359-e407c4add215"). InnerVolumeSpecName "kube-api-access-bwcp2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.165508 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ef47526-1832-41c2-9359-e407c4add215-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ef47526-1832-41c2-9359-e407c4add215" (UID: "3ef47526-1832-41c2-9359-e407c4add215"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.166783 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ef47526-1832-41c2-9359-e407c4add215-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "3ef47526-1832-41c2-9359-e407c4add215" (UID: "3ef47526-1832-41c2-9359-e407c4add215"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.258402 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-df4tr"] Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.263444 4886 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3ef47526-1832-41c2-9359-e407c4add215-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.263476 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef47526-1832-41c2-9359-e407c4add215-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.263487 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwcp2\" (UniqueName: \"kubernetes.io/projected/3ef47526-1832-41c2-9359-e407c4add215-kube-api-access-bwcp2\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.263500 4886 
reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3ef47526-1832-41c2-9359-e407c4add215-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.263509 4886 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3ef47526-1832-41c2-9359-e407c4add215-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.269791 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-df4tr"] Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.365320 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b13c5527-179a-440c-bca1-379cab773854-etc-swift\") pod \"swift-storage-0\" (UID: \"b13c5527-179a-440c-bca1-379cab773854\") " pod="openstack/swift-storage-0" Mar 14 08:48:28 crc kubenswrapper[4886]: E0314 08:48:28.365617 4886 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 14 08:48:28 crc kubenswrapper[4886]: E0314 08:48:28.365661 4886 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 14 08:48:28 crc kubenswrapper[4886]: E0314 08:48:28.365743 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b13c5527-179a-440c-bca1-379cab773854-etc-swift podName:b13c5527-179a-440c-bca1-379cab773854 nodeName:}" failed. No retries permitted until 2026-03-14 08:48:30.36571546 +0000 UTC m=+1245.614167097 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b13c5527-179a-440c-bca1-379cab773854-etc-swift") pod "swift-storage-0" (UID: "b13c5527-179a-440c-bca1-379cab773854") : configmap "swift-ring-files" not found Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.527520 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8wm2x" Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.568001 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/103e92b6-e1e8-4a10-8cbd-76d94038132d-config\") pod \"103e92b6-e1e8-4a10-8cbd-76d94038132d\" (UID: \"103e92b6-e1e8-4a10-8cbd-76d94038132d\") " Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.568072 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/103e92b6-e1e8-4a10-8cbd-76d94038132d-dns-svc\") pod \"103e92b6-e1e8-4a10-8cbd-76d94038132d\" (UID: \"103e92b6-e1e8-4a10-8cbd-76d94038132d\") " Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.568189 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjlwl\" (UniqueName: \"kubernetes.io/projected/103e92b6-e1e8-4a10-8cbd-76d94038132d-kube-api-access-fjlwl\") pod \"103e92b6-e1e8-4a10-8cbd-76d94038132d\" (UID: \"103e92b6-e1e8-4a10-8cbd-76d94038132d\") " Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.575615 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/103e92b6-e1e8-4a10-8cbd-76d94038132d-kube-api-access-fjlwl" (OuterVolumeSpecName: "kube-api-access-fjlwl") pod "103e92b6-e1e8-4a10-8cbd-76d94038132d" (UID: "103e92b6-e1e8-4a10-8cbd-76d94038132d"). InnerVolumeSpecName "kube-api-access-fjlwl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.613256 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/103e92b6-e1e8-4a10-8cbd-76d94038132d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "103e92b6-e1e8-4a10-8cbd-76d94038132d" (UID: "103e92b6-e1e8-4a10-8cbd-76d94038132d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.614869 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/103e92b6-e1e8-4a10-8cbd-76d94038132d-config" (OuterVolumeSpecName: "config") pod "103e92b6-e1e8-4a10-8cbd-76d94038132d" (UID: "103e92b6-e1e8-4a10-8cbd-76d94038132d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.669773 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/103e92b6-e1e8-4a10-8cbd-76d94038132d-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.669804 4886 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/103e92b6-e1e8-4a10-8cbd-76d94038132d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:28 crc kubenswrapper[4886]: I0314 08:48:28.669816 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjlwl\" (UniqueName: \"kubernetes.io/projected/103e92b6-e1e8-4a10-8cbd-76d94038132d-kube-api-access-fjlwl\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:29 crc kubenswrapper[4886]: I0314 08:48:29.015081 4886 generic.go:334] "Generic (PLEG): container finished" podID="103e92b6-e1e8-4a10-8cbd-76d94038132d" containerID="a14ffcacca9ba15650742b1d694ac0481f6a44f776f64be79e7fb8d2f268e7fa" exitCode=0 Mar 14 08:48:29 crc kubenswrapper[4886]: I0314 
08:48:29.015149 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8wm2x" event={"ID":"103e92b6-e1e8-4a10-8cbd-76d94038132d","Type":"ContainerDied","Data":"a14ffcacca9ba15650742b1d694ac0481f6a44f776f64be79e7fb8d2f268e7fa"} Mar 14 08:48:29 crc kubenswrapper[4886]: I0314 08:48:29.015211 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8wm2x" Mar 14 08:48:29 crc kubenswrapper[4886]: I0314 08:48:29.015538 4886 scope.go:117] "RemoveContainer" containerID="a14ffcacca9ba15650742b1d694ac0481f6a44f776f64be79e7fb8d2f268e7fa" Mar 14 08:48:29 crc kubenswrapper[4886]: I0314 08:48:29.015498 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8wm2x" event={"ID":"103e92b6-e1e8-4a10-8cbd-76d94038132d","Type":"ContainerDied","Data":"47ae459730dcebcbc22e34f6f97288415c2cf85e365e13692771930e36537556"} Mar 14 08:48:29 crc kubenswrapper[4886]: I0314 08:48:29.018083 4886 generic.go:334] "Generic (PLEG): container finished" podID="a1701819-f673-4757-b2f4-6a3dd4da8601" containerID="557aacaff1b2b6a25e5b9b1cb78997e77ac13a2ce1f69a6ae6536716e5a8ddc6" exitCode=0 Mar 14 08:48:29 crc kubenswrapper[4886]: I0314 08:48:29.018142 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-qhcn4" event={"ID":"a1701819-f673-4757-b2f4-6a3dd4da8601","Type":"ContainerDied","Data":"557aacaff1b2b6a25e5b9b1cb78997e77ac13a2ce1f69a6ae6536716e5a8ddc6"} Mar 14 08:48:29 crc kubenswrapper[4886]: I0314 08:48:29.022662 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xt4bm" event={"ID":"91b6fc57-0463-4654-8595-09cc5f7f0088","Type":"ContainerStarted","Data":"7d49a87da63d20c7c3987769778b35315b30b7206514192ed98438bc6150af64"} Mar 14 08:48:29 crc kubenswrapper[4886]: I0314 08:48:29.029974 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d65f699f-tttcp" 
event={"ID":"4ece572b-edc2-46be-adff-dfaed70fadf5","Type":"ContainerStarted","Data":"612e2d9ebcc0f8777b4898a5a31251d0b1fc3b42a4b38af0f41b8ef32e96a060"} Mar 14 08:48:29 crc kubenswrapper[4886]: I0314 08:48:29.030145 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d65f699f-tttcp" Mar 14 08:48:29 crc kubenswrapper[4886]: I0314 08:48:29.031620 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-tmfmz" Mar 14 08:48:29 crc kubenswrapper[4886]: I0314 08:48:29.032355 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-f4mxt" event={"ID":"8658a67d-fe04-40af-a495-bb3d50c9a9db","Type":"ContainerStarted","Data":"a6db0da28bf0d883b60088ac74a52c26376ed503960ef560774c2f2527f27ecc"} Mar 14 08:48:29 crc kubenswrapper[4886]: I0314 08:48:29.071633 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d65f699f-tttcp" podStartSLOduration=3.071612013 podStartE2EDuration="3.071612013s" podCreationTimestamp="2026-03-14 08:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:48:29.063279071 +0000 UTC m=+1244.311730708" watchObservedRunningTime="2026-03-14 08:48:29.071612013 +0000 UTC m=+1244.320063650" Mar 14 08:48:29 crc kubenswrapper[4886]: I0314 08:48:29.108167 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-xt4bm" podStartSLOduration=3.108112127 podStartE2EDuration="3.108112127s" podCreationTimestamp="2026-03-14 08:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:48:29.083633687 +0000 UTC m=+1244.332085314" watchObservedRunningTime="2026-03-14 08:48:29.108112127 +0000 UTC m=+1244.356563754" Mar 14 08:48:29 crc 
kubenswrapper[4886]: I0314 08:48:29.147300 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8wm2x"] Mar 14 08:48:29 crc kubenswrapper[4886]: I0314 08:48:29.160422 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8wm2x"] Mar 14 08:48:29 crc kubenswrapper[4886]: I0314 08:48:29.218359 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-tmfmz"] Mar 14 08:48:29 crc kubenswrapper[4886]: I0314 08:48:29.247872 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-tmfmz"] Mar 14 08:48:29 crc kubenswrapper[4886]: I0314 08:48:29.441103 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="103e92b6-e1e8-4a10-8cbd-76d94038132d" path="/var/lib/kubelet/pods/103e92b6-e1e8-4a10-8cbd-76d94038132d/volumes" Mar 14 08:48:29 crc kubenswrapper[4886]: I0314 08:48:29.441932 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2566e042-16c5-4c44-951f-447be9c66f4e" path="/var/lib/kubelet/pods/2566e042-16c5-4c44-951f-447be9c66f4e/volumes" Mar 14 08:48:29 crc kubenswrapper[4886]: I0314 08:48:29.442518 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ef47526-1832-41c2-9359-e407c4add215" path="/var/lib/kubelet/pods/3ef47526-1832-41c2-9359-e407c4add215/volumes" Mar 14 08:48:29 crc kubenswrapper[4886]: I0314 08:48:29.734641 4886 scope.go:117] "RemoveContainer" containerID="dd1aa0db48c1967bae37abc1505fe11a7b7d1a6adcb20727fe527ba84331cd60" Mar 14 08:48:29 crc kubenswrapper[4886]: I0314 08:48:29.780499 4886 scope.go:117] "RemoveContainer" containerID="a14ffcacca9ba15650742b1d694ac0481f6a44f776f64be79e7fb8d2f268e7fa" Mar 14 08:48:29 crc kubenswrapper[4886]: E0314 08:48:29.782388 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a14ffcacca9ba15650742b1d694ac0481f6a44f776f64be79e7fb8d2f268e7fa\": 
container with ID starting with a14ffcacca9ba15650742b1d694ac0481f6a44f776f64be79e7fb8d2f268e7fa not found: ID does not exist" containerID="a14ffcacca9ba15650742b1d694ac0481f6a44f776f64be79e7fb8d2f268e7fa" Mar 14 08:48:29 crc kubenswrapper[4886]: I0314 08:48:29.782469 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a14ffcacca9ba15650742b1d694ac0481f6a44f776f64be79e7fb8d2f268e7fa"} err="failed to get container status \"a14ffcacca9ba15650742b1d694ac0481f6a44f776f64be79e7fb8d2f268e7fa\": rpc error: code = NotFound desc = could not find container \"a14ffcacca9ba15650742b1d694ac0481f6a44f776f64be79e7fb8d2f268e7fa\": container with ID starting with a14ffcacca9ba15650742b1d694ac0481f6a44f776f64be79e7fb8d2f268e7fa not found: ID does not exist" Mar 14 08:48:29 crc kubenswrapper[4886]: I0314 08:48:29.782506 4886 scope.go:117] "RemoveContainer" containerID="dd1aa0db48c1967bae37abc1505fe11a7b7d1a6adcb20727fe527ba84331cd60" Mar 14 08:48:29 crc kubenswrapper[4886]: E0314 08:48:29.783560 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd1aa0db48c1967bae37abc1505fe11a7b7d1a6adcb20727fe527ba84331cd60\": container with ID starting with dd1aa0db48c1967bae37abc1505fe11a7b7d1a6adcb20727fe527ba84331cd60 not found: ID does not exist" containerID="dd1aa0db48c1967bae37abc1505fe11a7b7d1a6adcb20727fe527ba84331cd60" Mar 14 08:48:29 crc kubenswrapper[4886]: I0314 08:48:29.783590 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd1aa0db48c1967bae37abc1505fe11a7b7d1a6adcb20727fe527ba84331cd60"} err="failed to get container status \"dd1aa0db48c1967bae37abc1505fe11a7b7d1a6adcb20727fe527ba84331cd60\": rpc error: code = NotFound desc = could not find container \"dd1aa0db48c1967bae37abc1505fe11a7b7d1a6adcb20727fe527ba84331cd60\": container with ID starting with 
dd1aa0db48c1967bae37abc1505fe11a7b7d1a6adcb20727fe527ba84331cd60 not found: ID does not exist" Mar 14 08:48:30 crc kubenswrapper[4886]: I0314 08:48:30.046792 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-qhcn4" event={"ID":"a1701819-f673-4757-b2f4-6a3dd4da8601","Type":"ContainerStarted","Data":"c980931d3e428853308a510ad169da2482c95c527ce85917c4411fe99c4009d4"} Mar 14 08:48:30 crc kubenswrapper[4886]: I0314 08:48:30.049427 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-qhcn4" Mar 14 08:48:30 crc kubenswrapper[4886]: I0314 08:48:30.054710 4886 generic.go:334] "Generic (PLEG): container finished" podID="28d6b363-8881-407e-b8e4-9fd7863b881c" containerID="29c601a3c13dda1e64fd2afeda8a906573f80921a568fd811185212a87cc6028" exitCode=0 Mar 14 08:48:30 crc kubenswrapper[4886]: I0314 08:48:30.054780 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"28d6b363-8881-407e-b8e4-9fd7863b881c","Type":"ContainerDied","Data":"29c601a3c13dda1e64fd2afeda8a906573f80921a568fd811185212a87cc6028"} Mar 14 08:48:30 crc kubenswrapper[4886]: I0314 08:48:30.058810 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"56f90ac1-34d1-4101-b5cc-37e76200d22a","Type":"ContainerStarted","Data":"2b6fd3c1e954c2c6d57dd9ed0d35737d0a97d1085a05d1dd825639920c05febe"} Mar 14 08:48:30 crc kubenswrapper[4886]: I0314 08:48:30.098316 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-qhcn4" podStartSLOduration=4.098268658 podStartE2EDuration="4.098268658s" podCreationTimestamp="2026-03-14 08:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:48:30.074517068 +0000 UTC m=+1245.322968705" watchObservedRunningTime="2026-03-14 08:48:30.098268658 +0000 
UTC m=+1245.346720295" Mar 14 08:48:30 crc kubenswrapper[4886]: I0314 08:48:30.431249 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b13c5527-179a-440c-bca1-379cab773854-etc-swift\") pod \"swift-storage-0\" (UID: \"b13c5527-179a-440c-bca1-379cab773854\") " pod="openstack/swift-storage-0" Mar 14 08:48:30 crc kubenswrapper[4886]: E0314 08:48:30.431468 4886 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 14 08:48:30 crc kubenswrapper[4886]: E0314 08:48:30.431498 4886 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 14 08:48:30 crc kubenswrapper[4886]: E0314 08:48:30.431584 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b13c5527-179a-440c-bca1-379cab773854-etc-swift podName:b13c5527-179a-440c-bca1-379cab773854 nodeName:}" failed. No retries permitted until 2026-03-14 08:48:34.431563278 +0000 UTC m=+1249.680014915 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b13c5527-179a-440c-bca1-379cab773854-etc-swift") pod "swift-storage-0" (UID: "b13c5527-179a-440c-bca1-379cab773854") : configmap "swift-ring-files" not found Mar 14 08:48:31 crc kubenswrapper[4886]: I0314 08:48:31.423356 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 14 08:48:31 crc kubenswrapper[4886]: I0314 08:48:31.445646 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 14 08:48:32 crc kubenswrapper[4886]: I0314 08:48:32.866298 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 14 08:48:32 crc kubenswrapper[4886]: I0314 08:48:32.866847 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 14 08:48:32 crc kubenswrapper[4886]: I0314 08:48:32.960749 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 14 08:48:33 crc kubenswrapper[4886]: I0314 08:48:33.107129 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"56f90ac1-34d1-4101-b5cc-37e76200d22a","Type":"ContainerStarted","Data":"1dd9df81c79a8b0ffcd102b65e3ac7e3d1068296c0762910a2fb167615bcff2e"} Mar 14 08:48:33 crc kubenswrapper[4886]: I0314 08:48:33.107471 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 14 08:48:33 crc kubenswrapper[4886]: I0314 08:48:33.142344 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=5.242260752 podStartE2EDuration="7.142325703s" podCreationTimestamp="2026-03-14 08:48:26 +0000 UTC" firstStartedPulling="2026-03-14 08:48:27.881465186 +0000 UTC m=+1243.129916823" lastFinishedPulling="2026-03-14 08:48:29.781530127 
+0000 UTC m=+1245.029981774" observedRunningTime="2026-03-14 08:48:33.134761283 +0000 UTC m=+1248.383212920" watchObservedRunningTime="2026-03-14 08:48:33.142325703 +0000 UTC m=+1248.390777340" Mar 14 08:48:33 crc kubenswrapper[4886]: I0314 08:48:33.227561 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 14 08:48:33 crc kubenswrapper[4886]: I0314 08:48:33.901099 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 14 08:48:33 crc kubenswrapper[4886]: I0314 08:48:33.993317 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 14 08:48:34 crc kubenswrapper[4886]: I0314 08:48:34.121031 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-f4mxt" event={"ID":"8658a67d-fe04-40af-a495-bb3d50c9a9db","Type":"ContainerStarted","Data":"d9f9b1521c81cf83768d6cf6143d704fd1e9331a6b1b6c66ed397f33d459e912"} Mar 14 08:48:34 crc kubenswrapper[4886]: I0314 08:48:34.144565 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-f4mxt" podStartSLOduration=2.131319442 podStartE2EDuration="7.144542378s" podCreationTimestamp="2026-03-14 08:48:27 +0000 UTC" firstStartedPulling="2026-03-14 08:48:28.140920815 +0000 UTC m=+1243.389372442" lastFinishedPulling="2026-03-14 08:48:33.154143731 +0000 UTC m=+1248.402595378" observedRunningTime="2026-03-14 08:48:34.136905606 +0000 UTC m=+1249.385357243" watchObservedRunningTime="2026-03-14 08:48:34.144542378 +0000 UTC m=+1249.392994025" Mar 14 08:48:34 crc kubenswrapper[4886]: I0314 08:48:34.170191 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-2ed1-account-create-update-bfzn5"] Mar 14 08:48:34 crc kubenswrapper[4886]: E0314 08:48:34.170511 4886 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="103e92b6-e1e8-4a10-8cbd-76d94038132d" containerName="init" Mar 14 08:48:34 crc kubenswrapper[4886]: I0314 08:48:34.170528 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="103e92b6-e1e8-4a10-8cbd-76d94038132d" containerName="init" Mar 14 08:48:34 crc kubenswrapper[4886]: E0314 08:48:34.170547 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2566e042-16c5-4c44-951f-447be9c66f4e" containerName="init" Mar 14 08:48:34 crc kubenswrapper[4886]: I0314 08:48:34.170554 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="2566e042-16c5-4c44-951f-447be9c66f4e" containerName="init" Mar 14 08:48:34 crc kubenswrapper[4886]: E0314 08:48:34.170571 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="103e92b6-e1e8-4a10-8cbd-76d94038132d" containerName="dnsmasq-dns" Mar 14 08:48:34 crc kubenswrapper[4886]: I0314 08:48:34.170576 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="103e92b6-e1e8-4a10-8cbd-76d94038132d" containerName="dnsmasq-dns" Mar 14 08:48:34 crc kubenswrapper[4886]: I0314 08:48:34.170714 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="2566e042-16c5-4c44-951f-447be9c66f4e" containerName="init" Mar 14 08:48:34 crc kubenswrapper[4886]: I0314 08:48:34.170738 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="103e92b6-e1e8-4a10-8cbd-76d94038132d" containerName="dnsmasq-dns" Mar 14 08:48:34 crc kubenswrapper[4886]: I0314 08:48:34.171245 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-2ed1-account-create-update-bfzn5" Mar 14 08:48:34 crc kubenswrapper[4886]: I0314 08:48:34.175681 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 14 08:48:34 crc kubenswrapper[4886]: I0314 08:48:34.183953 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2ed1-account-create-update-bfzn5"] Mar 14 08:48:34 crc kubenswrapper[4886]: I0314 08:48:34.228667 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-hb7vx"] Mar 14 08:48:34 crc kubenswrapper[4886]: I0314 08:48:34.230557 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hb7vx" Mar 14 08:48:34 crc kubenswrapper[4886]: I0314 08:48:34.235744 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dffl\" (UniqueName: \"kubernetes.io/projected/18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba-kube-api-access-9dffl\") pod \"placement-2ed1-account-create-update-bfzn5\" (UID: \"18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba\") " pod="openstack/placement-2ed1-account-create-update-bfzn5" Mar 14 08:48:34 crc kubenswrapper[4886]: I0314 08:48:34.235983 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba-operator-scripts\") pod \"placement-2ed1-account-create-update-bfzn5\" (UID: \"18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba\") " pod="openstack/placement-2ed1-account-create-update-bfzn5" Mar 14 08:48:34 crc kubenswrapper[4886]: I0314 08:48:34.250800 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-hb7vx"] Mar 14 08:48:34 crc kubenswrapper[4886]: I0314 08:48:34.337317 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba-operator-scripts\") pod \"placement-2ed1-account-create-update-bfzn5\" (UID: \"18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba\") " pod="openstack/placement-2ed1-account-create-update-bfzn5" Mar 14 08:48:34 crc kubenswrapper[4886]: I0314 08:48:34.337389 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94kr6\" (UniqueName: \"kubernetes.io/projected/fd659431-d053-4d3b-a7a8-7a9b20438242-kube-api-access-94kr6\") pod \"placement-db-create-hb7vx\" (UID: \"fd659431-d053-4d3b-a7a8-7a9b20438242\") " pod="openstack/placement-db-create-hb7vx" Mar 14 08:48:34 crc kubenswrapper[4886]: I0314 08:48:34.337431 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dffl\" (UniqueName: \"kubernetes.io/projected/18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba-kube-api-access-9dffl\") pod \"placement-2ed1-account-create-update-bfzn5\" (UID: \"18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba\") " pod="openstack/placement-2ed1-account-create-update-bfzn5" Mar 14 08:48:34 crc kubenswrapper[4886]: I0314 08:48:34.337452 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd659431-d053-4d3b-a7a8-7a9b20438242-operator-scripts\") pod \"placement-db-create-hb7vx\" (UID: \"fd659431-d053-4d3b-a7a8-7a9b20438242\") " pod="openstack/placement-db-create-hb7vx" Mar 14 08:48:34 crc kubenswrapper[4886]: I0314 08:48:34.338086 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba-operator-scripts\") pod \"placement-2ed1-account-create-update-bfzn5\" (UID: \"18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba\") " pod="openstack/placement-2ed1-account-create-update-bfzn5" Mar 14 08:48:34 crc kubenswrapper[4886]: I0314 08:48:34.362622 4886 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dffl\" (UniqueName: \"kubernetes.io/projected/18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba-kube-api-access-9dffl\") pod \"placement-2ed1-account-create-update-bfzn5\" (UID: \"18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba\") " pod="openstack/placement-2ed1-account-create-update-bfzn5" Mar 14 08:48:34 crc kubenswrapper[4886]: I0314 08:48:34.438885 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94kr6\" (UniqueName: \"kubernetes.io/projected/fd659431-d053-4d3b-a7a8-7a9b20438242-kube-api-access-94kr6\") pod \"placement-db-create-hb7vx\" (UID: \"fd659431-d053-4d3b-a7a8-7a9b20438242\") " pod="openstack/placement-db-create-hb7vx" Mar 14 08:48:34 crc kubenswrapper[4886]: I0314 08:48:34.438954 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd659431-d053-4d3b-a7a8-7a9b20438242-operator-scripts\") pod \"placement-db-create-hb7vx\" (UID: \"fd659431-d053-4d3b-a7a8-7a9b20438242\") " pod="openstack/placement-db-create-hb7vx" Mar 14 08:48:34 crc kubenswrapper[4886]: I0314 08:48:34.438986 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b13c5527-179a-440c-bca1-379cab773854-etc-swift\") pod \"swift-storage-0\" (UID: \"b13c5527-179a-440c-bca1-379cab773854\") " pod="openstack/swift-storage-0" Mar 14 08:48:34 crc kubenswrapper[4886]: E0314 08:48:34.439415 4886 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 14 08:48:34 crc kubenswrapper[4886]: E0314 08:48:34.439433 4886 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 14 08:48:34 crc kubenswrapper[4886]: E0314 08:48:34.439481 4886 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/b13c5527-179a-440c-bca1-379cab773854-etc-swift podName:b13c5527-179a-440c-bca1-379cab773854 nodeName:}" failed. No retries permitted until 2026-03-14 08:48:42.439463502 +0000 UTC m=+1257.687915139 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b13c5527-179a-440c-bca1-379cab773854-etc-swift") pod "swift-storage-0" (UID: "b13c5527-179a-440c-bca1-379cab773854") : configmap "swift-ring-files" not found Mar 14 08:48:34 crc kubenswrapper[4886]: I0314 08:48:34.440159 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd659431-d053-4d3b-a7a8-7a9b20438242-operator-scripts\") pod \"placement-db-create-hb7vx\" (UID: \"fd659431-d053-4d3b-a7a8-7a9b20438242\") " pod="openstack/placement-db-create-hb7vx" Mar 14 08:48:34 crc kubenswrapper[4886]: I0314 08:48:34.454136 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94kr6\" (UniqueName: \"kubernetes.io/projected/fd659431-d053-4d3b-a7a8-7a9b20438242-kube-api-access-94kr6\") pod \"placement-db-create-hb7vx\" (UID: \"fd659431-d053-4d3b-a7a8-7a9b20438242\") " pod="openstack/placement-db-create-hb7vx" Mar 14 08:48:34 crc kubenswrapper[4886]: I0314 08:48:34.509852 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2ed1-account-create-update-bfzn5" Mar 14 08:48:34 crc kubenswrapper[4886]: I0314 08:48:34.562715 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-hb7vx" Mar 14 08:48:35 crc kubenswrapper[4886]: I0314 08:48:35.077086 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2ed1-account-create-update-bfzn5"] Mar 14 08:48:35 crc kubenswrapper[4886]: I0314 08:48:35.109300 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-hb7vx"] Mar 14 08:48:35 crc kubenswrapper[4886]: W0314 08:48:35.121863 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd659431_d053_4d3b_a7a8_7a9b20438242.slice/crio-b05390b8be616204a32f0fe736695f7aaac95b3cc5823c354c42ad665d7c2ece WatchSource:0}: Error finding container b05390b8be616204a32f0fe736695f7aaac95b3cc5823c354c42ad665d7c2ece: Status 404 returned error can't find the container with id b05390b8be616204a32f0fe736695f7aaac95b3cc5823c354c42ad665d7c2ece Mar 14 08:48:35 crc kubenswrapper[4886]: I0314 08:48:35.146923 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2ed1-account-create-update-bfzn5" event={"ID":"18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba","Type":"ContainerStarted","Data":"38eef85a482c0f76bbd3bdde380be270d45dbd24156d22934f35ebc10de91f3c"} Mar 14 08:48:35 crc kubenswrapper[4886]: I0314 08:48:35.270134 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-5lbzf"] Mar 14 08:48:35 crc kubenswrapper[4886]: I0314 08:48:35.272881 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-5lbzf" Mar 14 08:48:35 crc kubenswrapper[4886]: I0314 08:48:35.300685 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-5lbzf"] Mar 14 08:48:35 crc kubenswrapper[4886]: I0314 08:48:35.368518 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7n6x\" (UniqueName: \"kubernetes.io/projected/67508ffc-b9b5-4172-b4fc-f6870f5f210a-kube-api-access-l7n6x\") pod \"watcher-db-create-5lbzf\" (UID: \"67508ffc-b9b5-4172-b4fc-f6870f5f210a\") " pod="openstack/watcher-db-create-5lbzf" Mar 14 08:48:35 crc kubenswrapper[4886]: I0314 08:48:35.369094 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67508ffc-b9b5-4172-b4fc-f6870f5f210a-operator-scripts\") pod \"watcher-db-create-5lbzf\" (UID: \"67508ffc-b9b5-4172-b4fc-f6870f5f210a\") " pod="openstack/watcher-db-create-5lbzf" Mar 14 08:48:35 crc kubenswrapper[4886]: I0314 08:48:35.374404 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-b1fc-account-create-update-87rmn"] Mar 14 08:48:35 crc kubenswrapper[4886]: I0314 08:48:35.376279 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-b1fc-account-create-update-87rmn" Mar 14 08:48:35 crc kubenswrapper[4886]: I0314 08:48:35.379512 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Mar 14 08:48:35 crc kubenswrapper[4886]: I0314 08:48:35.397785 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-b1fc-account-create-update-87rmn"] Mar 14 08:48:35 crc kubenswrapper[4886]: I0314 08:48:35.471414 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67508ffc-b9b5-4172-b4fc-f6870f5f210a-operator-scripts\") pod \"watcher-db-create-5lbzf\" (UID: \"67508ffc-b9b5-4172-b4fc-f6870f5f210a\") " pod="openstack/watcher-db-create-5lbzf" Mar 14 08:48:35 crc kubenswrapper[4886]: I0314 08:48:35.471872 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74857ae3-5a84-4d38-9962-1836e71789da-operator-scripts\") pod \"watcher-b1fc-account-create-update-87rmn\" (UID: \"74857ae3-5a84-4d38-9962-1836e71789da\") " pod="openstack/watcher-b1fc-account-create-update-87rmn" Mar 14 08:48:35 crc kubenswrapper[4886]: I0314 08:48:35.472064 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwmvx\" (UniqueName: \"kubernetes.io/projected/74857ae3-5a84-4d38-9962-1836e71789da-kube-api-access-bwmvx\") pod \"watcher-b1fc-account-create-update-87rmn\" (UID: \"74857ae3-5a84-4d38-9962-1836e71789da\") " pod="openstack/watcher-b1fc-account-create-update-87rmn" Mar 14 08:48:35 crc kubenswrapper[4886]: I0314 08:48:35.472236 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7n6x\" (UniqueName: \"kubernetes.io/projected/67508ffc-b9b5-4172-b4fc-f6870f5f210a-kube-api-access-l7n6x\") pod \"watcher-db-create-5lbzf\" (UID: 
\"67508ffc-b9b5-4172-b4fc-f6870f5f210a\") " pod="openstack/watcher-db-create-5lbzf" Mar 14 08:48:35 crc kubenswrapper[4886]: I0314 08:48:35.472430 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67508ffc-b9b5-4172-b4fc-f6870f5f210a-operator-scripts\") pod \"watcher-db-create-5lbzf\" (UID: \"67508ffc-b9b5-4172-b4fc-f6870f5f210a\") " pod="openstack/watcher-db-create-5lbzf" Mar 14 08:48:35 crc kubenswrapper[4886]: I0314 08:48:35.494608 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7n6x\" (UniqueName: \"kubernetes.io/projected/67508ffc-b9b5-4172-b4fc-f6870f5f210a-kube-api-access-l7n6x\") pod \"watcher-db-create-5lbzf\" (UID: \"67508ffc-b9b5-4172-b4fc-f6870f5f210a\") " pod="openstack/watcher-db-create-5lbzf" Mar 14 08:48:35 crc kubenswrapper[4886]: I0314 08:48:35.574326 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74857ae3-5a84-4d38-9962-1836e71789da-operator-scripts\") pod \"watcher-b1fc-account-create-update-87rmn\" (UID: \"74857ae3-5a84-4d38-9962-1836e71789da\") " pod="openstack/watcher-b1fc-account-create-update-87rmn" Mar 14 08:48:35 crc kubenswrapper[4886]: I0314 08:48:35.574392 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwmvx\" (UniqueName: \"kubernetes.io/projected/74857ae3-5a84-4d38-9962-1836e71789da-kube-api-access-bwmvx\") pod \"watcher-b1fc-account-create-update-87rmn\" (UID: \"74857ae3-5a84-4d38-9962-1836e71789da\") " pod="openstack/watcher-b1fc-account-create-update-87rmn" Mar 14 08:48:35 crc kubenswrapper[4886]: I0314 08:48:35.575269 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74857ae3-5a84-4d38-9962-1836e71789da-operator-scripts\") pod \"watcher-b1fc-account-create-update-87rmn\" (UID: 
\"74857ae3-5a84-4d38-9962-1836e71789da\") " pod="openstack/watcher-b1fc-account-create-update-87rmn" Mar 14 08:48:35 crc kubenswrapper[4886]: I0314 08:48:35.591754 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwmvx\" (UniqueName: \"kubernetes.io/projected/74857ae3-5a84-4d38-9962-1836e71789da-kube-api-access-bwmvx\") pod \"watcher-b1fc-account-create-update-87rmn\" (UID: \"74857ae3-5a84-4d38-9962-1836e71789da\") " pod="openstack/watcher-b1fc-account-create-update-87rmn" Mar 14 08:48:35 crc kubenswrapper[4886]: I0314 08:48:35.609928 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-5lbzf" Mar 14 08:48:35 crc kubenswrapper[4886]: I0314 08:48:35.695282 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-b1fc-account-create-update-87rmn" Mar 14 08:48:36 crc kubenswrapper[4886]: I0314 08:48:36.072367 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-5lbzf"] Mar 14 08:48:36 crc kubenswrapper[4886]: W0314 08:48:36.088339 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67508ffc_b9b5_4172_b4fc_f6870f5f210a.slice/crio-ea74017baf1f8c7c5d6cda075fccde300c599b73f75e8fcad489543d065b3cdc WatchSource:0}: Error finding container ea74017baf1f8c7c5d6cda075fccde300c599b73f75e8fcad489543d065b3cdc: Status 404 returned error can't find the container with id ea74017baf1f8c7c5d6cda075fccde300c599b73f75e8fcad489543d065b3cdc Mar 14 08:48:36 crc kubenswrapper[4886]: W0314 08:48:36.156657 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74857ae3_5a84_4d38_9962_1836e71789da.slice/crio-ac4778b83dbd1ef7bf60d34b484d284f6c1fd023a7b656b71b75c3864d5037d2 WatchSource:0}: Error finding container 
ac4778b83dbd1ef7bf60d34b484d284f6c1fd023a7b656b71b75c3864d5037d2: Status 404 returned error can't find the container with id ac4778b83dbd1ef7bf60d34b484d284f6c1fd023a7b656b71b75c3864d5037d2 Mar 14 08:48:36 crc kubenswrapper[4886]: I0314 08:48:36.159741 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-b1fc-account-create-update-87rmn"] Mar 14 08:48:36 crc kubenswrapper[4886]: I0314 08:48:36.181482 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hb7vx" event={"ID":"fd659431-d053-4d3b-a7a8-7a9b20438242","Type":"ContainerStarted","Data":"b05390b8be616204a32f0fe736695f7aaac95b3cc5823c354c42ad665d7c2ece"} Mar 14 08:48:36 crc kubenswrapper[4886]: I0314 08:48:36.182856 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-5lbzf" event={"ID":"67508ffc-b9b5-4172-b4fc-f6870f5f210a","Type":"ContainerStarted","Data":"ea74017baf1f8c7c5d6cda075fccde300c599b73f75e8fcad489543d065b3cdc"} Mar 14 08:48:36 crc kubenswrapper[4886]: I0314 08:48:36.648413 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d65f699f-tttcp" Mar 14 08:48:36 crc kubenswrapper[4886]: I0314 08:48:36.980197 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-qhcn4" Mar 14 08:48:37 crc kubenswrapper[4886]: I0314 08:48:37.049118 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-tttcp"] Mar 14 08:48:37 crc kubenswrapper[4886]: I0314 08:48:37.191746 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2ed1-account-create-update-bfzn5" event={"ID":"18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba","Type":"ContainerStarted","Data":"74e82094bd3d4b22c426fe08c8f80e64ea8c87015831dcc39602593851b4e016"} Mar 14 08:48:37 crc kubenswrapper[4886]: I0314 08:48:37.193480 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-db-create-hb7vx" event={"ID":"fd659431-d053-4d3b-a7a8-7a9b20438242","Type":"ContainerStarted","Data":"fcd41b14fe79d4b69db16593dc87b657f2dc4f2358744e06441c38f9605dc34b"} Mar 14 08:48:37 crc kubenswrapper[4886]: I0314 08:48:37.195655 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-5lbzf" event={"ID":"67508ffc-b9b5-4172-b4fc-f6870f5f210a","Type":"ContainerStarted","Data":"b30e8e2922a2d7dd59a8f2f469ac02d13785fdbe0a4dee671faf9451e96fdd08"} Mar 14 08:48:37 crc kubenswrapper[4886]: I0314 08:48:37.197038 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-b1fc-account-create-update-87rmn" event={"ID":"74857ae3-5a84-4d38-9962-1836e71789da","Type":"ContainerStarted","Data":"0456786fbe734adab210e7496efcdba51547157da386d8517e5fb3fede2a8e24"} Mar 14 08:48:37 crc kubenswrapper[4886]: I0314 08:48:37.197086 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-b1fc-account-create-update-87rmn" event={"ID":"74857ae3-5a84-4d38-9962-1836e71789da","Type":"ContainerStarted","Data":"ac4778b83dbd1ef7bf60d34b484d284f6c1fd023a7b656b71b75c3864d5037d2"} Mar 14 08:48:37 crc kubenswrapper[4886]: I0314 08:48:37.197180 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d65f699f-tttcp" podUID="4ece572b-edc2-46be-adff-dfaed70fadf5" containerName="dnsmasq-dns" containerID="cri-o://612e2d9ebcc0f8777b4898a5a31251d0b1fc3b42a4b38af0f41b8ef32e96a060" gracePeriod=10 Mar 14 08:48:37 crc kubenswrapper[4886]: I0314 08:48:37.217035 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-2ed1-account-create-update-bfzn5" podStartSLOduration=3.217016784 podStartE2EDuration="3.217016784s" podCreationTimestamp="2026-03-14 08:48:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:48:37.212504718 +0000 UTC 
m=+1252.460956355" watchObservedRunningTime="2026-03-14 08:48:37.217016784 +0000 UTC m=+1252.465468421" Mar 14 08:48:37 crc kubenswrapper[4886]: I0314 08:48:37.235559 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-create-5lbzf" podStartSLOduration=2.235538918 podStartE2EDuration="2.235538918s" podCreationTimestamp="2026-03-14 08:48:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:48:37.230411506 +0000 UTC m=+1252.478863143" watchObservedRunningTime="2026-03-14 08:48:37.235538918 +0000 UTC m=+1252.483990555" Mar 14 08:48:37 crc kubenswrapper[4886]: I0314 08:48:37.260248 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-b1fc-account-create-update-87rmn" podStartSLOduration=2.260224014 podStartE2EDuration="2.260224014s" podCreationTimestamp="2026-03-14 08:48:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:48:37.25036551 +0000 UTC m=+1252.498817157" watchObservedRunningTime="2026-03-14 08:48:37.260224014 +0000 UTC m=+1252.508675651" Mar 14 08:48:37 crc kubenswrapper[4886]: I0314 08:48:37.268728 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-hb7vx" podStartSLOduration=3.26870538 podStartE2EDuration="3.26870538s" podCreationTimestamp="2026-03-14 08:48:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:48:37.267614829 +0000 UTC m=+1252.516066476" watchObservedRunningTime="2026-03-14 08:48:37.26870538 +0000 UTC m=+1252.517157017" Mar 14 08:48:38 crc kubenswrapper[4886]: I0314 08:48:38.233348 4886 generic.go:334] "Generic (PLEG): container finished" podID="4ece572b-edc2-46be-adff-dfaed70fadf5" 
containerID="612e2d9ebcc0f8777b4898a5a31251d0b1fc3b42a4b38af0f41b8ef32e96a060" exitCode=0 Mar 14 08:48:38 crc kubenswrapper[4886]: I0314 08:48:38.233455 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d65f699f-tttcp" event={"ID":"4ece572b-edc2-46be-adff-dfaed70fadf5","Type":"ContainerDied","Data":"612e2d9ebcc0f8777b4898a5a31251d0b1fc3b42a4b38af0f41b8ef32e96a060"} Mar 14 08:48:39 crc kubenswrapper[4886]: I0314 08:48:39.251578 4886 generic.go:334] "Generic (PLEG): container finished" podID="fd659431-d053-4d3b-a7a8-7a9b20438242" containerID="fcd41b14fe79d4b69db16593dc87b657f2dc4f2358744e06441c38f9605dc34b" exitCode=0 Mar 14 08:48:39 crc kubenswrapper[4886]: I0314 08:48:39.251642 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hb7vx" event={"ID":"fd659431-d053-4d3b-a7a8-7a9b20438242","Type":"ContainerDied","Data":"fcd41b14fe79d4b69db16593dc87b657f2dc4f2358744e06441c38f9605dc34b"} Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.019170 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-gk8kz"] Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.023652 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gk8kz" Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.026406 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.027795 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gk8kz"] Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.092710 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg6q7\" (UniqueName: \"kubernetes.io/projected/6386cf32-15ff-4f2c-ae81-df8cd9c8b793-kube-api-access-kg6q7\") pod \"root-account-create-update-gk8kz\" (UID: \"6386cf32-15ff-4f2c-ae81-df8cd9c8b793\") " pod="openstack/root-account-create-update-gk8kz" Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.093393 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6386cf32-15ff-4f2c-ae81-df8cd9c8b793-operator-scripts\") pod \"root-account-create-update-gk8kz\" (UID: \"6386cf32-15ff-4f2c-ae81-df8cd9c8b793\") " pod="openstack/root-account-create-update-gk8kz" Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.151557 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d65f699f-tttcp" Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.199203 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtcg5\" (UniqueName: \"kubernetes.io/projected/4ece572b-edc2-46be-adff-dfaed70fadf5-kube-api-access-xtcg5\") pod \"4ece572b-edc2-46be-adff-dfaed70fadf5\" (UID: \"4ece572b-edc2-46be-adff-dfaed70fadf5\") " Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.199351 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ece572b-edc2-46be-adff-dfaed70fadf5-ovsdbserver-nb\") pod \"4ece572b-edc2-46be-adff-dfaed70fadf5\" (UID: \"4ece572b-edc2-46be-adff-dfaed70fadf5\") " Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.199466 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ece572b-edc2-46be-adff-dfaed70fadf5-config\") pod \"4ece572b-edc2-46be-adff-dfaed70fadf5\" (UID: \"4ece572b-edc2-46be-adff-dfaed70fadf5\") " Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.199494 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ece572b-edc2-46be-adff-dfaed70fadf5-dns-svc\") pod \"4ece572b-edc2-46be-adff-dfaed70fadf5\" (UID: \"4ece572b-edc2-46be-adff-dfaed70fadf5\") " Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.199858 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg6q7\" (UniqueName: \"kubernetes.io/projected/6386cf32-15ff-4f2c-ae81-df8cd9c8b793-kube-api-access-kg6q7\") pod \"root-account-create-update-gk8kz\" (UID: \"6386cf32-15ff-4f2c-ae81-df8cd9c8b793\") " pod="openstack/root-account-create-update-gk8kz" Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.200015 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6386cf32-15ff-4f2c-ae81-df8cd9c8b793-operator-scripts\") pod \"root-account-create-update-gk8kz\" (UID: \"6386cf32-15ff-4f2c-ae81-df8cd9c8b793\") " pod="openstack/root-account-create-update-gk8kz" Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.200917 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6386cf32-15ff-4f2c-ae81-df8cd9c8b793-operator-scripts\") pod \"root-account-create-update-gk8kz\" (UID: \"6386cf32-15ff-4f2c-ae81-df8cd9c8b793\") " pod="openstack/root-account-create-update-gk8kz" Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.217118 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ece572b-edc2-46be-adff-dfaed70fadf5-kube-api-access-xtcg5" (OuterVolumeSpecName: "kube-api-access-xtcg5") pod "4ece572b-edc2-46be-adff-dfaed70fadf5" (UID: "4ece572b-edc2-46be-adff-dfaed70fadf5"). InnerVolumeSpecName "kube-api-access-xtcg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.230846 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg6q7\" (UniqueName: \"kubernetes.io/projected/6386cf32-15ff-4f2c-ae81-df8cd9c8b793-kube-api-access-kg6q7\") pod \"root-account-create-update-gk8kz\" (UID: \"6386cf32-15ff-4f2c-ae81-df8cd9c8b793\") " pod="openstack/root-account-create-update-gk8kz" Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.248389 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ece572b-edc2-46be-adff-dfaed70fadf5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4ece572b-edc2-46be-adff-dfaed70fadf5" (UID: "4ece572b-edc2-46be-adff-dfaed70fadf5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.249099 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ece572b-edc2-46be-adff-dfaed70fadf5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4ece572b-edc2-46be-adff-dfaed70fadf5" (UID: "4ece572b-edc2-46be-adff-dfaed70fadf5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.260077 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ece572b-edc2-46be-adff-dfaed70fadf5-config" (OuterVolumeSpecName: "config") pod "4ece572b-edc2-46be-adff-dfaed70fadf5" (UID: "4ece572b-edc2-46be-adff-dfaed70fadf5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.263912 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"28d6b363-8881-407e-b8e4-9fd7863b881c","Type":"ContainerStarted","Data":"c57bec255019de882019b2ad341a2a577a243d602db0e581ed662faf20a1d261"} Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.266073 4886 generic.go:334] "Generic (PLEG): container finished" podID="18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba" containerID="74e82094bd3d4b22c426fe08c8f80e64ea8c87015831dcc39602593851b4e016" exitCode=0 Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.266173 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2ed1-account-create-update-bfzn5" event={"ID":"18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba","Type":"ContainerDied","Data":"74e82094bd3d4b22c426fe08c8f80e64ea8c87015831dcc39602593851b4e016"} Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.269835 4886 generic.go:334] "Generic (PLEG): container finished" podID="67508ffc-b9b5-4172-b4fc-f6870f5f210a" 
containerID="b30e8e2922a2d7dd59a8f2f469ac02d13785fdbe0a4dee671faf9451e96fdd08" exitCode=0 Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.269980 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-5lbzf" event={"ID":"67508ffc-b9b5-4172-b4fc-f6870f5f210a","Type":"ContainerDied","Data":"b30e8e2922a2d7dd59a8f2f469ac02d13785fdbe0a4dee671faf9451e96fdd08"} Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.272104 4886 generic.go:334] "Generic (PLEG): container finished" podID="74857ae3-5a84-4d38-9962-1836e71789da" containerID="0456786fbe734adab210e7496efcdba51547157da386d8517e5fb3fede2a8e24" exitCode=0 Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.272176 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-b1fc-account-create-update-87rmn" event={"ID":"74857ae3-5a84-4d38-9962-1836e71789da","Type":"ContainerDied","Data":"0456786fbe734adab210e7496efcdba51547157da386d8517e5fb3fede2a8e24"} Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.274316 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d65f699f-tttcp"
Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.275021 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d65f699f-tttcp" event={"ID":"4ece572b-edc2-46be-adff-dfaed70fadf5","Type":"ContainerDied","Data":"8acebf0ac5c4e8b5dede9e2cbcbbd9f121720f4a01169f3eef5803553c6a8ac0"}
Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.275063 4886 scope.go:117] "RemoveContainer" containerID="612e2d9ebcc0f8777b4898a5a31251d0b1fc3b42a4b38af0f41b8ef32e96a060"
Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.303030 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ece572b-edc2-46be-adff-dfaed70fadf5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.303116 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ece572b-edc2-46be-adff-dfaed70fadf5-config\") on node \"crc\" DevicePath \"\""
Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.303142 4886 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ece572b-edc2-46be-adff-dfaed70fadf5-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.303155 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtcg5\" (UniqueName: \"kubernetes.io/projected/4ece572b-edc2-46be-adff-dfaed70fadf5-kube-api-access-xtcg5\") on node \"crc\" DevicePath \"\""
Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.343366 4886 scope.go:117] "RemoveContainer" containerID="c6ea87b541903a94e42f2b927667de86dfdecb56a2c19654a7c040f7340e75d4"
Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.346181 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-tttcp"]
Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.354999 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-tttcp"]
Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.436368 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gk8kz"
Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.585335 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hb7vx"
Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.709961 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94kr6\" (UniqueName: \"kubernetes.io/projected/fd659431-d053-4d3b-a7a8-7a9b20438242-kube-api-access-94kr6\") pod \"fd659431-d053-4d3b-a7a8-7a9b20438242\" (UID: \"fd659431-d053-4d3b-a7a8-7a9b20438242\") "
Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.710345 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd659431-d053-4d3b-a7a8-7a9b20438242-operator-scripts\") pod \"fd659431-d053-4d3b-a7a8-7a9b20438242\" (UID: \"fd659431-d053-4d3b-a7a8-7a9b20438242\") "
Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.716314 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd659431-d053-4d3b-a7a8-7a9b20438242-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fd659431-d053-4d3b-a7a8-7a9b20438242" (UID: "fd659431-d053-4d3b-a7a8-7a9b20438242"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.718939 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd659431-d053-4d3b-a7a8-7a9b20438242-kube-api-access-94kr6" (OuterVolumeSpecName: "kube-api-access-94kr6") pod "fd659431-d053-4d3b-a7a8-7a9b20438242" (UID: "fd659431-d053-4d3b-a7a8-7a9b20438242"). InnerVolumeSpecName "kube-api-access-94kr6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.813607 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94kr6\" (UniqueName: \"kubernetes.io/projected/fd659431-d053-4d3b-a7a8-7a9b20438242-kube-api-access-94kr6\") on node \"crc\" DevicePath \"\""
Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.813676 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd659431-d053-4d3b-a7a8-7a9b20438242-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 08:48:40 crc kubenswrapper[4886]: I0314 08:48:40.924218 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gk8kz"]
Mar 14 08:48:40 crc kubenswrapper[4886]: W0314 08:48:40.929881 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6386cf32_15ff_4f2c_ae81_df8cd9c8b793.slice/crio-ba0248390c39520d2d29bfa5b7bb75bbd69ede97cdbdbc693f6df44b7c7cd8bf WatchSource:0}: Error finding container ba0248390c39520d2d29bfa5b7bb75bbd69ede97cdbdbc693f6df44b7c7cd8bf: Status 404 returned error can't find the container with id ba0248390c39520d2d29bfa5b7bb75bbd69ede97cdbdbc693f6df44b7c7cd8bf
Mar 14 08:48:41 crc kubenswrapper[4886]: I0314 08:48:41.287736 4886 generic.go:334] "Generic (PLEG): container finished" podID="8658a67d-fe04-40af-a495-bb3d50c9a9db" containerID="d9f9b1521c81cf83768d6cf6143d704fd1e9331a6b1b6c66ed397f33d459e912" exitCode=0
Mar 14 08:48:41 crc kubenswrapper[4886]: I0314 08:48:41.287814 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-f4mxt" event={"ID":"8658a67d-fe04-40af-a495-bb3d50c9a9db","Type":"ContainerDied","Data":"d9f9b1521c81cf83768d6cf6143d704fd1e9331a6b1b6c66ed397f33d459e912"}
Mar 14 08:48:41 crc kubenswrapper[4886]: I0314 08:48:41.291219 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gk8kz" event={"ID":"6386cf32-15ff-4f2c-ae81-df8cd9c8b793","Type":"ContainerStarted","Data":"5e8eda8bba47b1698063a394f01c4cc5e37841d35ab22fe71b1c947e4fc8e0f4"}
Mar 14 08:48:41 crc kubenswrapper[4886]: I0314 08:48:41.291259 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gk8kz" event={"ID":"6386cf32-15ff-4f2c-ae81-df8cd9c8b793","Type":"ContainerStarted","Data":"ba0248390c39520d2d29bfa5b7bb75bbd69ede97cdbdbc693f6df44b7c7cd8bf"}
Mar 14 08:48:41 crc kubenswrapper[4886]: I0314 08:48:41.297491 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hb7vx" event={"ID":"fd659431-d053-4d3b-a7a8-7a9b20438242","Type":"ContainerDied","Data":"b05390b8be616204a32f0fe736695f7aaac95b3cc5823c354c42ad665d7c2ece"}
Mar 14 08:48:41 crc kubenswrapper[4886]: I0314 08:48:41.297701 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b05390b8be616204a32f0fe736695f7aaac95b3cc5823c354c42ad665d7c2ece"
Mar 14 08:48:41 crc kubenswrapper[4886]: I0314 08:48:41.297777 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hb7vx"
Mar 14 08:48:41 crc kubenswrapper[4886]: I0314 08:48:41.337098 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-gk8kz" podStartSLOduration=2.337078374 podStartE2EDuration="2.337078374s" podCreationTimestamp="2026-03-14 08:48:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:48:41.333029032 +0000 UTC m=+1256.581480689" watchObservedRunningTime="2026-03-14 08:48:41.337078374 +0000 UTC m=+1256.585530001"
Mar 14 08:48:41 crc kubenswrapper[4886]: I0314 08:48:41.458011 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ece572b-edc2-46be-adff-dfaed70fadf5" path="/var/lib/kubelet/pods/4ece572b-edc2-46be-adff-dfaed70fadf5/volumes"
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.117507 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-b1fc-account-create-update-87rmn"
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.125647 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-5lbzf"
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.136521 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2ed1-account-create-update-bfzn5"
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.252218 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba-operator-scripts\") pod \"18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba\" (UID: \"18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba\") "
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.252265 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dffl\" (UniqueName: \"kubernetes.io/projected/18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba-kube-api-access-9dffl\") pod \"18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba\" (UID: \"18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba\") "
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.252284 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74857ae3-5a84-4d38-9962-1836e71789da-operator-scripts\") pod \"74857ae3-5a84-4d38-9962-1836e71789da\" (UID: \"74857ae3-5a84-4d38-9962-1836e71789da\") "
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.252351 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7n6x\" (UniqueName: \"kubernetes.io/projected/67508ffc-b9b5-4172-b4fc-f6870f5f210a-kube-api-access-l7n6x\") pod \"67508ffc-b9b5-4172-b4fc-f6870f5f210a\" (UID: \"67508ffc-b9b5-4172-b4fc-f6870f5f210a\") "
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.252374 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67508ffc-b9b5-4172-b4fc-f6870f5f210a-operator-scripts\") pod \"67508ffc-b9b5-4172-b4fc-f6870f5f210a\" (UID: \"67508ffc-b9b5-4172-b4fc-f6870f5f210a\") "
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.252415 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwmvx\" (UniqueName: \"kubernetes.io/projected/74857ae3-5a84-4d38-9962-1836e71789da-kube-api-access-bwmvx\") pod \"74857ae3-5a84-4d38-9962-1836e71789da\" (UID: \"74857ae3-5a84-4d38-9962-1836e71789da\") "
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.253379 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74857ae3-5a84-4d38-9962-1836e71789da-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "74857ae3-5a84-4d38-9962-1836e71789da" (UID: "74857ae3-5a84-4d38-9962-1836e71789da"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.253420 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba" (UID: "18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.253422 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67508ffc-b9b5-4172-b4fc-f6870f5f210a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "67508ffc-b9b5-4172-b4fc-f6870f5f210a" (UID: "67508ffc-b9b5-4172-b4fc-f6870f5f210a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.260359 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba-kube-api-access-9dffl" (OuterVolumeSpecName: "kube-api-access-9dffl") pod "18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba" (UID: "18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba"). InnerVolumeSpecName "kube-api-access-9dffl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.261788 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74857ae3-5a84-4d38-9962-1836e71789da-kube-api-access-bwmvx" (OuterVolumeSpecName: "kube-api-access-bwmvx") pod "74857ae3-5a84-4d38-9962-1836e71789da" (UID: "74857ae3-5a84-4d38-9962-1836e71789da"). InnerVolumeSpecName "kube-api-access-bwmvx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.263023 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67508ffc-b9b5-4172-b4fc-f6870f5f210a-kube-api-access-l7n6x" (OuterVolumeSpecName: "kube-api-access-l7n6x") pod "67508ffc-b9b5-4172-b4fc-f6870f5f210a" (UID: "67508ffc-b9b5-4172-b4fc-f6870f5f210a"). InnerVolumeSpecName "kube-api-access-l7n6x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.338493 4886 generic.go:334] "Generic (PLEG): container finished" podID="68bf3729-3dcf-4881-814b-b6af3060336e" containerID="70f613391bff57f10a7e19d0aa063591744b8d64a3857dfeaca9f66352cdf9da" exitCode=0
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.338592 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"68bf3729-3dcf-4881-814b-b6af3060336e","Type":"ContainerDied","Data":"70f613391bff57f10a7e19d0aa063591744b8d64a3857dfeaca9f66352cdf9da"}
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.345485 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-b1fc-account-create-update-87rmn" event={"ID":"74857ae3-5a84-4d38-9962-1836e71789da","Type":"ContainerDied","Data":"ac4778b83dbd1ef7bf60d34b484d284f6c1fd023a7b656b71b75c3864d5037d2"}
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.345532 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac4778b83dbd1ef7bf60d34b484d284f6c1fd023a7b656b71b75c3864d5037d2"
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.345610 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-b1fc-account-create-update-87rmn"
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.359602 4886 generic.go:334] "Generic (PLEG): container finished" podID="6386cf32-15ff-4f2c-ae81-df8cd9c8b793" containerID="5e8eda8bba47b1698063a394f01c4cc5e37841d35ab22fe71b1c947e4fc8e0f4" exitCode=0
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.359881 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gk8kz" event={"ID":"6386cf32-15ff-4f2c-ae81-df8cd9c8b793","Type":"ContainerDied","Data":"5e8eda8bba47b1698063a394f01c4cc5e37841d35ab22fe71b1c947e4fc8e0f4"}
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.366441 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2ed1-account-create-update-bfzn5"
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.366569 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2ed1-account-create-update-bfzn5" event={"ID":"18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba","Type":"ContainerDied","Data":"38eef85a482c0f76bbd3bdde380be270d45dbd24156d22934f35ebc10de91f3c"}
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.366597 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38eef85a482c0f76bbd3bdde380be270d45dbd24156d22934f35ebc10de91f3c"
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.366677 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.367718 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dffl\" (UniqueName: \"kubernetes.io/projected/18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba-kube-api-access-9dffl\") on node \"crc\" DevicePath \"\""
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.368463 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74857ae3-5a84-4d38-9962-1836e71789da-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.368546 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7n6x\" (UniqueName: \"kubernetes.io/projected/67508ffc-b9b5-4172-b4fc-f6870f5f210a-kube-api-access-l7n6x\") on node \"crc\" DevicePath \"\""
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.368570 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67508ffc-b9b5-4172-b4fc-f6870f5f210a-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.368605 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwmvx\" (UniqueName: \"kubernetes.io/projected/74857ae3-5a84-4d38-9962-1836e71789da-kube-api-access-bwmvx\") on node \"crc\" DevicePath \"\""
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.383236 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-5lbzf"
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.383401 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-5lbzf" event={"ID":"67508ffc-b9b5-4172-b4fc-f6870f5f210a","Type":"ContainerDied","Data":"ea74017baf1f8c7c5d6cda075fccde300c599b73f75e8fcad489543d065b3cdc"}
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.384226 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea74017baf1f8c7c5d6cda075fccde300c599b73f75e8fcad489543d065b3cdc"
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.390025 4886 generic.go:334] "Generic (PLEG): container finished" podID="c08d9078-9b3a-492a-92db-3096453d49f8" containerID="548315cdb623465804b4f9fe0d140267259fbb4ce5ce07b497fa3080f29863c4" exitCode=0
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.390409 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c08d9078-9b3a-492a-92db-3096453d49f8","Type":"ContainerDied","Data":"548315cdb623465804b4f9fe0d140267259fbb4ce5ce07b497fa3080f29863c4"}
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.470635 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b13c5527-179a-440c-bca1-379cab773854-etc-swift\") pod \"swift-storage-0\" (UID: \"b13c5527-179a-440c-bca1-379cab773854\") " pod="openstack/swift-storage-0"
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.480233 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b13c5527-179a-440c-bca1-379cab773854-etc-swift\") pod \"swift-storage-0\" (UID: \"b13c5527-179a-440c-bca1-379cab773854\") " pod="openstack/swift-storage-0"
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.542626 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.832265 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-f4mxt"
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.879629 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8658a67d-fe04-40af-a495-bb3d50c9a9db-scripts\") pod \"8658a67d-fe04-40af-a495-bb3d50c9a9db\" (UID: \"8658a67d-fe04-40af-a495-bb3d50c9a9db\") "
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.879737 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8658a67d-fe04-40af-a495-bb3d50c9a9db-dispersionconf\") pod \"8658a67d-fe04-40af-a495-bb3d50c9a9db\" (UID: \"8658a67d-fe04-40af-a495-bb3d50c9a9db\") "
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.879809 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8658a67d-fe04-40af-a495-bb3d50c9a9db-ring-data-devices\") pod \"8658a67d-fe04-40af-a495-bb3d50c9a9db\" (UID: \"8658a67d-fe04-40af-a495-bb3d50c9a9db\") "
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.879905 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8658a67d-fe04-40af-a495-bb3d50c9a9db-combined-ca-bundle\") pod \"8658a67d-fe04-40af-a495-bb3d50c9a9db\" (UID: \"8658a67d-fe04-40af-a495-bb3d50c9a9db\") "
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.879969 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8658a67d-fe04-40af-a495-bb3d50c9a9db-etc-swift\") pod \"8658a67d-fe04-40af-a495-bb3d50c9a9db\" (UID: \"8658a67d-fe04-40af-a495-bb3d50c9a9db\") "
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.880023 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8658a67d-fe04-40af-a495-bb3d50c9a9db-swiftconf\") pod \"8658a67d-fe04-40af-a495-bb3d50c9a9db\" (UID: \"8658a67d-fe04-40af-a495-bb3d50c9a9db\") "
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.880092 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g79bm\" (UniqueName: \"kubernetes.io/projected/8658a67d-fe04-40af-a495-bb3d50c9a9db-kube-api-access-g79bm\") pod \"8658a67d-fe04-40af-a495-bb3d50c9a9db\" (UID: \"8658a67d-fe04-40af-a495-bb3d50c9a9db\") "
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.880990 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8658a67d-fe04-40af-a495-bb3d50c9a9db-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8658a67d-fe04-40af-a495-bb3d50c9a9db" (UID: "8658a67d-fe04-40af-a495-bb3d50c9a9db"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.882207 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8658a67d-fe04-40af-a495-bb3d50c9a9db-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8658a67d-fe04-40af-a495-bb3d50c9a9db" (UID: "8658a67d-fe04-40af-a495-bb3d50c9a9db"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.882879 4886 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8658a67d-fe04-40af-a495-bb3d50c9a9db-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.882901 4886 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8658a67d-fe04-40af-a495-bb3d50c9a9db-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.888651 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8658a67d-fe04-40af-a495-bb3d50c9a9db-kube-api-access-g79bm" (OuterVolumeSpecName: "kube-api-access-g79bm") pod "8658a67d-fe04-40af-a495-bb3d50c9a9db" (UID: "8658a67d-fe04-40af-a495-bb3d50c9a9db"). InnerVolumeSpecName "kube-api-access-g79bm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.892226 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8658a67d-fe04-40af-a495-bb3d50c9a9db-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8658a67d-fe04-40af-a495-bb3d50c9a9db" (UID: "8658a67d-fe04-40af-a495-bb3d50c9a9db"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.911884 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8658a67d-fe04-40af-a495-bb3d50c9a9db-scripts" (OuterVolumeSpecName: "scripts") pod "8658a67d-fe04-40af-a495-bb3d50c9a9db" (UID: "8658a67d-fe04-40af-a495-bb3d50c9a9db"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.929542 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8658a67d-fe04-40af-a495-bb3d50c9a9db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8658a67d-fe04-40af-a495-bb3d50c9a9db" (UID: "8658a67d-fe04-40af-a495-bb3d50c9a9db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.933917 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8658a67d-fe04-40af-a495-bb3d50c9a9db-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8658a67d-fe04-40af-a495-bb3d50c9a9db" (UID: "8658a67d-fe04-40af-a495-bb3d50c9a9db"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.984392 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8658a67d-fe04-40af-a495-bb3d50c9a9db-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.984430 4886 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8658a67d-fe04-40af-a495-bb3d50c9a9db-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.984444 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g79bm\" (UniqueName: \"kubernetes.io/projected/8658a67d-fe04-40af-a495-bb3d50c9a9db-kube-api-access-g79bm\") on node \"crc\" DevicePath \"\""
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.984458 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8658a67d-fe04-40af-a495-bb3d50c9a9db-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 08:48:42 crc kubenswrapper[4886]: I0314 08:48:42.984473 4886 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8658a67d-fe04-40af-a495-bb3d50c9a9db-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.091319 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-lqwf5"]
Mar 14 08:48:43 crc kubenswrapper[4886]: E0314 08:48:43.091996 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ece572b-edc2-46be-adff-dfaed70fadf5" containerName="init"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.092013 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ece572b-edc2-46be-adff-dfaed70fadf5" containerName="init"
Mar 14 08:48:43 crc kubenswrapper[4886]: E0314 08:48:43.092033 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67508ffc-b9b5-4172-b4fc-f6870f5f210a" containerName="mariadb-database-create"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.092040 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="67508ffc-b9b5-4172-b4fc-f6870f5f210a" containerName="mariadb-database-create"
Mar 14 08:48:43 crc kubenswrapper[4886]: E0314 08:48:43.092060 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba" containerName="mariadb-account-create-update"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.092067 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba" containerName="mariadb-account-create-update"
Mar 14 08:48:43 crc kubenswrapper[4886]: E0314 08:48:43.092078 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8658a67d-fe04-40af-a495-bb3d50c9a9db" containerName="swift-ring-rebalance"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.092084 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="8658a67d-fe04-40af-a495-bb3d50c9a9db" containerName="swift-ring-rebalance"
Mar 14 08:48:43 crc kubenswrapper[4886]: E0314 08:48:43.092097 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74857ae3-5a84-4d38-9962-1836e71789da" containerName="mariadb-account-create-update"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.092103 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="74857ae3-5a84-4d38-9962-1836e71789da" containerName="mariadb-account-create-update"
Mar 14 08:48:43 crc kubenswrapper[4886]: E0314 08:48:43.092110 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd659431-d053-4d3b-a7a8-7a9b20438242" containerName="mariadb-database-create"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.092118 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd659431-d053-4d3b-a7a8-7a9b20438242" containerName="mariadb-database-create"
Mar 14 08:48:43 crc kubenswrapper[4886]: E0314 08:48:43.092176 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ece572b-edc2-46be-adff-dfaed70fadf5" containerName="dnsmasq-dns"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.092183 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ece572b-edc2-46be-adff-dfaed70fadf5" containerName="dnsmasq-dns"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.092356 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ece572b-edc2-46be-adff-dfaed70fadf5" containerName="dnsmasq-dns"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.092368 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd659431-d053-4d3b-a7a8-7a9b20438242" containerName="mariadb-database-create"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.092377 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba" containerName="mariadb-account-create-update"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.092393 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="74857ae3-5a84-4d38-9962-1836e71789da" containerName="mariadb-account-create-update"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.092401 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="8658a67d-fe04-40af-a495-bb3d50c9a9db" containerName="swift-ring-rebalance"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.092411 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="67508ffc-b9b5-4172-b4fc-f6870f5f210a" containerName="mariadb-database-create"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.093013 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lqwf5"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.107626 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-lqwf5"]
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.148348 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.187424 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c05b39b-9236-4a8a-ab68-c424153678a6-operator-scripts\") pod \"glance-db-create-lqwf5\" (UID: \"4c05b39b-9236-4a8a-ab68-c424153678a6\") " pod="openstack/glance-db-create-lqwf5"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.187780 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7ncz\" (UniqueName: \"kubernetes.io/projected/4c05b39b-9236-4a8a-ab68-c424153678a6-kube-api-access-p7ncz\") pod \"glance-db-create-lqwf5\" (UID: \"4c05b39b-9236-4a8a-ab68-c424153678a6\") " pod="openstack/glance-db-create-lqwf5"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.201869 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-d883-account-create-update-98qk2"]
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.202842 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d883-account-create-update-98qk2"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.204710 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.213112 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d883-account-create-update-98qk2"]
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.289268 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c05b39b-9236-4a8a-ab68-c424153678a6-operator-scripts\") pod \"glance-db-create-lqwf5\" (UID: \"4c05b39b-9236-4a8a-ab68-c424153678a6\") " pod="openstack/glance-db-create-lqwf5"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.289408 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7nlm\" (UniqueName: \"kubernetes.io/projected/1bd30816-7ba6-49db-8bfa-52b31cbf4de5-kube-api-access-g7nlm\") pod \"glance-d883-account-create-update-98qk2\" (UID: \"1bd30816-7ba6-49db-8bfa-52b31cbf4de5\") " pod="openstack/glance-d883-account-create-update-98qk2"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.289437 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7ncz\" (UniqueName: \"kubernetes.io/projected/4c05b39b-9236-4a8a-ab68-c424153678a6-kube-api-access-p7ncz\") pod \"glance-db-create-lqwf5\" (UID: \"4c05b39b-9236-4a8a-ab68-c424153678a6\") " pod="openstack/glance-db-create-lqwf5"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.289855 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bd30816-7ba6-49db-8bfa-52b31cbf4de5-operator-scripts\") pod \"glance-d883-account-create-update-98qk2\" (UID: \"1bd30816-7ba6-49db-8bfa-52b31cbf4de5\") " pod="openstack/glance-d883-account-create-update-98qk2"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.290022 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c05b39b-9236-4a8a-ab68-c424153678a6-operator-scripts\") pod \"glance-db-create-lqwf5\" (UID: \"4c05b39b-9236-4a8a-ab68-c424153678a6\") " pod="openstack/glance-db-create-lqwf5"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.306549 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7ncz\" (UniqueName: \"kubernetes.io/projected/4c05b39b-9236-4a8a-ab68-c424153678a6-kube-api-access-p7ncz\") pod \"glance-db-create-lqwf5\" (UID: \"4c05b39b-9236-4a8a-ab68-c424153678a6\") " pod="openstack/glance-db-create-lqwf5"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.391400 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7nlm\" (UniqueName: \"kubernetes.io/projected/1bd30816-7ba6-49db-8bfa-52b31cbf4de5-kube-api-access-g7nlm\") pod \"glance-d883-account-create-update-98qk2\" (UID: \"1bd30816-7ba6-49db-8bfa-52b31cbf4de5\") " pod="openstack/glance-d883-account-create-update-98qk2"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.391488 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bd30816-7ba6-49db-8bfa-52b31cbf4de5-operator-scripts\") pod \"glance-d883-account-create-update-98qk2\" (UID: \"1bd30816-7ba6-49db-8bfa-52b31cbf4de5\") " pod="openstack/glance-d883-account-create-update-98qk2"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.392498 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bd30816-7ba6-49db-8bfa-52b31cbf4de5-operator-scripts\") pod \"glance-d883-account-create-update-98qk2\" (UID: \"1bd30816-7ba6-49db-8bfa-52b31cbf4de5\") " pod="openstack/glance-d883-account-create-update-98qk2"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.400204 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"68bf3729-3dcf-4881-814b-b6af3060336e","Type":"ContainerStarted","Data":"c44e45b8deb2b13c261bfa772151bb46377f4d546802bf618451e57dff7b084e"}
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.400452 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.401750 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b13c5527-179a-440c-bca1-379cab773854","Type":"ContainerStarted","Data":"eb09a81d7c5a89d053e463f28f3e83d765680815a37623f84b8d6bffc21b556d"}
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.404207 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"28d6b363-8881-407e-b8e4-9fd7863b881c","Type":"ContainerStarted","Data":"98bca5fb4b30d948c0f6bd27d2746cfb226c5c1dd57a1d7f14dae009417085bf"}
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.406507 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c08d9078-9b3a-492a-92db-3096453d49f8","Type":"ContainerStarted","Data":"93214e36e11f1eb8a0d13f7a772aeb46c4404d3e9853793a18721e8ec07b83d0"}
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.406739 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.408108 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-f4mxt" event={"ID":"8658a67d-fe04-40af-a495-bb3d50c9a9db","Type":"ContainerDied","Data":"a6db0da28bf0d883b60088ac74a52c26376ed503960ef560774c2f2527f27ecc"}
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.408157 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-f4mxt"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.408164 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6db0da28bf0d883b60088ac74a52c26376ed503960ef560774c2f2527f27ecc"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.408350 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lqwf5"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.414783 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7nlm\" (UniqueName: \"kubernetes.io/projected/1bd30816-7ba6-49db-8bfa-52b31cbf4de5-kube-api-access-g7nlm\") pod \"glance-d883-account-create-update-98qk2\" (UID: \"1bd30816-7ba6-49db-8bfa-52b31cbf4de5\") " pod="openstack/glance-d883-account-create-update-98qk2"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.447216 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.252261402 podStartE2EDuration="1m5.447197271s" podCreationTimestamp="2026-03-14 08:47:38 +0000 UTC" firstStartedPulling="2026-03-14 08:47:40.769096168 +0000 UTC m=+1196.017547805" lastFinishedPulling="2026-03-14 08:48:07.964032037 +0000 UTC m=+1223.212483674" observedRunningTime="2026-03-14 08:48:43.438265303 +0000 UTC m=+1258.686716940" watchObservedRunningTime="2026-03-14 08:48:43.447197271 +0000 UTC m=+1258.695648908"
Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.472258 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.702178254
podStartE2EDuration="1m5.472230417s" podCreationTimestamp="2026-03-14 08:47:38 +0000 UTC" firstStartedPulling="2026-03-14 08:47:40.108256679 +0000 UTC m=+1195.356708316" lastFinishedPulling="2026-03-14 08:48:06.878308802 +0000 UTC m=+1222.126760479" observedRunningTime="2026-03-14 08:48:43.465058597 +0000 UTC m=+1258.713510254" watchObservedRunningTime="2026-03-14 08:48:43.472230417 +0000 UTC m=+1258.720682064" Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.523643 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d883-account-create-update-98qk2" Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.810212 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gk8kz" Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.901151 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6386cf32-15ff-4f2c-ae81-df8cd9c8b793-operator-scripts\") pod \"6386cf32-15ff-4f2c-ae81-df8cd9c8b793\" (UID: \"6386cf32-15ff-4f2c-ae81-df8cd9c8b793\") " Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.901277 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg6q7\" (UniqueName: \"kubernetes.io/projected/6386cf32-15ff-4f2c-ae81-df8cd9c8b793-kube-api-access-kg6q7\") pod \"6386cf32-15ff-4f2c-ae81-df8cd9c8b793\" (UID: \"6386cf32-15ff-4f2c-ae81-df8cd9c8b793\") " Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.902843 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6386cf32-15ff-4f2c-ae81-df8cd9c8b793-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6386cf32-15ff-4f2c-ae81-df8cd9c8b793" (UID: "6386cf32-15ff-4f2c-ae81-df8cd9c8b793"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.920575 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6386cf32-15ff-4f2c-ae81-df8cd9c8b793-kube-api-access-kg6q7" (OuterVolumeSpecName: "kube-api-access-kg6q7") pod "6386cf32-15ff-4f2c-ae81-df8cd9c8b793" (UID: "6386cf32-15ff-4f2c-ae81-df8cd9c8b793"). InnerVolumeSpecName "kube-api-access-kg6q7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.927041 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-fjk6q"] Mar 14 08:48:43 crc kubenswrapper[4886]: E0314 08:48:43.927397 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6386cf32-15ff-4f2c-ae81-df8cd9c8b793" containerName="mariadb-account-create-update" Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.927415 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="6386cf32-15ff-4f2c-ae81-df8cd9c8b793" containerName="mariadb-account-create-update" Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.927574 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="6386cf32-15ff-4f2c-ae81-df8cd9c8b793" containerName="mariadb-account-create-update" Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.928079 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-fjk6q" Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.945412 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-fjk6q"] Mar 14 08:48:43 crc kubenswrapper[4886]: I0314 08:48:43.973415 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-lqwf5"] Mar 14 08:48:43 crc kubenswrapper[4886]: W0314 08:48:43.977901 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c05b39b_9236_4a8a_ab68_c424153678a6.slice/crio-5e0e59799ffa595cdd211327a1775d63447c7ca149ca37dd7572e681871acaf6 WatchSource:0}: Error finding container 5e0e59799ffa595cdd211327a1775d63447c7ca149ca37dd7572e681871acaf6: Status 404 returned error can't find the container with id 5e0e59799ffa595cdd211327a1775d63447c7ca149ca37dd7572e681871acaf6 Mar 14 08:48:44 crc kubenswrapper[4886]: I0314 08:48:44.003069 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrqjm\" (UniqueName: \"kubernetes.io/projected/11a9c950-955d-4afd-88c9-9c705ef619b6-kube-api-access-qrqjm\") pod \"keystone-db-create-fjk6q\" (UID: \"11a9c950-955d-4afd-88c9-9c705ef619b6\") " pod="openstack/keystone-db-create-fjk6q" Mar 14 08:48:44 crc kubenswrapper[4886]: I0314 08:48:44.003337 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11a9c950-955d-4afd-88c9-9c705ef619b6-operator-scripts\") pod \"keystone-db-create-fjk6q\" (UID: \"11a9c950-955d-4afd-88c9-9c705ef619b6\") " pod="openstack/keystone-db-create-fjk6q" Mar 14 08:48:44 crc kubenswrapper[4886]: I0314 08:48:44.003545 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg6q7\" (UniqueName: \"kubernetes.io/projected/6386cf32-15ff-4f2c-ae81-df8cd9c8b793-kube-api-access-kg6q7\") on node 
\"crc\" DevicePath \"\"" Mar 14 08:48:44 crc kubenswrapper[4886]: I0314 08:48:44.003602 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6386cf32-15ff-4f2c-ae81-df8cd9c8b793-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:44 crc kubenswrapper[4886]: I0314 08:48:44.031714 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-ad06-account-create-update-mpb4t"] Mar 14 08:48:44 crc kubenswrapper[4886]: I0314 08:48:44.033180 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ad06-account-create-update-mpb4t" Mar 14 08:48:44 crc kubenswrapper[4886]: I0314 08:48:44.035075 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 14 08:48:44 crc kubenswrapper[4886]: I0314 08:48:44.053914 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ad06-account-create-update-mpb4t"] Mar 14 08:48:44 crc kubenswrapper[4886]: I0314 08:48:44.082325 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d883-account-create-update-98qk2"] Mar 14 08:48:44 crc kubenswrapper[4886]: I0314 08:48:44.104651 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfkpj\" (UniqueName: \"kubernetes.io/projected/d7c9025b-c8dc-498d-a0ea-b8770e22af45-kube-api-access-xfkpj\") pod \"keystone-ad06-account-create-update-mpb4t\" (UID: \"d7c9025b-c8dc-498d-a0ea-b8770e22af45\") " pod="openstack/keystone-ad06-account-create-update-mpb4t" Mar 14 08:48:44 crc kubenswrapper[4886]: I0314 08:48:44.104718 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrqjm\" (UniqueName: \"kubernetes.io/projected/11a9c950-955d-4afd-88c9-9c705ef619b6-kube-api-access-qrqjm\") pod \"keystone-db-create-fjk6q\" (UID: \"11a9c950-955d-4afd-88c9-9c705ef619b6\") " 
pod="openstack/keystone-db-create-fjk6q" Mar 14 08:48:44 crc kubenswrapper[4886]: I0314 08:48:44.104782 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11a9c950-955d-4afd-88c9-9c705ef619b6-operator-scripts\") pod \"keystone-db-create-fjk6q\" (UID: \"11a9c950-955d-4afd-88c9-9c705ef619b6\") " pod="openstack/keystone-db-create-fjk6q" Mar 14 08:48:44 crc kubenswrapper[4886]: I0314 08:48:44.104822 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7c9025b-c8dc-498d-a0ea-b8770e22af45-operator-scripts\") pod \"keystone-ad06-account-create-update-mpb4t\" (UID: \"d7c9025b-c8dc-498d-a0ea-b8770e22af45\") " pod="openstack/keystone-ad06-account-create-update-mpb4t" Mar 14 08:48:44 crc kubenswrapper[4886]: I0314 08:48:44.105861 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11a9c950-955d-4afd-88c9-9c705ef619b6-operator-scripts\") pod \"keystone-db-create-fjk6q\" (UID: \"11a9c950-955d-4afd-88c9-9c705ef619b6\") " pod="openstack/keystone-db-create-fjk6q" Mar 14 08:48:44 crc kubenswrapper[4886]: I0314 08:48:44.128459 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrqjm\" (UniqueName: \"kubernetes.io/projected/11a9c950-955d-4afd-88c9-9c705ef619b6-kube-api-access-qrqjm\") pod \"keystone-db-create-fjk6q\" (UID: \"11a9c950-955d-4afd-88c9-9c705ef619b6\") " pod="openstack/keystone-db-create-fjk6q" Mar 14 08:48:44 crc kubenswrapper[4886]: I0314 08:48:44.206754 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7c9025b-c8dc-498d-a0ea-b8770e22af45-operator-scripts\") pod \"keystone-ad06-account-create-update-mpb4t\" (UID: \"d7c9025b-c8dc-498d-a0ea-b8770e22af45\") " 
pod="openstack/keystone-ad06-account-create-update-mpb4t" Mar 14 08:48:44 crc kubenswrapper[4886]: I0314 08:48:44.206899 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfkpj\" (UniqueName: \"kubernetes.io/projected/d7c9025b-c8dc-498d-a0ea-b8770e22af45-kube-api-access-xfkpj\") pod \"keystone-ad06-account-create-update-mpb4t\" (UID: \"d7c9025b-c8dc-498d-a0ea-b8770e22af45\") " pod="openstack/keystone-ad06-account-create-update-mpb4t" Mar 14 08:48:44 crc kubenswrapper[4886]: I0314 08:48:44.207577 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7c9025b-c8dc-498d-a0ea-b8770e22af45-operator-scripts\") pod \"keystone-ad06-account-create-update-mpb4t\" (UID: \"d7c9025b-c8dc-498d-a0ea-b8770e22af45\") " pod="openstack/keystone-ad06-account-create-update-mpb4t" Mar 14 08:48:44 crc kubenswrapper[4886]: I0314 08:48:44.227373 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfkpj\" (UniqueName: \"kubernetes.io/projected/d7c9025b-c8dc-498d-a0ea-b8770e22af45-kube-api-access-xfkpj\") pod \"keystone-ad06-account-create-update-mpb4t\" (UID: \"d7c9025b-c8dc-498d-a0ea-b8770e22af45\") " pod="openstack/keystone-ad06-account-create-update-mpb4t" Mar 14 08:48:44 crc kubenswrapper[4886]: I0314 08:48:44.254661 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fjk6q" Mar 14 08:48:44 crc kubenswrapper[4886]: I0314 08:48:44.310993 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-ad06-account-create-update-mpb4t" Mar 14 08:48:44 crc kubenswrapper[4886]: I0314 08:48:44.419562 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d883-account-create-update-98qk2" event={"ID":"1bd30816-7ba6-49db-8bfa-52b31cbf4de5","Type":"ContainerStarted","Data":"a41e05108f07892a98af2791a61ab423c0dd2794f1f4637fd0da712c65028432"} Mar 14 08:48:44 crc kubenswrapper[4886]: I0314 08:48:44.421528 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gk8kz" event={"ID":"6386cf32-15ff-4f2c-ae81-df8cd9c8b793","Type":"ContainerDied","Data":"ba0248390c39520d2d29bfa5b7bb75bbd69ede97cdbdbc693f6df44b7c7cd8bf"} Mar 14 08:48:44 crc kubenswrapper[4886]: I0314 08:48:44.421565 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba0248390c39520d2d29bfa5b7bb75bbd69ede97cdbdbc693f6df44b7c7cd8bf" Mar 14 08:48:44 crc kubenswrapper[4886]: I0314 08:48:44.421560 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gk8kz" Mar 14 08:48:44 crc kubenswrapper[4886]: I0314 08:48:44.424311 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lqwf5" event={"ID":"4c05b39b-9236-4a8a-ab68-c424153678a6","Type":"ContainerStarted","Data":"b33664b6d7ee1992fd148aa80d13d98502abcdea85b1431c2b1ef2a2fea883a7"} Mar 14 08:48:44 crc kubenswrapper[4886]: I0314 08:48:44.424333 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lqwf5" event={"ID":"4c05b39b-9236-4a8a-ab68-c424153678a6","Type":"ContainerStarted","Data":"5e0e59799ffa595cdd211327a1775d63447c7ca149ca37dd7572e681871acaf6"} Mar 14 08:48:44 crc kubenswrapper[4886]: I0314 08:48:44.455518 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-lqwf5" podStartSLOduration=1.4555043859999999 podStartE2EDuration="1.455504386s" podCreationTimestamp="2026-03-14 08:48:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:48:44.449103628 +0000 UTC m=+1259.697555265" watchObservedRunningTime="2026-03-14 08:48:44.455504386 +0000 UTC m=+1259.703956023" Mar 14 08:48:45 crc kubenswrapper[4886]: I0314 08:48:45.438924 4886 generic.go:334] "Generic (PLEG): container finished" podID="4c05b39b-9236-4a8a-ab68-c424153678a6" containerID="b33664b6d7ee1992fd148aa80d13d98502abcdea85b1431c2b1ef2a2fea883a7" exitCode=0 Mar 14 08:48:45 crc kubenswrapper[4886]: I0314 08:48:45.444493 4886 generic.go:334] "Generic (PLEG): container finished" podID="1bd30816-7ba6-49db-8bfa-52b31cbf4de5" containerID="fc19ed93114fc4349e21a148501e031903bc6d0724eb4aebbd13c07f631f53d9" exitCode=0 Mar 14 08:48:45 crc kubenswrapper[4886]: I0314 08:48:45.454839 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lqwf5" 
event={"ID":"4c05b39b-9236-4a8a-ab68-c424153678a6","Type":"ContainerDied","Data":"b33664b6d7ee1992fd148aa80d13d98502abcdea85b1431c2b1ef2a2fea883a7"} Mar 14 08:48:45 crc kubenswrapper[4886]: I0314 08:48:45.454891 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d883-account-create-update-98qk2" event={"ID":"1bd30816-7ba6-49db-8bfa-52b31cbf4de5","Type":"ContainerDied","Data":"fc19ed93114fc4349e21a148501e031903bc6d0724eb4aebbd13c07f631f53d9"} Mar 14 08:48:46 crc kubenswrapper[4886]: I0314 08:48:46.463598 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-gk8kz"] Mar 14 08:48:46 crc kubenswrapper[4886]: I0314 08:48:46.472191 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-gk8kz"] Mar 14 08:48:47 crc kubenswrapper[4886]: I0314 08:48:47.033200 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 14 08:48:47 crc kubenswrapper[4886]: I0314 08:48:47.242965 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lqwf5" Mar 14 08:48:47 crc kubenswrapper[4886]: I0314 08:48:47.243218 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-d883-account-create-update-98qk2" Mar 14 08:48:48 crc kubenswrapper[4886]: I0314 08:48:47.371926 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7ncz\" (UniqueName: \"kubernetes.io/projected/4c05b39b-9236-4a8a-ab68-c424153678a6-kube-api-access-p7ncz\") pod \"4c05b39b-9236-4a8a-ab68-c424153678a6\" (UID: \"4c05b39b-9236-4a8a-ab68-c424153678a6\") " Mar 14 08:48:48 crc kubenswrapper[4886]: I0314 08:48:47.372000 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c05b39b-9236-4a8a-ab68-c424153678a6-operator-scripts\") pod \"4c05b39b-9236-4a8a-ab68-c424153678a6\" (UID: \"4c05b39b-9236-4a8a-ab68-c424153678a6\") " Mar 14 08:48:48 crc kubenswrapper[4886]: I0314 08:48:47.372690 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c05b39b-9236-4a8a-ab68-c424153678a6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4c05b39b-9236-4a8a-ab68-c424153678a6" (UID: "4c05b39b-9236-4a8a-ab68-c424153678a6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:48:48 crc kubenswrapper[4886]: I0314 08:48:47.372797 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bd30816-7ba6-49db-8bfa-52b31cbf4de5-operator-scripts\") pod \"1bd30816-7ba6-49db-8bfa-52b31cbf4de5\" (UID: \"1bd30816-7ba6-49db-8bfa-52b31cbf4de5\") " Mar 14 08:48:48 crc kubenswrapper[4886]: I0314 08:48:47.373207 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bd30816-7ba6-49db-8bfa-52b31cbf4de5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1bd30816-7ba6-49db-8bfa-52b31cbf4de5" (UID: "1bd30816-7ba6-49db-8bfa-52b31cbf4de5"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:48:48 crc kubenswrapper[4886]: I0314 08:48:47.373337 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7nlm\" (UniqueName: \"kubernetes.io/projected/1bd30816-7ba6-49db-8bfa-52b31cbf4de5-kube-api-access-g7nlm\") pod \"1bd30816-7ba6-49db-8bfa-52b31cbf4de5\" (UID: \"1bd30816-7ba6-49db-8bfa-52b31cbf4de5\") " Mar 14 08:48:48 crc kubenswrapper[4886]: I0314 08:48:47.374465 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c05b39b-9236-4a8a-ab68-c424153678a6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:48 crc kubenswrapper[4886]: I0314 08:48:47.374480 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bd30816-7ba6-49db-8bfa-52b31cbf4de5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:48 crc kubenswrapper[4886]: I0314 08:48:47.376035 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c05b39b-9236-4a8a-ab68-c424153678a6-kube-api-access-p7ncz" (OuterVolumeSpecName: "kube-api-access-p7ncz") pod "4c05b39b-9236-4a8a-ab68-c424153678a6" (UID: "4c05b39b-9236-4a8a-ab68-c424153678a6"). InnerVolumeSpecName "kube-api-access-p7ncz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:48:48 crc kubenswrapper[4886]: I0314 08:48:47.378333 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bd30816-7ba6-49db-8bfa-52b31cbf4de5-kube-api-access-g7nlm" (OuterVolumeSpecName: "kube-api-access-g7nlm") pod "1bd30816-7ba6-49db-8bfa-52b31cbf4de5" (UID: "1bd30816-7ba6-49db-8bfa-52b31cbf4de5"). InnerVolumeSpecName "kube-api-access-g7nlm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:48:48 crc kubenswrapper[4886]: I0314 08:48:47.434738 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6386cf32-15ff-4f2c-ae81-df8cd9c8b793" path="/var/lib/kubelet/pods/6386cf32-15ff-4f2c-ae81-df8cd9c8b793/volumes" Mar 14 08:48:48 crc kubenswrapper[4886]: I0314 08:48:47.476781 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7nlm\" (UniqueName: \"kubernetes.io/projected/1bd30816-7ba6-49db-8bfa-52b31cbf4de5-kube-api-access-g7nlm\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:48 crc kubenswrapper[4886]: I0314 08:48:47.482091 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7ncz\" (UniqueName: \"kubernetes.io/projected/4c05b39b-9236-4a8a-ab68-c424153678a6-kube-api-access-p7ncz\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:48 crc kubenswrapper[4886]: I0314 08:48:47.497390 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b13c5527-179a-440c-bca1-379cab773854","Type":"ContainerStarted","Data":"2623aaa84cae1f9a305ae97f65f57ac57c4d0d3b993a2b4e641077018621ab3c"} Mar 14 08:48:48 crc kubenswrapper[4886]: I0314 08:48:47.500308 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"28d6b363-8881-407e-b8e4-9fd7863b881c","Type":"ContainerStarted","Data":"350bee66930c3c908370e7b9c5cb15c68693d80d1ebe7ee35949652765be4b21"} Mar 14 08:48:48 crc kubenswrapper[4886]: I0314 08:48:47.502891 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-lqwf5" Mar 14 08:48:48 crc kubenswrapper[4886]: I0314 08:48:47.502988 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lqwf5" event={"ID":"4c05b39b-9236-4a8a-ab68-c424153678a6","Type":"ContainerDied","Data":"5e0e59799ffa595cdd211327a1775d63447c7ca149ca37dd7572e681871acaf6"} Mar 14 08:48:48 crc kubenswrapper[4886]: I0314 08:48:47.503032 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e0e59799ffa595cdd211327a1775d63447c7ca149ca37dd7572e681871acaf6" Mar 14 08:48:48 crc kubenswrapper[4886]: I0314 08:48:47.505197 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d883-account-create-update-98qk2" event={"ID":"1bd30816-7ba6-49db-8bfa-52b31cbf4de5","Type":"ContainerDied","Data":"a41e05108f07892a98af2791a61ab423c0dd2794f1f4637fd0da712c65028432"} Mar 14 08:48:48 crc kubenswrapper[4886]: I0314 08:48:47.505226 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a41e05108f07892a98af2791a61ab423c0dd2794f1f4637fd0da712c65028432" Mar 14 08:48:48 crc kubenswrapper[4886]: I0314 08:48:47.505296 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-d883-account-create-update-98qk2" Mar 14 08:48:48 crc kubenswrapper[4886]: I0314 08:48:47.533372 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=23.977067755 podStartE2EDuration="1m2.53335091s" podCreationTimestamp="2026-03-14 08:47:45 +0000 UTC" firstStartedPulling="2026-03-14 08:48:08.596046029 +0000 UTC m=+1223.844497666" lastFinishedPulling="2026-03-14 08:48:47.152329174 +0000 UTC m=+1262.400780821" observedRunningTime="2026-03-14 08:48:47.528206857 +0000 UTC m=+1262.776658504" watchObservedRunningTime="2026-03-14 08:48:47.53335091 +0000 UTC m=+1262.781802547" Mar 14 08:48:48 crc kubenswrapper[4886]: I0314 08:48:48.515671 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b13c5527-179a-440c-bca1-379cab773854","Type":"ContainerStarted","Data":"532c34f89c65044922e987599453dc6d210f73f9fdd67ac3e87100b43641fd16"} Mar 14 08:48:48 crc kubenswrapper[4886]: I0314 08:48:48.515719 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b13c5527-179a-440c-bca1-379cab773854","Type":"ContainerStarted","Data":"fe88dc82321cd609960cc5a80bb5e28702facd7d74a94cb9c57bf20f545a42b8"} Mar 14 08:48:48 crc kubenswrapper[4886]: I0314 08:48:48.515733 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b13c5527-179a-440c-bca1-379cab773854","Type":"ContainerStarted","Data":"58e2258d7b26df0417e22d6918a93c6cb8cf664a59aedae901d276d5d169dcfa"} Mar 14 08:48:48 crc kubenswrapper[4886]: I0314 08:48:48.522221 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-9k99l" podUID="dee9c638-1703-4b56-b366-13c6746d035c" containerName="ovn-controller" probeResult="failure" output=< Mar 14 08:48:48 crc kubenswrapper[4886]: ERROR - ovn-controller connection status is 'not connected', expecting 
'connected' status Mar 14 08:48:48 crc kubenswrapper[4886]: > Mar 14 08:48:48 crc kubenswrapper[4886]: I0314 08:48:48.905523 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-fjk6q"] Mar 14 08:48:49 crc kubenswrapper[4886]: I0314 08:48:49.018563 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ad06-account-create-update-mpb4t"] Mar 14 08:48:49 crc kubenswrapper[4886]: W0314 08:48:49.024424 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7c9025b_c8dc_498d_a0ea_b8770e22af45.slice/crio-2cace4b0636e1832d2f6a4054fa3d20f37f9ef8860013e9f491993a79f05507a WatchSource:0}: Error finding container 2cace4b0636e1832d2f6a4054fa3d20f37f9ef8860013e9f491993a79f05507a: Status 404 returned error can't find the container with id 2cace4b0636e1832d2f6a4054fa3d20f37f9ef8860013e9f491993a79f05507a Mar 14 08:48:49 crc kubenswrapper[4886]: I0314 08:48:49.040511 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 14 08:48:49 crc kubenswrapper[4886]: I0314 08:48:49.526464 4886 generic.go:334] "Generic (PLEG): container finished" podID="d7c9025b-c8dc-498d-a0ea-b8770e22af45" containerID="519858d0438c006aa8bb843bd113095c569f5380af74f59082ef80b372084f76" exitCode=0 Mar 14 08:48:49 crc kubenswrapper[4886]: I0314 08:48:49.526639 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ad06-account-create-update-mpb4t" event={"ID":"d7c9025b-c8dc-498d-a0ea-b8770e22af45","Type":"ContainerDied","Data":"519858d0438c006aa8bb843bd113095c569f5380af74f59082ef80b372084f76"} Mar 14 08:48:49 crc kubenswrapper[4886]: I0314 08:48:49.526822 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ad06-account-create-update-mpb4t" 
event={"ID":"d7c9025b-c8dc-498d-a0ea-b8770e22af45","Type":"ContainerStarted","Data":"2cace4b0636e1832d2f6a4054fa3d20f37f9ef8860013e9f491993a79f05507a"} Mar 14 08:48:49 crc kubenswrapper[4886]: I0314 08:48:49.530105 4886 generic.go:334] "Generic (PLEG): container finished" podID="11a9c950-955d-4afd-88c9-9c705ef619b6" containerID="b822ed51869e5d780121065f8994b9261798e74731ec04f6b70a39cea9109454" exitCode=0 Mar 14 08:48:49 crc kubenswrapper[4886]: I0314 08:48:49.530165 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fjk6q" event={"ID":"11a9c950-955d-4afd-88c9-9c705ef619b6","Type":"ContainerDied","Data":"b822ed51869e5d780121065f8994b9261798e74731ec04f6b70a39cea9109454"} Mar 14 08:48:49 crc kubenswrapper[4886]: I0314 08:48:49.530193 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fjk6q" event={"ID":"11a9c950-955d-4afd-88c9-9c705ef619b6","Type":"ContainerStarted","Data":"c0e708d821d9eb55b7afc851a1d5a64ed294980fd2fa7bc9a4211582bc00c787"} Mar 14 08:48:50 crc kubenswrapper[4886]: I0314 08:48:50.544320 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b13c5527-179a-440c-bca1-379cab773854","Type":"ContainerStarted","Data":"1227c20646da8eef193b1974389cb532ca7e76499ae67ed4d0b9c93caf3e78a6"} Mar 14 08:48:50 crc kubenswrapper[4886]: I0314 08:48:50.546622 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b13c5527-179a-440c-bca1-379cab773854","Type":"ContainerStarted","Data":"40a759c83176650efdeb7a39d754e22e13fe49065bcc5331ef928912d00290a9"} Mar 14 08:48:50 crc kubenswrapper[4886]: I0314 08:48:50.546721 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b13c5527-179a-440c-bca1-379cab773854","Type":"ContainerStarted","Data":"ac9e9427c086eab276e931c334efaaab01ad0c7ede039c9bbfafac726f219352"} Mar 14 08:48:50 crc kubenswrapper[4886]: I0314 08:48:50.904842 
4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ad06-account-create-update-mpb4t" Mar 14 08:48:50 crc kubenswrapper[4886]: I0314 08:48:50.934282 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fjk6q" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.075848 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7c9025b-c8dc-498d-a0ea-b8770e22af45-operator-scripts\") pod \"d7c9025b-c8dc-498d-a0ea-b8770e22af45\" (UID: \"d7c9025b-c8dc-498d-a0ea-b8770e22af45\") " Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.075993 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrqjm\" (UniqueName: \"kubernetes.io/projected/11a9c950-955d-4afd-88c9-9c705ef619b6-kube-api-access-qrqjm\") pod \"11a9c950-955d-4afd-88c9-9c705ef619b6\" (UID: \"11a9c950-955d-4afd-88c9-9c705ef619b6\") " Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.076021 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfkpj\" (UniqueName: \"kubernetes.io/projected/d7c9025b-c8dc-498d-a0ea-b8770e22af45-kube-api-access-xfkpj\") pod \"d7c9025b-c8dc-498d-a0ea-b8770e22af45\" (UID: \"d7c9025b-c8dc-498d-a0ea-b8770e22af45\") " Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.076048 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11a9c950-955d-4afd-88c9-9c705ef619b6-operator-scripts\") pod \"11a9c950-955d-4afd-88c9-9c705ef619b6\" (UID: \"11a9c950-955d-4afd-88c9-9c705ef619b6\") " Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.076794 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7c9025b-c8dc-498d-a0ea-b8770e22af45-operator-scripts" 
(OuterVolumeSpecName: "operator-scripts") pod "d7c9025b-c8dc-498d-a0ea-b8770e22af45" (UID: "d7c9025b-c8dc-498d-a0ea-b8770e22af45"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.076798 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11a9c950-955d-4afd-88c9-9c705ef619b6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "11a9c950-955d-4afd-88c9-9c705ef619b6" (UID: "11a9c950-955d-4afd-88c9-9c705ef619b6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.081204 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7c9025b-c8dc-498d-a0ea-b8770e22af45-kube-api-access-xfkpj" (OuterVolumeSpecName: "kube-api-access-xfkpj") pod "d7c9025b-c8dc-498d-a0ea-b8770e22af45" (UID: "d7c9025b-c8dc-498d-a0ea-b8770e22af45"). InnerVolumeSpecName "kube-api-access-xfkpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.081850 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11a9c950-955d-4afd-88c9-9c705ef619b6-kube-api-access-qrqjm" (OuterVolumeSpecName: "kube-api-access-qrqjm") pod "11a9c950-955d-4afd-88c9-9c705ef619b6" (UID: "11a9c950-955d-4afd-88c9-9c705ef619b6"). InnerVolumeSpecName "kube-api-access-qrqjm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.177379 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7c9025b-c8dc-498d-a0ea-b8770e22af45-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.177409 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrqjm\" (UniqueName: \"kubernetes.io/projected/11a9c950-955d-4afd-88c9-9c705ef619b6-kube-api-access-qrqjm\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.177422 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfkpj\" (UniqueName: \"kubernetes.io/projected/d7c9025b-c8dc-498d-a0ea-b8770e22af45-kube-api-access-xfkpj\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.177431 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11a9c950-955d-4afd-88c9-9c705ef619b6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.482974 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-jfvv5"] Mar 14 08:48:51 crc kubenswrapper[4886]: E0314 08:48:51.483599 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a9c950-955d-4afd-88c9-9c705ef619b6" containerName="mariadb-database-create" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.483615 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a9c950-955d-4afd-88c9-9c705ef619b6" containerName="mariadb-database-create" Mar 14 08:48:51 crc kubenswrapper[4886]: E0314 08:48:51.483633 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bd30816-7ba6-49db-8bfa-52b31cbf4de5" containerName="mariadb-account-create-update" Mar 14 08:48:51 crc 
kubenswrapper[4886]: I0314 08:48:51.483640 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bd30816-7ba6-49db-8bfa-52b31cbf4de5" containerName="mariadb-account-create-update" Mar 14 08:48:51 crc kubenswrapper[4886]: E0314 08:48:51.483655 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c05b39b-9236-4a8a-ab68-c424153678a6" containerName="mariadb-database-create" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.483661 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c05b39b-9236-4a8a-ab68-c424153678a6" containerName="mariadb-database-create" Mar 14 08:48:51 crc kubenswrapper[4886]: E0314 08:48:51.483677 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7c9025b-c8dc-498d-a0ea-b8770e22af45" containerName="mariadb-account-create-update" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.483682 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c9025b-c8dc-498d-a0ea-b8770e22af45" containerName="mariadb-account-create-update" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.483830 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bd30816-7ba6-49db-8bfa-52b31cbf4de5" containerName="mariadb-account-create-update" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.483850 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c05b39b-9236-4a8a-ab68-c424153678a6" containerName="mariadb-database-create" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.483861 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a9c950-955d-4afd-88c9-9c705ef619b6" containerName="mariadb-database-create" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.483871 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7c9025b-c8dc-498d-a0ea-b8770e22af45" containerName="mariadb-account-create-update" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.484459 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jfvv5" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.491654 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jfvv5"] Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.492638 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.556264 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b13c5527-179a-440c-bca1-379cab773854","Type":"ContainerStarted","Data":"f8f3b9fd765a390f32b59b27f7e351d08ee9c2d9c186feb12d605a1cf328dce8"} Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.557709 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fjk6q" event={"ID":"11a9c950-955d-4afd-88c9-9c705ef619b6","Type":"ContainerDied","Data":"c0e708d821d9eb55b7afc851a1d5a64ed294980fd2fa7bc9a4211582bc00c787"} Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.557745 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0e708d821d9eb55b7afc851a1d5a64ed294980fd2fa7bc9a4211582bc00c787" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.557727 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-fjk6q" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.559237 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ad06-account-create-update-mpb4t" event={"ID":"d7c9025b-c8dc-498d-a0ea-b8770e22af45","Type":"ContainerDied","Data":"2cace4b0636e1832d2f6a4054fa3d20f37f9ef8860013e9f491993a79f05507a"} Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.559331 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cace4b0636e1832d2f6a4054fa3d20f37f9ef8860013e9f491993a79f05507a" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.559274 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ad06-account-create-update-mpb4t" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.583639 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672-operator-scripts\") pod \"root-account-create-update-jfvv5\" (UID: \"0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672\") " pod="openstack/root-account-create-update-jfvv5" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.583710 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdm66\" (UniqueName: \"kubernetes.io/projected/0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672-kube-api-access-bdm66\") pod \"root-account-create-update-jfvv5\" (UID: \"0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672\") " pod="openstack/root-account-create-update-jfvv5" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.587177 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.685641 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672-operator-scripts\") pod \"root-account-create-update-jfvv5\" (UID: \"0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672\") " pod="openstack/root-account-create-update-jfvv5" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.685756 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdm66\" (UniqueName: \"kubernetes.io/projected/0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672-kube-api-access-bdm66\") pod \"root-account-create-update-jfvv5\" (UID: \"0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672\") " pod="openstack/root-account-create-update-jfvv5" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.687008 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672-operator-scripts\") pod \"root-account-create-update-jfvv5\" (UID: \"0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672\") " pod="openstack/root-account-create-update-jfvv5" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.706712 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdm66\" (UniqueName: \"kubernetes.io/projected/0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672-kube-api-access-bdm66\") pod \"root-account-create-update-jfvv5\" (UID: \"0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672\") " pod="openstack/root-account-create-update-jfvv5" Mar 14 08:48:51 crc kubenswrapper[4886]: I0314 08:48:51.809249 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jfvv5" Mar 14 08:48:52 crc kubenswrapper[4886]: I0314 08:48:52.281857 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jfvv5"] Mar 14 08:48:52 crc kubenswrapper[4886]: I0314 08:48:52.574984 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jfvv5" event={"ID":"0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672","Type":"ContainerStarted","Data":"361605668fc2a6824295f18cabd88f91abd723bb12a6c95affd9b0cf92f83360"} Mar 14 08:48:52 crc kubenswrapper[4886]: I0314 08:48:52.575362 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jfvv5" event={"ID":"0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672","Type":"ContainerStarted","Data":"d25b36fd9e6d0d0fd7d62a80b9c2ba25e11b68f70c50271e11fd87329916364e"} Mar 14 08:48:52 crc kubenswrapper[4886]: I0314 08:48:52.584717 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b13c5527-179a-440c-bca1-379cab773854","Type":"ContainerStarted","Data":"94116085182f04bb14bc91eca1d465e72bccd51d43faf395c5b181a32e7f54c8"} Mar 14 08:48:52 crc kubenswrapper[4886]: I0314 08:48:52.584784 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b13c5527-179a-440c-bca1-379cab773854","Type":"ContainerStarted","Data":"04b257c49e31b0d1e1cf3f4e5bf5b0d9c9610b94271eaba8825d37e77118ce01"} Mar 14 08:48:52 crc kubenswrapper[4886]: I0314 08:48:52.584798 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b13c5527-179a-440c-bca1-379cab773854","Type":"ContainerStarted","Data":"e2c9783b5b14b968051e16939b6bd9187cf7feb789ec6c676446188749534bc1"} Mar 14 08:48:52 crc kubenswrapper[4886]: I0314 08:48:52.596917 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-jfvv5" 
podStartSLOduration=1.596894644 podStartE2EDuration="1.596894644s" podCreationTimestamp="2026-03-14 08:48:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:48:52.592467332 +0000 UTC m=+1267.840918969" watchObservedRunningTime="2026-03-14 08:48:52.596894644 +0000 UTC m=+1267.845346301" Mar 14 08:48:53 crc kubenswrapper[4886]: I0314 08:48:53.409049 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-w889t"] Mar 14 08:48:53 crc kubenswrapper[4886]: I0314 08:48:53.410636 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-w889t" Mar 14 08:48:53 crc kubenswrapper[4886]: I0314 08:48:53.413400 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 14 08:48:53 crc kubenswrapper[4886]: I0314 08:48:53.413638 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-msq2v" Mar 14 08:48:53 crc kubenswrapper[4886]: I0314 08:48:53.433331 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-w889t"] Mar 14 08:48:53 crc kubenswrapper[4886]: I0314 08:48:53.513379 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qd56\" (UniqueName: \"kubernetes.io/projected/2be4cce3-ae51-4d07-a9d9-ccc6152774b5-kube-api-access-8qd56\") pod \"glance-db-sync-w889t\" (UID: \"2be4cce3-ae51-4d07-a9d9-ccc6152774b5\") " pod="openstack/glance-db-sync-w889t" Mar 14 08:48:53 crc kubenswrapper[4886]: I0314 08:48:53.513467 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2be4cce3-ae51-4d07-a9d9-ccc6152774b5-config-data\") pod \"glance-db-sync-w889t\" (UID: \"2be4cce3-ae51-4d07-a9d9-ccc6152774b5\") " pod="openstack/glance-db-sync-w889t" 
Mar 14 08:48:53 crc kubenswrapper[4886]: I0314 08:48:53.513498 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2be4cce3-ae51-4d07-a9d9-ccc6152774b5-db-sync-config-data\") pod \"glance-db-sync-w889t\" (UID: \"2be4cce3-ae51-4d07-a9d9-ccc6152774b5\") " pod="openstack/glance-db-sync-w889t" Mar 14 08:48:53 crc kubenswrapper[4886]: I0314 08:48:53.513522 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2be4cce3-ae51-4d07-a9d9-ccc6152774b5-combined-ca-bundle\") pod \"glance-db-sync-w889t\" (UID: \"2be4cce3-ae51-4d07-a9d9-ccc6152774b5\") " pod="openstack/glance-db-sync-w889t" Mar 14 08:48:53 crc kubenswrapper[4886]: I0314 08:48:53.528787 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-9k99l" podUID="dee9c638-1703-4b56-b366-13c6746d035c" containerName="ovn-controller" probeResult="failure" output=< Mar 14 08:48:53 crc kubenswrapper[4886]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 14 08:48:53 crc kubenswrapper[4886]: > Mar 14 08:48:53 crc kubenswrapper[4886]: I0314 08:48:53.608012 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b13c5527-179a-440c-bca1-379cab773854","Type":"ContainerStarted","Data":"dcaacc6b4357bc316ad975353aa091f877c8688ccbddeae740b9aa904babc43f"} Mar 14 08:48:53 crc kubenswrapper[4886]: I0314 08:48:53.608086 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b13c5527-179a-440c-bca1-379cab773854","Type":"ContainerStarted","Data":"04cd656d848f37154c61d5faeaf47856a364bca44ba0a7258b884276f7183c27"} Mar 14 08:48:53 crc kubenswrapper[4886]: I0314 08:48:53.608109 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"b13c5527-179a-440c-bca1-379cab773854","Type":"ContainerStarted","Data":"93888c18358639afa93f04b8b31bed232617e3b7c26768b8079ed28edcd40380"} Mar 14 08:48:53 crc kubenswrapper[4886]: I0314 08:48:53.608165 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b13c5527-179a-440c-bca1-379cab773854","Type":"ContainerStarted","Data":"83266b99556ee3562333e9d8eb84c9f6a60d87a9f0a3fb7a8eea6e2eb8b66cd4"} Mar 14 08:48:53 crc kubenswrapper[4886]: I0314 08:48:53.611731 4886 generic.go:334] "Generic (PLEG): container finished" podID="0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672" containerID="361605668fc2a6824295f18cabd88f91abd723bb12a6c95affd9b0cf92f83360" exitCode=0 Mar 14 08:48:53 crc kubenswrapper[4886]: I0314 08:48:53.611774 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jfvv5" event={"ID":"0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672","Type":"ContainerDied","Data":"361605668fc2a6824295f18cabd88f91abd723bb12a6c95affd9b0cf92f83360"} Mar 14 08:48:53 crc kubenswrapper[4886]: I0314 08:48:53.614778 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qd56\" (UniqueName: \"kubernetes.io/projected/2be4cce3-ae51-4d07-a9d9-ccc6152774b5-kube-api-access-8qd56\") pod \"glance-db-sync-w889t\" (UID: \"2be4cce3-ae51-4d07-a9d9-ccc6152774b5\") " pod="openstack/glance-db-sync-w889t" Mar 14 08:48:53 crc kubenswrapper[4886]: I0314 08:48:53.614876 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2be4cce3-ae51-4d07-a9d9-ccc6152774b5-config-data\") pod \"glance-db-sync-w889t\" (UID: \"2be4cce3-ae51-4d07-a9d9-ccc6152774b5\") " pod="openstack/glance-db-sync-w889t" Mar 14 08:48:53 crc kubenswrapper[4886]: I0314 08:48:53.614929 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/2be4cce3-ae51-4d07-a9d9-ccc6152774b5-db-sync-config-data\") pod \"glance-db-sync-w889t\" (UID: \"2be4cce3-ae51-4d07-a9d9-ccc6152774b5\") " pod="openstack/glance-db-sync-w889t" Mar 14 08:48:53 crc kubenswrapper[4886]: I0314 08:48:53.614966 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2be4cce3-ae51-4d07-a9d9-ccc6152774b5-combined-ca-bundle\") pod \"glance-db-sync-w889t\" (UID: \"2be4cce3-ae51-4d07-a9d9-ccc6152774b5\") " pod="openstack/glance-db-sync-w889t" Mar 14 08:48:53 crc kubenswrapper[4886]: I0314 08:48:53.620879 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2be4cce3-ae51-4d07-a9d9-ccc6152774b5-db-sync-config-data\") pod \"glance-db-sync-w889t\" (UID: \"2be4cce3-ae51-4d07-a9d9-ccc6152774b5\") " pod="openstack/glance-db-sync-w889t" Mar 14 08:48:53 crc kubenswrapper[4886]: I0314 08:48:53.621034 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2be4cce3-ae51-4d07-a9d9-ccc6152774b5-combined-ca-bundle\") pod \"glance-db-sync-w889t\" (UID: \"2be4cce3-ae51-4d07-a9d9-ccc6152774b5\") " pod="openstack/glance-db-sync-w889t" Mar 14 08:48:53 crc kubenswrapper[4886]: I0314 08:48:53.633387 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2be4cce3-ae51-4d07-a9d9-ccc6152774b5-config-data\") pod \"glance-db-sync-w889t\" (UID: \"2be4cce3-ae51-4d07-a9d9-ccc6152774b5\") " pod="openstack/glance-db-sync-w889t" Mar 14 08:48:53 crc kubenswrapper[4886]: I0314 08:48:53.637058 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qd56\" (UniqueName: \"kubernetes.io/projected/2be4cce3-ae51-4d07-a9d9-ccc6152774b5-kube-api-access-8qd56\") pod \"glance-db-sync-w889t\" (UID: 
\"2be4cce3-ae51-4d07-a9d9-ccc6152774b5\") " pod="openstack/glance-db-sync-w889t" Mar 14 08:48:53 crc kubenswrapper[4886]: I0314 08:48:53.664248 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=19.926694586 podStartE2EDuration="28.664227958s" podCreationTimestamp="2026-03-14 08:48:25 +0000 UTC" firstStartedPulling="2026-03-14 08:48:43.154825918 +0000 UTC m=+1258.403277555" lastFinishedPulling="2026-03-14 08:48:51.89235929 +0000 UTC m=+1267.140810927" observedRunningTime="2026-03-14 08:48:53.657943014 +0000 UTC m=+1268.906394661" watchObservedRunningTime="2026-03-14 08:48:53.664227958 +0000 UTC m=+1268.912679605" Mar 14 08:48:53 crc kubenswrapper[4886]: I0314 08:48:53.745194 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-w889t" Mar 14 08:48:53 crc kubenswrapper[4886]: I0314 08:48:53.966973 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-kz2x6"] Mar 14 08:48:53 crc kubenswrapper[4886]: I0314 08:48:53.968770 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" Mar 14 08:48:53 crc kubenswrapper[4886]: I0314 08:48:53.984829 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 14 08:48:54 crc kubenswrapper[4886]: I0314 08:48:54.000284 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-kz2x6"] Mar 14 08:48:54 crc kubenswrapper[4886]: I0314 08:48:54.128288 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-kz2x6\" (UID: \"1b496ccf-4298-4759-aacf-e115101cb90d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" Mar 14 08:48:54 crc kubenswrapper[4886]: I0314 08:48:54.128744 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-config\") pod \"dnsmasq-dns-6d5b6d6b67-kz2x6\" (UID: \"1b496ccf-4298-4759-aacf-e115101cb90d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" Mar 14 08:48:54 crc kubenswrapper[4886]: I0314 08:48:54.128776 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-kz2x6\" (UID: \"1b496ccf-4298-4759-aacf-e115101cb90d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" Mar 14 08:48:54 crc kubenswrapper[4886]: I0314 08:48:54.128806 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-kz2x6\" (UID: \"1b496ccf-4298-4759-aacf-e115101cb90d\") " 
pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" Mar 14 08:48:54 crc kubenswrapper[4886]: I0314 08:48:54.128837 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qszf8\" (UniqueName: \"kubernetes.io/projected/1b496ccf-4298-4759-aacf-e115101cb90d-kube-api-access-qszf8\") pod \"dnsmasq-dns-6d5b6d6b67-kz2x6\" (UID: \"1b496ccf-4298-4759-aacf-e115101cb90d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" Mar 14 08:48:54 crc kubenswrapper[4886]: I0314 08:48:54.128860 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-kz2x6\" (UID: \"1b496ccf-4298-4759-aacf-e115101cb90d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" Mar 14 08:48:54 crc kubenswrapper[4886]: I0314 08:48:54.158567 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-w889t"] Mar 14 08:48:54 crc kubenswrapper[4886]: I0314 08:48:54.229964 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-kz2x6\" (UID: \"1b496ccf-4298-4759-aacf-e115101cb90d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" Mar 14 08:48:54 crc kubenswrapper[4886]: I0314 08:48:54.230026 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-kz2x6\" (UID: \"1b496ccf-4298-4759-aacf-e115101cb90d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" Mar 14 08:48:54 crc kubenswrapper[4886]: I0314 08:48:54.230055 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qszf8\" 
(UniqueName: \"kubernetes.io/projected/1b496ccf-4298-4759-aacf-e115101cb90d-kube-api-access-qszf8\") pod \"dnsmasq-dns-6d5b6d6b67-kz2x6\" (UID: \"1b496ccf-4298-4759-aacf-e115101cb90d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" Mar 14 08:48:54 crc kubenswrapper[4886]: I0314 08:48:54.230074 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-kz2x6\" (UID: \"1b496ccf-4298-4759-aacf-e115101cb90d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" Mar 14 08:48:54 crc kubenswrapper[4886]: I0314 08:48:54.230182 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-kz2x6\" (UID: \"1b496ccf-4298-4759-aacf-e115101cb90d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" Mar 14 08:48:54 crc kubenswrapper[4886]: I0314 08:48:54.230235 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-config\") pod \"dnsmasq-dns-6d5b6d6b67-kz2x6\" (UID: \"1b496ccf-4298-4759-aacf-e115101cb90d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" Mar 14 08:48:54 crc kubenswrapper[4886]: I0314 08:48:54.231183 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-config\") pod \"dnsmasq-dns-6d5b6d6b67-kz2x6\" (UID: \"1b496ccf-4298-4759-aacf-e115101cb90d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" Mar 14 08:48:54 crc kubenswrapper[4886]: I0314 08:48:54.232046 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-dns-svc\") pod 
\"dnsmasq-dns-6d5b6d6b67-kz2x6\" (UID: \"1b496ccf-4298-4759-aacf-e115101cb90d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" Mar 14 08:48:54 crc kubenswrapper[4886]: I0314 08:48:54.232352 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-kz2x6\" (UID: \"1b496ccf-4298-4759-aacf-e115101cb90d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" Mar 14 08:48:54 crc kubenswrapper[4886]: I0314 08:48:54.232387 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-kz2x6\" (UID: \"1b496ccf-4298-4759-aacf-e115101cb90d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" Mar 14 08:48:54 crc kubenswrapper[4886]: I0314 08:48:54.233049 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-kz2x6\" (UID: \"1b496ccf-4298-4759-aacf-e115101cb90d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" Mar 14 08:48:54 crc kubenswrapper[4886]: I0314 08:48:54.251617 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qszf8\" (UniqueName: \"kubernetes.io/projected/1b496ccf-4298-4759-aacf-e115101cb90d-kube-api-access-qszf8\") pod \"dnsmasq-dns-6d5b6d6b67-kz2x6\" (UID: \"1b496ccf-4298-4759-aacf-e115101cb90d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" Mar 14 08:48:54 crc kubenswrapper[4886]: I0314 08:48:54.302072 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" Mar 14 08:48:54 crc kubenswrapper[4886]: I0314 08:48:54.626806 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-w889t" event={"ID":"2be4cce3-ae51-4d07-a9d9-ccc6152774b5","Type":"ContainerStarted","Data":"84857b6315310ce837133c33b1f54e9ea50abed181d62ac2452c687e835ceb4e"} Mar 14 08:48:54 crc kubenswrapper[4886]: I0314 08:48:54.764816 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-kz2x6"] Mar 14 08:48:54 crc kubenswrapper[4886]: I0314 08:48:54.979665 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jfvv5" Mar 14 08:48:55 crc kubenswrapper[4886]: I0314 08:48:55.050143 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdm66\" (UniqueName: \"kubernetes.io/projected/0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672-kube-api-access-bdm66\") pod \"0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672\" (UID: \"0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672\") " Mar 14 08:48:55 crc kubenswrapper[4886]: I0314 08:48:55.050354 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672-operator-scripts\") pod \"0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672\" (UID: \"0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672\") " Mar 14 08:48:55 crc kubenswrapper[4886]: I0314 08:48:55.054944 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672" (UID: "0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:48:55 crc kubenswrapper[4886]: I0314 08:48:55.057157 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672-kube-api-access-bdm66" (OuterVolumeSpecName: "kube-api-access-bdm66") pod "0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672" (UID: "0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672"). InnerVolumeSpecName "kube-api-access-bdm66". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:48:55 crc kubenswrapper[4886]: I0314 08:48:55.153827 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdm66\" (UniqueName: \"kubernetes.io/projected/0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672-kube-api-access-bdm66\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:55 crc kubenswrapper[4886]: I0314 08:48:55.153893 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:48:55 crc kubenswrapper[4886]: I0314 08:48:55.642635 4886 generic.go:334] "Generic (PLEG): container finished" podID="1b496ccf-4298-4759-aacf-e115101cb90d" containerID="c557d68565c355725dc833f4c34233122fe9b9c234cd250baae8028c09843185" exitCode=0 Mar 14 08:48:55 crc kubenswrapper[4886]: I0314 08:48:55.642702 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" event={"ID":"1b496ccf-4298-4759-aacf-e115101cb90d","Type":"ContainerDied","Data":"c557d68565c355725dc833f4c34233122fe9b9c234cd250baae8028c09843185"} Mar 14 08:48:55 crc kubenswrapper[4886]: I0314 08:48:55.642730 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" event={"ID":"1b496ccf-4298-4759-aacf-e115101cb90d","Type":"ContainerStarted","Data":"d5bca59bbdfa864a46d4b7c709a26483c8f0b99dd6683d68c6852891d1a93d37"} Mar 14 08:48:55 crc kubenswrapper[4886]: 
I0314 08:48:55.646848 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jfvv5" event={"ID":"0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672","Type":"ContainerDied","Data":"d25b36fd9e6d0d0fd7d62a80b9c2ba25e11b68f70c50271e11fd87329916364e"} Mar 14 08:48:55 crc kubenswrapper[4886]: I0314 08:48:55.646906 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d25b36fd9e6d0d0fd7d62a80b9c2ba25e11b68f70c50271e11fd87329916364e" Mar 14 08:48:55 crc kubenswrapper[4886]: I0314 08:48:55.647069 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jfvv5" Mar 14 08:48:56 crc kubenswrapper[4886]: I0314 08:48:56.660546 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" event={"ID":"1b496ccf-4298-4759-aacf-e115101cb90d","Type":"ContainerStarted","Data":"1b86dd8377d7bbc4a602b82fc6e615137da41ed100266616b558bb02b1da6146"} Mar 14 08:48:56 crc kubenswrapper[4886]: I0314 08:48:56.661087 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" Mar 14 08:48:56 crc kubenswrapper[4886]: I0314 08:48:56.686657 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" podStartSLOduration=3.6866393520000003 podStartE2EDuration="3.686639352s" podCreationTimestamp="2026-03-14 08:48:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:48:56.680672336 +0000 UTC m=+1271.929124003" watchObservedRunningTime="2026-03-14 08:48:56.686639352 +0000 UTC m=+1271.935090989" Mar 14 08:48:58 crc kubenswrapper[4886]: I0314 08:48:58.530605 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-9k99l" podUID="dee9c638-1703-4b56-b366-13c6746d035c" containerName="ovn-controller" 
probeResult="failure" output=< Mar 14 08:48:58 crc kubenswrapper[4886]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 14 08:48:58 crc kubenswrapper[4886]: > Mar 14 08:48:58 crc kubenswrapper[4886]: I0314 08:48:58.553618 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-slbpm" Mar 14 08:48:58 crc kubenswrapper[4886]: I0314 08:48:58.557044 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-slbpm" Mar 14 08:48:58 crc kubenswrapper[4886]: I0314 08:48:58.810994 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9k99l-config-5cl58"] Mar 14 08:48:58 crc kubenswrapper[4886]: E0314 08:48:58.811855 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672" containerName="mariadb-account-create-update" Mar 14 08:48:58 crc kubenswrapper[4886]: I0314 08:48:58.811967 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672" containerName="mariadb-account-create-update" Mar 14 08:48:58 crc kubenswrapper[4886]: I0314 08:48:58.812898 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672" containerName="mariadb-account-create-update" Mar 14 08:48:58 crc kubenswrapper[4886]: I0314 08:48:58.815326 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9k99l-config-5cl58" Mar 14 08:48:58 crc kubenswrapper[4886]: I0314 08:48:58.817691 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 14 08:48:58 crc kubenswrapper[4886]: I0314 08:48:58.829185 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9k99l-config-5cl58"] Mar 14 08:48:58 crc kubenswrapper[4886]: I0314 08:48:58.953044 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8c653576-755d-4704-a875-5782eab04795-additional-scripts\") pod \"ovn-controller-9k99l-config-5cl58\" (UID: \"8c653576-755d-4704-a875-5782eab04795\") " pod="openstack/ovn-controller-9k99l-config-5cl58" Mar 14 08:48:58 crc kubenswrapper[4886]: I0314 08:48:58.953103 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c653576-755d-4704-a875-5782eab04795-var-run-ovn\") pod \"ovn-controller-9k99l-config-5cl58\" (UID: \"8c653576-755d-4704-a875-5782eab04795\") " pod="openstack/ovn-controller-9k99l-config-5cl58" Mar 14 08:48:58 crc kubenswrapper[4886]: I0314 08:48:58.953189 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c653576-755d-4704-a875-5782eab04795-var-run\") pod \"ovn-controller-9k99l-config-5cl58\" (UID: \"8c653576-755d-4704-a875-5782eab04795\") " pod="openstack/ovn-controller-9k99l-config-5cl58" Mar 14 08:48:58 crc kubenswrapper[4886]: I0314 08:48:58.953213 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhcpx\" (UniqueName: \"kubernetes.io/projected/8c653576-755d-4704-a875-5782eab04795-kube-api-access-bhcpx\") pod \"ovn-controller-9k99l-config-5cl58\" (UID: 
\"8c653576-755d-4704-a875-5782eab04795\") " pod="openstack/ovn-controller-9k99l-config-5cl58" Mar 14 08:48:58 crc kubenswrapper[4886]: I0314 08:48:58.953231 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8c653576-755d-4704-a875-5782eab04795-var-log-ovn\") pod \"ovn-controller-9k99l-config-5cl58\" (UID: \"8c653576-755d-4704-a875-5782eab04795\") " pod="openstack/ovn-controller-9k99l-config-5cl58" Mar 14 08:48:58 crc kubenswrapper[4886]: I0314 08:48:58.953300 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c653576-755d-4704-a875-5782eab04795-scripts\") pod \"ovn-controller-9k99l-config-5cl58\" (UID: \"8c653576-755d-4704-a875-5782eab04795\") " pod="openstack/ovn-controller-9k99l-config-5cl58" Mar 14 08:48:59 crc kubenswrapper[4886]: I0314 08:48:59.054846 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c653576-755d-4704-a875-5782eab04795-scripts\") pod \"ovn-controller-9k99l-config-5cl58\" (UID: \"8c653576-755d-4704-a875-5782eab04795\") " pod="openstack/ovn-controller-9k99l-config-5cl58" Mar 14 08:48:59 crc kubenswrapper[4886]: I0314 08:48:59.054901 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8c653576-755d-4704-a875-5782eab04795-additional-scripts\") pod \"ovn-controller-9k99l-config-5cl58\" (UID: \"8c653576-755d-4704-a875-5782eab04795\") " pod="openstack/ovn-controller-9k99l-config-5cl58" Mar 14 08:48:59 crc kubenswrapper[4886]: I0314 08:48:59.054936 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c653576-755d-4704-a875-5782eab04795-var-run-ovn\") pod \"ovn-controller-9k99l-config-5cl58\" 
(UID: \"8c653576-755d-4704-a875-5782eab04795\") " pod="openstack/ovn-controller-9k99l-config-5cl58" Mar 14 08:48:59 crc kubenswrapper[4886]: I0314 08:48:59.054986 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c653576-755d-4704-a875-5782eab04795-var-run\") pod \"ovn-controller-9k99l-config-5cl58\" (UID: \"8c653576-755d-4704-a875-5782eab04795\") " pod="openstack/ovn-controller-9k99l-config-5cl58" Mar 14 08:48:59 crc kubenswrapper[4886]: I0314 08:48:59.055010 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhcpx\" (UniqueName: \"kubernetes.io/projected/8c653576-755d-4704-a875-5782eab04795-kube-api-access-bhcpx\") pod \"ovn-controller-9k99l-config-5cl58\" (UID: \"8c653576-755d-4704-a875-5782eab04795\") " pod="openstack/ovn-controller-9k99l-config-5cl58" Mar 14 08:48:59 crc kubenswrapper[4886]: I0314 08:48:59.055034 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8c653576-755d-4704-a875-5782eab04795-var-log-ovn\") pod \"ovn-controller-9k99l-config-5cl58\" (UID: \"8c653576-755d-4704-a875-5782eab04795\") " pod="openstack/ovn-controller-9k99l-config-5cl58" Mar 14 08:48:59 crc kubenswrapper[4886]: I0314 08:48:59.055343 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8c653576-755d-4704-a875-5782eab04795-var-log-ovn\") pod \"ovn-controller-9k99l-config-5cl58\" (UID: \"8c653576-755d-4704-a875-5782eab04795\") " pod="openstack/ovn-controller-9k99l-config-5cl58" Mar 14 08:48:59 crc kubenswrapper[4886]: I0314 08:48:59.055356 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c653576-755d-4704-a875-5782eab04795-var-run\") pod \"ovn-controller-9k99l-config-5cl58\" (UID: 
\"8c653576-755d-4704-a875-5782eab04795\") " pod="openstack/ovn-controller-9k99l-config-5cl58" Mar 14 08:48:59 crc kubenswrapper[4886]: I0314 08:48:59.055465 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c653576-755d-4704-a875-5782eab04795-var-run-ovn\") pod \"ovn-controller-9k99l-config-5cl58\" (UID: \"8c653576-755d-4704-a875-5782eab04795\") " pod="openstack/ovn-controller-9k99l-config-5cl58" Mar 14 08:48:59 crc kubenswrapper[4886]: I0314 08:48:59.056260 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8c653576-755d-4704-a875-5782eab04795-additional-scripts\") pod \"ovn-controller-9k99l-config-5cl58\" (UID: \"8c653576-755d-4704-a875-5782eab04795\") " pod="openstack/ovn-controller-9k99l-config-5cl58" Mar 14 08:48:59 crc kubenswrapper[4886]: I0314 08:48:59.057185 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c653576-755d-4704-a875-5782eab04795-scripts\") pod \"ovn-controller-9k99l-config-5cl58\" (UID: \"8c653576-755d-4704-a875-5782eab04795\") " pod="openstack/ovn-controller-9k99l-config-5cl58" Mar 14 08:48:59 crc kubenswrapper[4886]: I0314 08:48:59.084480 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhcpx\" (UniqueName: \"kubernetes.io/projected/8c653576-755d-4704-a875-5782eab04795-kube-api-access-bhcpx\") pod \"ovn-controller-9k99l-config-5cl58\" (UID: \"8c653576-755d-4704-a875-5782eab04795\") " pod="openstack/ovn-controller-9k99l-config-5cl58" Mar 14 08:48:59 crc kubenswrapper[4886]: I0314 08:48:59.145307 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9k99l-config-5cl58" Mar 14 08:48:59 crc kubenswrapper[4886]: I0314 08:48:59.510303 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 14 08:48:59 crc kubenswrapper[4886]: I0314 08:48:59.817366 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-jgb85"] Mar 14 08:48:59 crc kubenswrapper[4886]: I0314 08:48:59.818414 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-jgb85" Mar 14 08:48:59 crc kubenswrapper[4886]: I0314 08:48:59.822836 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-tn7ss" Mar 14 08:48:59 crc kubenswrapper[4886]: I0314 08:48:59.823579 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Mar 14 08:48:59 crc kubenswrapper[4886]: I0314 08:48:59.845952 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-jgb85"] Mar 14 08:48:59 crc kubenswrapper[4886]: I0314 08:48:59.893796 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-shtsx"] Mar 14 08:48:59 crc kubenswrapper[4886]: I0314 08:48:59.895137 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-shtsx" Mar 14 08:48:59 crc kubenswrapper[4886]: I0314 08:48:59.905656 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-shtsx"] Mar 14 08:48:59 crc kubenswrapper[4886]: I0314 08:48:59.975019 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24sq9\" (UniqueName: \"kubernetes.io/projected/99c6cc9d-015a-4e33-8ded-c912cb52dde2-kube-api-access-24sq9\") pod \"cinder-db-create-shtsx\" (UID: \"99c6cc9d-015a-4e33-8ded-c912cb52dde2\") " pod="openstack/cinder-db-create-shtsx" Mar 14 08:48:59 crc kubenswrapper[4886]: I0314 08:48:59.975170 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fb94c19d-031e-44b6-bdaa-39141d037b36-db-sync-config-data\") pod \"watcher-db-sync-jgb85\" (UID: \"fb94c19d-031e-44b6-bdaa-39141d037b36\") " pod="openstack/watcher-db-sync-jgb85" Mar 14 08:48:59 crc kubenswrapper[4886]: I0314 08:48:59.975242 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb94c19d-031e-44b6-bdaa-39141d037b36-config-data\") pod \"watcher-db-sync-jgb85\" (UID: \"fb94c19d-031e-44b6-bdaa-39141d037b36\") " pod="openstack/watcher-db-sync-jgb85" Mar 14 08:48:59 crc kubenswrapper[4886]: I0314 08:48:59.975350 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tzvj\" (UniqueName: \"kubernetes.io/projected/fb94c19d-031e-44b6-bdaa-39141d037b36-kube-api-access-4tzvj\") pod \"watcher-db-sync-jgb85\" (UID: \"fb94c19d-031e-44b6-bdaa-39141d037b36\") " pod="openstack/watcher-db-sync-jgb85" Mar 14 08:48:59 crc kubenswrapper[4886]: I0314 08:48:59.975606 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99c6cc9d-015a-4e33-8ded-c912cb52dde2-operator-scripts\") pod \"cinder-db-create-shtsx\" (UID: \"99c6cc9d-015a-4e33-8ded-c912cb52dde2\") " pod="openstack/cinder-db-create-shtsx" Mar 14 08:48:59 crc kubenswrapper[4886]: I0314 08:48:59.975694 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb94c19d-031e-44b6-bdaa-39141d037b36-combined-ca-bundle\") pod \"watcher-db-sync-jgb85\" (UID: \"fb94c19d-031e-44b6-bdaa-39141d037b36\") " pod="openstack/watcher-db-sync-jgb85" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.055827 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-r8f78"] Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.057290 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-r8f78" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.087327 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-daf3-account-create-update-zh48j"] Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.087343 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tzvj\" (UniqueName: \"kubernetes.io/projected/fb94c19d-031e-44b6-bdaa-39141d037b36-kube-api-access-4tzvj\") pod \"watcher-db-sync-jgb85\" (UID: \"fb94c19d-031e-44b6-bdaa-39141d037b36\") " pod="openstack/watcher-db-sync-jgb85" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.087849 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99c6cc9d-015a-4e33-8ded-c912cb52dde2-operator-scripts\") pod \"cinder-db-create-shtsx\" (UID: \"99c6cc9d-015a-4e33-8ded-c912cb52dde2\") " pod="openstack/cinder-db-create-shtsx" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.087891 4886 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb94c19d-031e-44b6-bdaa-39141d037b36-combined-ca-bundle\") pod \"watcher-db-sync-jgb85\" (UID: \"fb94c19d-031e-44b6-bdaa-39141d037b36\") " pod="openstack/watcher-db-sync-jgb85" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.087980 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24sq9\" (UniqueName: \"kubernetes.io/projected/99c6cc9d-015a-4e33-8ded-c912cb52dde2-kube-api-access-24sq9\") pod \"cinder-db-create-shtsx\" (UID: \"99c6cc9d-015a-4e33-8ded-c912cb52dde2\") " pod="openstack/cinder-db-create-shtsx" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.088007 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fb94c19d-031e-44b6-bdaa-39141d037b36-db-sync-config-data\") pod \"watcher-db-sync-jgb85\" (UID: \"fb94c19d-031e-44b6-bdaa-39141d037b36\") " pod="openstack/watcher-db-sync-jgb85" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.088069 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb94c19d-031e-44b6-bdaa-39141d037b36-config-data\") pod \"watcher-db-sync-jgb85\" (UID: \"fb94c19d-031e-44b6-bdaa-39141d037b36\") " pod="openstack/watcher-db-sync-jgb85" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.089261 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99c6cc9d-015a-4e33-8ded-c912cb52dde2-operator-scripts\") pod \"cinder-db-create-shtsx\" (UID: \"99c6cc9d-015a-4e33-8ded-c912cb52dde2\") " pod="openstack/cinder-db-create-shtsx" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.091185 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-daf3-account-create-update-zh48j" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.092441 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb94c19d-031e-44b6-bdaa-39141d037b36-config-data\") pod \"watcher-db-sync-jgb85\" (UID: \"fb94c19d-031e-44b6-bdaa-39141d037b36\") " pod="openstack/watcher-db-sync-jgb85" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.094900 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb94c19d-031e-44b6-bdaa-39141d037b36-combined-ca-bundle\") pod \"watcher-db-sync-jgb85\" (UID: \"fb94c19d-031e-44b6-bdaa-39141d037b36\") " pod="openstack/watcher-db-sync-jgb85" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.095490 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.112669 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tzvj\" (UniqueName: \"kubernetes.io/projected/fb94c19d-031e-44b6-bdaa-39141d037b36-kube-api-access-4tzvj\") pod \"watcher-db-sync-jgb85\" (UID: \"fb94c19d-031e-44b6-bdaa-39141d037b36\") " pod="openstack/watcher-db-sync-jgb85" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.116890 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fb94c19d-031e-44b6-bdaa-39141d037b36-db-sync-config-data\") pod \"watcher-db-sync-jgb85\" (UID: \"fb94c19d-031e-44b6-bdaa-39141d037b36\") " pod="openstack/watcher-db-sync-jgb85" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.119927 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24sq9\" (UniqueName: \"kubernetes.io/projected/99c6cc9d-015a-4e33-8ded-c912cb52dde2-kube-api-access-24sq9\") pod 
\"cinder-db-create-shtsx\" (UID: \"99c6cc9d-015a-4e33-8ded-c912cb52dde2\") " pod="openstack/cinder-db-create-shtsx" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.127588 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-r8f78"] Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.138145 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-daf3-account-create-update-zh48j"] Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.157946 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-jgb85" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.189155 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eda99471-604e-402e-b068-82d6c2269f2b-operator-scripts\") pod \"cinder-daf3-account-create-update-zh48j\" (UID: \"eda99471-604e-402e-b068-82d6c2269f2b\") " pod="openstack/cinder-daf3-account-create-update-zh48j" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.189208 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/542f6299-441b-4884-8ffb-2ea8b3c89e73-operator-scripts\") pod \"barbican-db-create-r8f78\" (UID: \"542f6299-441b-4884-8ffb-2ea8b3c89e73\") " pod="openstack/barbican-db-create-r8f78" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.189258 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnnlg\" (UniqueName: \"kubernetes.io/projected/eda99471-604e-402e-b068-82d6c2269f2b-kube-api-access-pnnlg\") pod \"cinder-daf3-account-create-update-zh48j\" (UID: \"eda99471-604e-402e-b068-82d6c2269f2b\") " pod="openstack/cinder-daf3-account-create-update-zh48j" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.189335 4886 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t56j9\" (UniqueName: \"kubernetes.io/projected/542f6299-441b-4884-8ffb-2ea8b3c89e73-kube-api-access-t56j9\") pod \"barbican-db-create-r8f78\" (UID: \"542f6299-441b-4884-8ffb-2ea8b3c89e73\") " pod="openstack/barbican-db-create-r8f78" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.204452 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.213918 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-shtsx" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.222404 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-rc2gb"] Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.225275 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-rc2gb" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.227862 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.229644 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2prkb" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.229794 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.230058 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.245349 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-rc2gb"] Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.291055 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eda99471-604e-402e-b068-82d6c2269f2b-operator-scripts\") pod \"cinder-daf3-account-create-update-zh48j\" (UID: \"eda99471-604e-402e-b068-82d6c2269f2b\") " pod="openstack/cinder-daf3-account-create-update-zh48j" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.291142 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/542f6299-441b-4884-8ffb-2ea8b3c89e73-operator-scripts\") pod \"barbican-db-create-r8f78\" (UID: \"542f6299-441b-4884-8ffb-2ea8b3c89e73\") " pod="openstack/barbican-db-create-r8f78" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.296897 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eda99471-604e-402e-b068-82d6c2269f2b-operator-scripts\") pod \"cinder-daf3-account-create-update-zh48j\" (UID: \"eda99471-604e-402e-b068-82d6c2269f2b\") " pod="openstack/cinder-daf3-account-create-update-zh48j" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.297534 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/542f6299-441b-4884-8ffb-2ea8b3c89e73-operator-scripts\") pod \"barbican-db-create-r8f78\" (UID: \"542f6299-441b-4884-8ffb-2ea8b3c89e73\") " pod="openstack/barbican-db-create-r8f78" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.300691 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnnlg\" (UniqueName: \"kubernetes.io/projected/eda99471-604e-402e-b068-82d6c2269f2b-kube-api-access-pnnlg\") pod \"cinder-daf3-account-create-update-zh48j\" (UID: \"eda99471-604e-402e-b068-82d6c2269f2b\") " pod="openstack/cinder-daf3-account-create-update-zh48j" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.300920 4886 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-t56j9\" (UniqueName: \"kubernetes.io/projected/542f6299-441b-4884-8ffb-2ea8b3c89e73-kube-api-access-t56j9\") pod \"barbican-db-create-r8f78\" (UID: \"542f6299-441b-4884-8ffb-2ea8b3c89e73\") " pod="openstack/barbican-db-create-r8f78" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.326869 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-27cd-account-create-update-hpqtv"] Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.328292 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-27cd-account-create-update-hpqtv" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.335663 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnnlg\" (UniqueName: \"kubernetes.io/projected/eda99471-604e-402e-b068-82d6c2269f2b-kube-api-access-pnnlg\") pod \"cinder-daf3-account-create-update-zh48j\" (UID: \"eda99471-604e-402e-b068-82d6c2269f2b\") " pod="openstack/cinder-daf3-account-create-update-zh48j" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.336030 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.345334 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-ddvrp"] Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.347234 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-ddvrp" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.382063 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t56j9\" (UniqueName: \"kubernetes.io/projected/542f6299-441b-4884-8ffb-2ea8b3c89e73-kube-api-access-t56j9\") pod \"barbican-db-create-r8f78\" (UID: \"542f6299-441b-4884-8ffb-2ea8b3c89e73\") " pod="openstack/barbican-db-create-r8f78" Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.382181 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-27cd-account-create-update-hpqtv"] Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.390443 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-ddvrp"] Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.402240 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-2a02-account-create-update-ltf9w"] Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.403933 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-2a02-account-create-update-ltf9w"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.405025 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9083ef8e-f321-4442-871b-c82f908bd073-config-data\") pod \"keystone-db-sync-rc2gb\" (UID: \"9083ef8e-f321-4442-871b-c82f908bd073\") " pod="openstack/keystone-db-sync-rc2gb"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.405109 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9083ef8e-f321-4442-871b-c82f908bd073-combined-ca-bundle\") pod \"keystone-db-sync-rc2gb\" (UID: \"9083ef8e-f321-4442-871b-c82f908bd073\") " pod="openstack/keystone-db-sync-rc2gb"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.405151 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7nql\" (UniqueName: \"kubernetes.io/projected/9083ef8e-f321-4442-871b-c82f908bd073-kube-api-access-x7nql\") pod \"keystone-db-sync-rc2gb\" (UID: \"9083ef8e-f321-4442-871b-c82f908bd073\") " pod="openstack/keystone-db-sync-rc2gb"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.408747 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.416466 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-2a02-account-create-update-ltf9w"]
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.492495 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-daf3-account-create-update-zh48j"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.509324 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9083ef8e-f321-4442-871b-c82f908bd073-config-data\") pod \"keystone-db-sync-rc2gb\" (UID: \"9083ef8e-f321-4442-871b-c82f908bd073\") " pod="openstack/keystone-db-sync-rc2gb"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.509446 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5-operator-scripts\") pod \"neutron-db-create-ddvrp\" (UID: \"3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5\") " pod="openstack/neutron-db-create-ddvrp"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.509519 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9083ef8e-f321-4442-871b-c82f908bd073-combined-ca-bundle\") pod \"keystone-db-sync-rc2gb\" (UID: \"9083ef8e-f321-4442-871b-c82f908bd073\") " pod="openstack/keystone-db-sync-rc2gb"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.509569 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7nql\" (UniqueName: \"kubernetes.io/projected/9083ef8e-f321-4442-871b-c82f908bd073-kube-api-access-x7nql\") pod \"keystone-db-sync-rc2gb\" (UID: \"9083ef8e-f321-4442-871b-c82f908bd073\") " pod="openstack/keystone-db-sync-rc2gb"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.509658 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34853a4a-c21e-4d80-ad6a-b2af27041d14-operator-scripts\") pod \"neutron-2a02-account-create-update-ltf9w\" (UID: \"34853a4a-c21e-4d80-ad6a-b2af27041d14\") " pod="openstack/neutron-2a02-account-create-update-ltf9w"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.509700 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mfjs\" (UniqueName: \"kubernetes.io/projected/34853a4a-c21e-4d80-ad6a-b2af27041d14-kube-api-access-9mfjs\") pod \"neutron-2a02-account-create-update-ltf9w\" (UID: \"34853a4a-c21e-4d80-ad6a-b2af27041d14\") " pod="openstack/neutron-2a02-account-create-update-ltf9w"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.509767 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4vwv\" (UniqueName: \"kubernetes.io/projected/f59672dc-f49b-4bd3-aa93-f0161ac73cfd-kube-api-access-d4vwv\") pod \"barbican-27cd-account-create-update-hpqtv\" (UID: \"f59672dc-f49b-4bd3-aa93-f0161ac73cfd\") " pod="openstack/barbican-27cd-account-create-update-hpqtv"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.509845 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f59672dc-f49b-4bd3-aa93-f0161ac73cfd-operator-scripts\") pod \"barbican-27cd-account-create-update-hpqtv\" (UID: \"f59672dc-f49b-4bd3-aa93-f0161ac73cfd\") " pod="openstack/barbican-27cd-account-create-update-hpqtv"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.509906 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phzvk\" (UniqueName: \"kubernetes.io/projected/3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5-kube-api-access-phzvk\") pod \"neutron-db-create-ddvrp\" (UID: \"3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5\") " pod="openstack/neutron-db-create-ddvrp"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.518801 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9083ef8e-f321-4442-871b-c82f908bd073-combined-ca-bundle\") pod \"keystone-db-sync-rc2gb\" (UID: \"9083ef8e-f321-4442-871b-c82f908bd073\") " pod="openstack/keystone-db-sync-rc2gb"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.523375 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9083ef8e-f321-4442-871b-c82f908bd073-config-data\") pod \"keystone-db-sync-rc2gb\" (UID: \"9083ef8e-f321-4442-871b-c82f908bd073\") " pod="openstack/keystone-db-sync-rc2gb"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.541918 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7nql\" (UniqueName: \"kubernetes.io/projected/9083ef8e-f321-4442-871b-c82f908bd073-kube-api-access-x7nql\") pod \"keystone-db-sync-rc2gb\" (UID: \"9083ef8e-f321-4442-871b-c82f908bd073\") " pod="openstack/keystone-db-sync-rc2gb"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.611860 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34853a4a-c21e-4d80-ad6a-b2af27041d14-operator-scripts\") pod \"neutron-2a02-account-create-update-ltf9w\" (UID: \"34853a4a-c21e-4d80-ad6a-b2af27041d14\") " pod="openstack/neutron-2a02-account-create-update-ltf9w"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.611958 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mfjs\" (UniqueName: \"kubernetes.io/projected/34853a4a-c21e-4d80-ad6a-b2af27041d14-kube-api-access-9mfjs\") pod \"neutron-2a02-account-create-update-ltf9w\" (UID: \"34853a4a-c21e-4d80-ad6a-b2af27041d14\") " pod="openstack/neutron-2a02-account-create-update-ltf9w"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.612023 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4vwv\" (UniqueName: \"kubernetes.io/projected/f59672dc-f49b-4bd3-aa93-f0161ac73cfd-kube-api-access-d4vwv\") pod \"barbican-27cd-account-create-update-hpqtv\" (UID: \"f59672dc-f49b-4bd3-aa93-f0161ac73cfd\") " pod="openstack/barbican-27cd-account-create-update-hpqtv"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.612087 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f59672dc-f49b-4bd3-aa93-f0161ac73cfd-operator-scripts\") pod \"barbican-27cd-account-create-update-hpqtv\" (UID: \"f59672dc-f49b-4bd3-aa93-f0161ac73cfd\") " pod="openstack/barbican-27cd-account-create-update-hpqtv"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.612157 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phzvk\" (UniqueName: \"kubernetes.io/projected/3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5-kube-api-access-phzvk\") pod \"neutron-db-create-ddvrp\" (UID: \"3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5\") " pod="openstack/neutron-db-create-ddvrp"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.612232 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5-operator-scripts\") pod \"neutron-db-create-ddvrp\" (UID: \"3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5\") " pod="openstack/neutron-db-create-ddvrp"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.612536 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34853a4a-c21e-4d80-ad6a-b2af27041d14-operator-scripts\") pod \"neutron-2a02-account-create-update-ltf9w\" (UID: \"34853a4a-c21e-4d80-ad6a-b2af27041d14\") " pod="openstack/neutron-2a02-account-create-update-ltf9w"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.613366 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f59672dc-f49b-4bd3-aa93-f0161ac73cfd-operator-scripts\") pod \"barbican-27cd-account-create-update-hpqtv\" (UID: \"f59672dc-f49b-4bd3-aa93-f0161ac73cfd\") " pod="openstack/barbican-27cd-account-create-update-hpqtv"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.614159 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5-operator-scripts\") pod \"neutron-db-create-ddvrp\" (UID: \"3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5\") " pod="openstack/neutron-db-create-ddvrp"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.632658 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phzvk\" (UniqueName: \"kubernetes.io/projected/3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5-kube-api-access-phzvk\") pod \"neutron-db-create-ddvrp\" (UID: \"3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5\") " pod="openstack/neutron-db-create-ddvrp"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.634647 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mfjs\" (UniqueName: \"kubernetes.io/projected/34853a4a-c21e-4d80-ad6a-b2af27041d14-kube-api-access-9mfjs\") pod \"neutron-2a02-account-create-update-ltf9w\" (UID: \"34853a4a-c21e-4d80-ad6a-b2af27041d14\") " pod="openstack/neutron-2a02-account-create-update-ltf9w"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.643486 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4vwv\" (UniqueName: \"kubernetes.io/projected/f59672dc-f49b-4bd3-aa93-f0161ac73cfd-kube-api-access-d4vwv\") pod \"barbican-27cd-account-create-update-hpqtv\" (UID: \"f59672dc-f49b-4bd3-aa93-f0161ac73cfd\") " pod="openstack/barbican-27cd-account-create-update-hpqtv"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.676779 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-r8f78"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.793881 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-27cd-account-create-update-hpqtv"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.840263 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-rc2gb"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.884393 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-ddvrp"
Mar 14 08:49:00 crc kubenswrapper[4886]: I0314 08:49:00.894755 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2a02-account-create-update-ltf9w"
Mar 14 08:49:01 crc kubenswrapper[4886]: I0314 08:49:01.587055 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Mar 14 08:49:01 crc kubenswrapper[4886]: I0314 08:49:01.590728 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Mar 14 08:49:01 crc kubenswrapper[4886]: I0314 08:49:01.701379 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Mar 14 08:49:03 crc kubenswrapper[4886]: I0314 08:49:03.537033 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-9k99l" podUID="dee9c638-1703-4b56-b366-13c6746d035c" containerName="ovn-controller" probeResult="failure" output=<
Mar 14 08:49:03 crc kubenswrapper[4886]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 14 08:49:03 crc kubenswrapper[4886]: >
Mar 14 08:49:04 crc kubenswrapper[4886]: I0314 08:49:04.118706 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 14 08:49:04 crc kubenswrapper[4886]: I0314 08:49:04.118929 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="28d6b363-8881-407e-b8e4-9fd7863b881c" containerName="prometheus" containerID="cri-o://c57bec255019de882019b2ad341a2a577a243d602db0e581ed662faf20a1d261" gracePeriod=600
Mar 14 08:49:04 crc kubenswrapper[4886]: I0314 08:49:04.119214 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="28d6b363-8881-407e-b8e4-9fd7863b881c" containerName="thanos-sidecar" containerID="cri-o://350bee66930c3c908370e7b9c5cb15c68693d80d1ebe7ee35949652765be4b21" gracePeriod=600
Mar 14 08:49:04 crc kubenswrapper[4886]: I0314 08:49:04.119352 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="28d6b363-8881-407e-b8e4-9fd7863b881c" containerName="config-reloader" containerID="cri-o://98bca5fb4b30d948c0f6bd27d2746cfb226c5c1dd57a1d7f14dae009417085bf" gracePeriod=600
Mar 14 08:49:04 crc kubenswrapper[4886]: I0314 08:49:04.304365 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6"
Mar 14 08:49:04 crc kubenswrapper[4886]: I0314 08:49:04.368607 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-qhcn4"]
Mar 14 08:49:04 crc kubenswrapper[4886]: I0314 08:49:04.370760 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-qhcn4" podUID="a1701819-f673-4757-b2f4-6a3dd4da8601" containerName="dnsmasq-dns" containerID="cri-o://c980931d3e428853308a510ad169da2482c95c527ce85917c4411fe99c4009d4" gracePeriod=10
Mar 14 08:49:04 crc kubenswrapper[4886]: I0314 08:49:04.730490 4886 generic.go:334] "Generic (PLEG): container finished" podID="a1701819-f673-4757-b2f4-6a3dd4da8601" containerID="c980931d3e428853308a510ad169da2482c95c527ce85917c4411fe99c4009d4" exitCode=0
Mar 14 08:49:04 crc kubenswrapper[4886]: I0314 08:49:04.730566 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-qhcn4" event={"ID":"a1701819-f673-4757-b2f4-6a3dd4da8601","Type":"ContainerDied","Data":"c980931d3e428853308a510ad169da2482c95c527ce85917c4411fe99c4009d4"}
Mar 14 08:49:04 crc kubenswrapper[4886]: I0314 08:49:04.733170 4886 generic.go:334] "Generic (PLEG): container finished" podID="28d6b363-8881-407e-b8e4-9fd7863b881c" containerID="350bee66930c3c908370e7b9c5cb15c68693d80d1ebe7ee35949652765be4b21" exitCode=0
Mar 14 08:49:04 crc kubenswrapper[4886]: I0314 08:49:04.733199 4886 generic.go:334] "Generic (PLEG): container finished" podID="28d6b363-8881-407e-b8e4-9fd7863b881c" containerID="98bca5fb4b30d948c0f6bd27d2746cfb226c5c1dd57a1d7f14dae009417085bf" exitCode=0
Mar 14 08:49:04 crc kubenswrapper[4886]: I0314 08:49:04.733210 4886 generic.go:334] "Generic (PLEG): container finished" podID="28d6b363-8881-407e-b8e4-9fd7863b881c" containerID="c57bec255019de882019b2ad341a2a577a243d602db0e581ed662faf20a1d261" exitCode=0
Mar 14 08:49:04 crc kubenswrapper[4886]: I0314 08:49:04.733227 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"28d6b363-8881-407e-b8e4-9fd7863b881c","Type":"ContainerDied","Data":"350bee66930c3c908370e7b9c5cb15c68693d80d1ebe7ee35949652765be4b21"}
Mar 14 08:49:04 crc kubenswrapper[4886]: I0314 08:49:04.733246 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"28d6b363-8881-407e-b8e4-9fd7863b881c","Type":"ContainerDied","Data":"98bca5fb4b30d948c0f6bd27d2746cfb226c5c1dd57a1d7f14dae009417085bf"}
Mar 14 08:49:04 crc kubenswrapper[4886]: I0314 08:49:04.733258 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"28d6b363-8881-407e-b8e4-9fd7863b881c","Type":"ContainerDied","Data":"c57bec255019de882019b2ad341a2a577a243d602db0e581ed662faf20a1d261"}
Mar 14 08:49:06 crc kubenswrapper[4886]: I0314 08:49:06.587655 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="28d6b363-8881-407e-b8e4-9fd7863b881c" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.114:9090/-/ready\": dial tcp 10.217.0.114:9090: connect: connection refused"
Mar 14 08:49:06 crc kubenswrapper[4886]: I0314 08:49:06.979107 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-qhcn4" podUID="a1701819-f673-4757-b2f4-6a3dd4da8601" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused"
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.284531 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.335647 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/28d6b363-8881-407e-b8e4-9fd7863b881c-prometheus-metric-storage-rulefiles-1\") pod \"28d6b363-8881-407e-b8e4-9fd7863b881c\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") "
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.335710 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/28d6b363-8881-407e-b8e4-9fd7863b881c-tls-assets\") pod \"28d6b363-8881-407e-b8e4-9fd7863b881c\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") "
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.335833 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/28d6b363-8881-407e-b8e4-9fd7863b881c-prometheus-metric-storage-rulefiles-0\") pod \"28d6b363-8881-407e-b8e4-9fd7863b881c\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") "
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.335875 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/28d6b363-8881-407e-b8e4-9fd7863b881c-web-config\") pod \"28d6b363-8881-407e-b8e4-9fd7863b881c\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") "
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.335909 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/28d6b363-8881-407e-b8e4-9fd7863b881c-thanos-prometheus-http-client-file\") pod \"28d6b363-8881-407e-b8e4-9fd7863b881c\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") "
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.335961 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqxxb\" (UniqueName: \"kubernetes.io/projected/28d6b363-8881-407e-b8e4-9fd7863b881c-kube-api-access-kqxxb\") pod \"28d6b363-8881-407e-b8e4-9fd7863b881c\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") "
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.335997 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/28d6b363-8881-407e-b8e4-9fd7863b881c-config-out\") pod \"28d6b363-8881-407e-b8e4-9fd7863b881c\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") "
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.336021 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/28d6b363-8881-407e-b8e4-9fd7863b881c-prometheus-metric-storage-rulefiles-2\") pod \"28d6b363-8881-407e-b8e4-9fd7863b881c\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") "
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.336056 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/28d6b363-8881-407e-b8e4-9fd7863b881c-config\") pod \"28d6b363-8881-407e-b8e4-9fd7863b881c\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") "
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.336206 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\") pod \"28d6b363-8881-407e-b8e4-9fd7863b881c\" (UID: \"28d6b363-8881-407e-b8e4-9fd7863b881c\") "
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.338518 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28d6b363-8881-407e-b8e4-9fd7863b881c-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "28d6b363-8881-407e-b8e4-9fd7863b881c" (UID: "28d6b363-8881-407e-b8e4-9fd7863b881c"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.339241 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28d6b363-8881-407e-b8e4-9fd7863b881c-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "28d6b363-8881-407e-b8e4-9fd7863b881c" (UID: "28d6b363-8881-407e-b8e4-9fd7863b881c"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.339412 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28d6b363-8881-407e-b8e4-9fd7863b881c-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "28d6b363-8881-407e-b8e4-9fd7863b881c" (UID: "28d6b363-8881-407e-b8e4-9fd7863b881c"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.354849 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28d6b363-8881-407e-b8e4-9fd7863b881c-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "28d6b363-8881-407e-b8e4-9fd7863b881c" (UID: "28d6b363-8881-407e-b8e4-9fd7863b881c"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.355370 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28d6b363-8881-407e-b8e4-9fd7863b881c-config-out" (OuterVolumeSpecName: "config-out") pod "28d6b363-8881-407e-b8e4-9fd7863b881c" (UID: "28d6b363-8881-407e-b8e4-9fd7863b881c"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.359479 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28d6b363-8881-407e-b8e4-9fd7863b881c-config" (OuterVolumeSpecName: "config") pod "28d6b363-8881-407e-b8e4-9fd7863b881c" (UID: "28d6b363-8881-407e-b8e4-9fd7863b881c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.359571 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28d6b363-8881-407e-b8e4-9fd7863b881c-kube-api-access-kqxxb" (OuterVolumeSpecName: "kube-api-access-kqxxb") pod "28d6b363-8881-407e-b8e4-9fd7863b881c" (UID: "28d6b363-8881-407e-b8e4-9fd7863b881c"). InnerVolumeSpecName "kube-api-access-kqxxb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.370402 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28d6b363-8881-407e-b8e4-9fd7863b881c-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "28d6b363-8881-407e-b8e4-9fd7863b881c" (UID: "28d6b363-8881-407e-b8e4-9fd7863b881c"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.393643 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "28d6b363-8881-407e-b8e4-9fd7863b881c" (UID: "28d6b363-8881-407e-b8e4-9fd7863b881c"). InnerVolumeSpecName "pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.395783 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-qhcn4"
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.399668 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28d6b363-8881-407e-b8e4-9fd7863b881c-web-config" (OuterVolumeSpecName: "web-config") pod "28d6b363-8881-407e-b8e4-9fd7863b881c" (UID: "28d6b363-8881-407e-b8e4-9fd7863b881c"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.438532 4886 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/28d6b363-8881-407e-b8e4-9fd7863b881c-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.438581 4886 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/28d6b363-8881-407e-b8e4-9fd7863b881c-web-config\") on node \"crc\" DevicePath \"\""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.438605 4886 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/28d6b363-8881-407e-b8e4-9fd7863b881c-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.438655 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqxxb\" (UniqueName: \"kubernetes.io/projected/28d6b363-8881-407e-b8e4-9fd7863b881c-kube-api-access-kqxxb\") on node \"crc\" DevicePath \"\""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.438672 4886 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/28d6b363-8881-407e-b8e4-9fd7863b881c-config-out\") on node \"crc\" DevicePath \"\""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.438685 4886 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/28d6b363-8881-407e-b8e4-9fd7863b881c-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.438697 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/28d6b363-8881-407e-b8e4-9fd7863b881c-config\") on node \"crc\" DevicePath \"\""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.438768 4886 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\") on node \"crc\" "
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.438786 4886 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/28d6b363-8881-407e-b8e4-9fd7863b881c-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.438831 4886 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/28d6b363-8881-407e-b8e4-9fd7863b881c-tls-assets\") on node \"crc\" DevicePath \"\""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.473465 4886 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.473796 4886 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069") on node "crc"
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.542423 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1701819-f673-4757-b2f4-6a3dd4da8601-ovsdbserver-nb\") pod \"a1701819-f673-4757-b2f4-6a3dd4da8601\" (UID: \"a1701819-f673-4757-b2f4-6a3dd4da8601\") "
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.542525 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-789k4\" (UniqueName: \"kubernetes.io/projected/a1701819-f673-4757-b2f4-6a3dd4da8601-kube-api-access-789k4\") pod \"a1701819-f673-4757-b2f4-6a3dd4da8601\" (UID: \"a1701819-f673-4757-b2f4-6a3dd4da8601\") "
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.542560 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1701819-f673-4757-b2f4-6a3dd4da8601-ovsdbserver-sb\") pod \"a1701819-f673-4757-b2f4-6a3dd4da8601\" (UID: \"a1701819-f673-4757-b2f4-6a3dd4da8601\") "
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.542630 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1701819-f673-4757-b2f4-6a3dd4da8601-config\") pod \"a1701819-f673-4757-b2f4-6a3dd4da8601\" (UID: \"a1701819-f673-4757-b2f4-6a3dd4da8601\") "
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.542680 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1701819-f673-4757-b2f4-6a3dd4da8601-dns-svc\") pod \"a1701819-f673-4757-b2f4-6a3dd4da8601\" (UID: \"a1701819-f673-4757-b2f4-6a3dd4da8601\") "
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.543233 4886 reconciler_common.go:293] "Volume detached for volume \"pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\") on node \"crc\" DevicePath \"\""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.561986 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1701819-f673-4757-b2f4-6a3dd4da8601-kube-api-access-789k4" (OuterVolumeSpecName: "kube-api-access-789k4") pod "a1701819-f673-4757-b2f4-6a3dd4da8601" (UID: "a1701819-f673-4757-b2f4-6a3dd4da8601"). InnerVolumeSpecName "kube-api-access-789k4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.600263 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1701819-f673-4757-b2f4-6a3dd4da8601-config" (OuterVolumeSpecName: "config") pod "a1701819-f673-4757-b2f4-6a3dd4da8601" (UID: "a1701819-f673-4757-b2f4-6a3dd4da8601"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.617313 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1701819-f673-4757-b2f4-6a3dd4da8601-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a1701819-f673-4757-b2f4-6a3dd4da8601" (UID: "a1701819-f673-4757-b2f4-6a3dd4da8601"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.631871 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1701819-f673-4757-b2f4-6a3dd4da8601-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a1701819-f673-4757-b2f4-6a3dd4da8601" (UID: "a1701819-f673-4757-b2f4-6a3dd4da8601"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.634578 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1701819-f673-4757-b2f4-6a3dd4da8601-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a1701819-f673-4757-b2f4-6a3dd4da8601" (UID: "a1701819-f673-4757-b2f4-6a3dd4da8601"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.649239 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1701819-f673-4757-b2f4-6a3dd4da8601-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.649273 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1701819-f673-4757-b2f4-6a3dd4da8601-config\") on node \"crc\" DevicePath \"\""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.649282 4886 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1701819-f673-4757-b2f4-6a3dd4da8601-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.649319 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1701819-f673-4757-b2f4-6a3dd4da8601-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.649331 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-789k4\" (UniqueName: \"kubernetes.io/projected/a1701819-f673-4757-b2f4-6a3dd4da8601-kube-api-access-789k4\") on node \"crc\" DevicePath \"\""
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.759673 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-qhcn4" event={"ID":"a1701819-f673-4757-b2f4-6a3dd4da8601","Type":"ContainerDied","Data":"d17ace1c14e706c11391930423edd6a53bc64690c25b53eaa651ae1b4c28d9c2"}
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.759733 4886 scope.go:117] "RemoveContainer" containerID="c980931d3e428853308a510ad169da2482c95c527ce85917c4411fe99c4009d4"
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.759748 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-qhcn4"
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.781671 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"28d6b363-8881-407e-b8e4-9fd7863b881c","Type":"ContainerDied","Data":"c927307ec66c3e694d4180edc82b0b3f71713f23eacb7292f45314299d13c6a8"}
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.781815 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.798730 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-qhcn4"]
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.801430 4886 scope.go:117] "RemoveContainer" containerID="557aacaff1b2b6a25e5b9b1cb78997e77ac13a2ce1f69a6ae6536716e5a8ddc6"
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.806831 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-qhcn4"]
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.827286 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.846375 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.879239 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 14 08:49:07 crc kubenswrapper[4886]: E0314 08:49:07.879687 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28d6b363-8881-407e-b8e4-9fd7863b881c" containerName="thanos-sidecar"
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.879700 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="28d6b363-8881-407e-b8e4-9fd7863b881c" containerName="thanos-sidecar"
Mar 14 08:49:07 crc kubenswrapper[4886]: E0314 08:49:07.879713 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1701819-f673-4757-b2f4-6a3dd4da8601" containerName="dnsmasq-dns"
Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.879719 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1701819-f673-4757-b2f4-6a3dd4da8601" containerName="dnsmasq-dns"
Mar 14 08:49:07 crc kubenswrapper[4886]: E0314 08:49:07.879733 4886 cpu_manager.go:410] "RemoveStaleState: removing container"
podUID="28d6b363-8881-407e-b8e4-9fd7863b881c" containerName="init-config-reloader" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.879740 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="28d6b363-8881-407e-b8e4-9fd7863b881c" containerName="init-config-reloader" Mar 14 08:49:07 crc kubenswrapper[4886]: E0314 08:49:07.879752 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1701819-f673-4757-b2f4-6a3dd4da8601" containerName="init" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.879758 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1701819-f673-4757-b2f4-6a3dd4da8601" containerName="init" Mar 14 08:49:07 crc kubenswrapper[4886]: E0314 08:49:07.879768 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28d6b363-8881-407e-b8e4-9fd7863b881c" containerName="config-reloader" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.879774 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="28d6b363-8881-407e-b8e4-9fd7863b881c" containerName="config-reloader" Mar 14 08:49:07 crc kubenswrapper[4886]: E0314 08:49:07.879784 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28d6b363-8881-407e-b8e4-9fd7863b881c" containerName="prometheus" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.879790 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="28d6b363-8881-407e-b8e4-9fd7863b881c" containerName="prometheus" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.879951 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="28d6b363-8881-407e-b8e4-9fd7863b881c" containerName="thanos-sidecar" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.879969 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="28d6b363-8881-407e-b8e4-9fd7863b881c" containerName="config-reloader" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.879976 4886 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a1701819-f673-4757-b2f4-6a3dd4da8601" containerName="dnsmasq-dns" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.879989 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="28d6b363-8881-407e-b8e4-9fd7863b881c" containerName="prometheus" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.881557 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.890290 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.891423 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.891870 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.896625 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.896868 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.897684 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.897866 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.898028 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.898303 4886 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.898484 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-nz9cv" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.930378 4886 scope.go:117] "RemoveContainer" containerID="350bee66930c3c908370e7b9c5cb15c68693d80d1ebe7ee35949652765be4b21" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.960375 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.960430 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.960450 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.960490 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.960507 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-config\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.960522 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6tkf\" (UniqueName: \"kubernetes.io/projected/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-kube-api-access-n6tkf\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.960540 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.960556 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 
14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.960577 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.960599 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.960621 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.960651 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.960669 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:07 crc kubenswrapper[4886]: I0314 08:49:07.976177 4886 scope.go:117] "RemoveContainer" containerID="98bca5fb4b30d948c0f6bd27d2746cfb226c5c1dd57a1d7f14dae009417085bf" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.005305 4886 scope.go:117] "RemoveContainer" containerID="c57bec255019de882019b2ad341a2a577a243d602db0e581ed662faf20a1d261" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.035661 4886 scope.go:117] "RemoveContainer" containerID="29c601a3c13dda1e64fd2afeda8a906573f80921a568fd811185212a87cc6028" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.063993 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.064059 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.064079 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: 
\"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.064120 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.064163 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-config\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.064182 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6tkf\" (UniqueName: \"kubernetes.io/projected/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-kube-api-access-n6tkf\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.064200 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.064218 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.064241 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.064261 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.064291 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.064325 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.064343 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.070743 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.074965 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.075513 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.075927 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:08 crc 
kubenswrapper[4886]: I0314 08:49:08.081095 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.082813 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-config\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.083005 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.083953 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.084557 4886 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.084617 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/915b9721c137ba3e3acc5e7d0fcf048ab5161bf9eea8563b49a63a650ee09ff7/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.086425 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.090942 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.091962 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.098623 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-n6tkf\" (UniqueName: \"kubernetes.io/projected/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-kube-api-access-n6tkf\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.186222 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\") pod \"prometheus-metric-storage-0\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.233611 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9k99l-config-5cl58"] Mar 14 08:49:08 crc kubenswrapper[4886]: W0314 08:49:08.250872 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod542f6299_441b_4884_8ffb_2ea8b3c89e73.slice/crio-e0ea0cac9e9c5bbbe071148e4af016f107cc528aa50da233fb73c25866692e15 WatchSource:0}: Error finding container e0ea0cac9e9c5bbbe071148e4af016f107cc528aa50da233fb73c25866692e15: Status 404 returned error can't find the container with id e0ea0cac9e9c5bbbe071148e4af016f107cc528aa50da233fb73c25866692e15 Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.254914 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-27cd-account-create-update-hpqtv"] Mar 14 08:49:08 crc kubenswrapper[4886]: W0314 08:49:08.260843 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb94c19d_031e_44b6_bdaa_39141d037b36.slice/crio-48700777e7fe3282706a49000785bf06acf488c5369aaf2b662b1e1364001082 WatchSource:0}: Error finding container 48700777e7fe3282706a49000785bf06acf488c5369aaf2b662b1e1364001082: Status 404 
returned error can't find the container with id 48700777e7fe3282706a49000785bf06acf488c5369aaf2b662b1e1364001082 Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.270430 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.272740 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-daf3-account-create-update-zh48j"] Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.290726 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-ddvrp"] Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.295008 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-shtsx"] Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.303353 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-2a02-account-create-update-ltf9w"] Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.315417 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-r8f78"] Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.331959 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-jgb85"] Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.341728 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-rc2gb"] Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.601941 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-9k99l" podUID="dee9c638-1703-4b56-b366-13c6746d035c" containerName="ovn-controller" probeResult="failure" output=< Mar 14 08:49:08 crc kubenswrapper[4886]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 14 08:49:08 crc kubenswrapper[4886]: > Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.824339 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-create-ddvrp" event={"ID":"3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5","Type":"ContainerStarted","Data":"b02ab9aa84b747b709acf832297faf86b3fe7352405d64c6d17b0e2993f27111"} Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.824394 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-ddvrp" event={"ID":"3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5","Type":"ContainerStarted","Data":"f269438784c49f21110e9b72fbcde833b39094fb85c7a4323dde178e9d8131bf"} Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.827829 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-r8f78" event={"ID":"542f6299-441b-4884-8ffb-2ea8b3c89e73","Type":"ContainerStarted","Data":"d32413d0105f04819935ec902c232e262b47516499fbec9509d64b3f38ea0ff8"} Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.827864 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-r8f78" event={"ID":"542f6299-441b-4884-8ffb-2ea8b3c89e73","Type":"ContainerStarted","Data":"e0ea0cac9e9c5bbbe071148e4af016f107cc528aa50da233fb73c25866692e15"} Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.840117 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-ddvrp" podStartSLOduration=8.840101751 podStartE2EDuration="8.840101751s" podCreationTimestamp="2026-03-14 08:49:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:49:08.838487706 +0000 UTC m=+1284.086939343" watchObservedRunningTime="2026-03-14 08:49:08.840101751 +0000 UTC m=+1284.088553388" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.841265 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-jgb85" event={"ID":"fb94c19d-031e-44b6-bdaa-39141d037b36","Type":"ContainerStarted","Data":"48700777e7fe3282706a49000785bf06acf488c5369aaf2b662b1e1364001082"} 
Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.861610 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2a02-account-create-update-ltf9w" event={"ID":"34853a4a-c21e-4d80-ad6a-b2af27041d14","Type":"ContainerStarted","Data":"dd729342a3a7ad6f9543afe850b665d6ec185d8c478f4e2acbf0738c9d477c7c"} Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.861660 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2a02-account-create-update-ltf9w" event={"ID":"34853a4a-c21e-4d80-ad6a-b2af27041d14","Type":"ContainerStarted","Data":"8302490805a6a59f0e0ef7fb603bafd94b348b7b90eecf29eb47d4096197ff78"} Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.863742 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9k99l-config-5cl58" event={"ID":"8c653576-755d-4704-a875-5782eab04795","Type":"ContainerStarted","Data":"267ed332e16b76f6fcadf318b9fcbebe02f47e86bbf44b87f0a79db7bf520744"} Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.865668 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-r8f78" podStartSLOduration=8.865646211 podStartE2EDuration="8.865646211s" podCreationTimestamp="2026-03-14 08:49:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:49:08.854927003 +0000 UTC m=+1284.103378640" watchObservedRunningTime="2026-03-14 08:49:08.865646211 +0000 UTC m=+1284.114097848" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.865817 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rc2gb" event={"ID":"9083ef8e-f321-4442-871b-c82f908bd073","Type":"ContainerStarted","Data":"a7d4d09edb457682720b0f611fbec19b50efbc7d37f946b3f3922d5cb0abbcec"} Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.884862 4886 generic.go:334] "Generic (PLEG): container finished" 
podID="99c6cc9d-015a-4e33-8ded-c912cb52dde2" containerID="3a8dbbf717b472cf7fbeafd77663058b0413e2d4a420e4b1b2049a6a67ccbd8d" exitCode=0 Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.885038 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-shtsx" event={"ID":"99c6cc9d-015a-4e33-8ded-c912cb52dde2","Type":"ContainerDied","Data":"3a8dbbf717b472cf7fbeafd77663058b0413e2d4a420e4b1b2049a6a67ccbd8d"} Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.885069 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-shtsx" event={"ID":"99c6cc9d-015a-4e33-8ded-c912cb52dde2","Type":"ContainerStarted","Data":"d1efb40c193bf2b8c8a520c110f9f477a2b58232723dde9601bcb21e6a9473d9"} Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.904187 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-2a02-account-create-update-ltf9w" podStartSLOduration=8.904169161 podStartE2EDuration="8.904169161s" podCreationTimestamp="2026-03-14 08:49:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:49:08.880066121 +0000 UTC m=+1284.128517778" watchObservedRunningTime="2026-03-14 08:49:08.904169161 +0000 UTC m=+1284.152620798" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.907947 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-daf3-account-create-update-zh48j" event={"ID":"eda99471-604e-402e-b068-82d6c2269f2b","Type":"ContainerStarted","Data":"ff4d8d568b7cf3b962d01a45193d39fa29d6735102596fcc22da158ec0ffdd03"} Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.908376 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-daf3-account-create-update-zh48j" event={"ID":"eda99471-604e-402e-b068-82d6c2269f2b","Type":"ContainerStarted","Data":"f1c037fcee08d772cee32d50901221aa6b80a9e4c2166a712ba2227d1a43bedb"} Mar 14 08:49:08 
crc kubenswrapper[4886]: I0314 08:49:08.921965 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9k99l-config-5cl58" podStartSLOduration=10.921946075 podStartE2EDuration="10.921946075s" podCreationTimestamp="2026-03-14 08:48:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:49:08.903928404 +0000 UTC m=+1284.152380041" watchObservedRunningTime="2026-03-14 08:49:08.921946075 +0000 UTC m=+1284.170397712" Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.936932 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-w889t" event={"ID":"2be4cce3-ae51-4d07-a9d9-ccc6152774b5","Type":"ContainerStarted","Data":"5eb01f83f18094ce0660e54bf4941eb3644f84e36a881c2d3515923b80734bef"} Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.954944 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-27cd-account-create-update-hpqtv" event={"ID":"f59672dc-f49b-4bd3-aa93-f0161ac73cfd","Type":"ContainerStarted","Data":"5559e862a0e46514ce1527a1f0373e06468fa73279022e23a727ac1ba07c5de3"} Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.954986 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-27cd-account-create-update-hpqtv" event={"ID":"f59672dc-f49b-4bd3-aa93-f0161ac73cfd","Type":"ContainerStarted","Data":"04062710dabb844b73416be8062a979c36869ec7ddc66ac55210773367e55218"} Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.972825 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 14 08:49:08 crc kubenswrapper[4886]: I0314 08:49:08.982283 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-w889t" podStartSLOduration=3.034659218 podStartE2EDuration="15.982260551s" podCreationTimestamp="2026-03-14 08:48:53 +0000 UTC" 
firstStartedPulling="2026-03-14 08:48:54.213784237 +0000 UTC m=+1269.462235874" lastFinishedPulling="2026-03-14 08:49:07.16138557 +0000 UTC m=+1282.409837207" observedRunningTime="2026-03-14 08:49:08.969089505 +0000 UTC m=+1284.217541142" watchObservedRunningTime="2026-03-14 08:49:08.982260551 +0000 UTC m=+1284.230712188" Mar 14 08:49:09 crc kubenswrapper[4886]: W0314 08:49:09.024159 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf4e51e9_c5ec_41ee_83f5_3b031c20c877.slice/crio-ae2fe54ce8b01bf34b65779aef018ce89a7d8c3f33690c1e8a216d329447c737 WatchSource:0}: Error finding container ae2fe54ce8b01bf34b65779aef018ce89a7d8c3f33690c1e8a216d329447c737: Status 404 returned error can't find the container with id ae2fe54ce8b01bf34b65779aef018ce89a7d8c3f33690c1e8a216d329447c737 Mar 14 08:49:09 crc kubenswrapper[4886]: I0314 08:49:09.434907 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28d6b363-8881-407e-b8e4-9fd7863b881c" path="/var/lib/kubelet/pods/28d6b363-8881-407e-b8e4-9fd7863b881c/volumes" Mar 14 08:49:09 crc kubenswrapper[4886]: I0314 08:49:09.436172 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1701819-f673-4757-b2f4-6a3dd4da8601" path="/var/lib/kubelet/pods/a1701819-f673-4757-b2f4-6a3dd4da8601/volumes" Mar 14 08:49:09 crc kubenswrapper[4886]: I0314 08:49:09.971418 4886 generic.go:334] "Generic (PLEG): container finished" podID="8c653576-755d-4704-a875-5782eab04795" containerID="d7ff59454da068f6047f5940d664fe653fd467819905375506175e119ee424df" exitCode=0 Mar 14 08:49:09 crc kubenswrapper[4886]: I0314 08:49:09.971794 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9k99l-config-5cl58" event={"ID":"8c653576-755d-4704-a875-5782eab04795","Type":"ContainerDied","Data":"d7ff59454da068f6047f5940d664fe653fd467819905375506175e119ee424df"} Mar 14 08:49:09 crc kubenswrapper[4886]: I0314 08:49:09.975239 
4886 generic.go:334] "Generic (PLEG): container finished" podID="3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5" containerID="b02ab9aa84b747b709acf832297faf86b3fe7352405d64c6d17b0e2993f27111" exitCode=0 Mar 14 08:49:09 crc kubenswrapper[4886]: I0314 08:49:09.975307 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-ddvrp" event={"ID":"3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5","Type":"ContainerDied","Data":"b02ab9aa84b747b709acf832297faf86b3fe7352405d64c6d17b0e2993f27111"} Mar 14 08:49:09 crc kubenswrapper[4886]: I0314 08:49:09.979621 4886 generic.go:334] "Generic (PLEG): container finished" podID="542f6299-441b-4884-8ffb-2ea8b3c89e73" containerID="d32413d0105f04819935ec902c232e262b47516499fbec9509d64b3f38ea0ff8" exitCode=0 Mar 14 08:49:09 crc kubenswrapper[4886]: I0314 08:49:09.979768 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-r8f78" event={"ID":"542f6299-441b-4884-8ffb-2ea8b3c89e73","Type":"ContainerDied","Data":"d32413d0105f04819935ec902c232e262b47516499fbec9509d64b3f38ea0ff8"} Mar 14 08:49:09 crc kubenswrapper[4886]: I0314 08:49:09.981728 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf4e51e9-c5ec-41ee-83f5-3b031c20c877","Type":"ContainerStarted","Data":"ae2fe54ce8b01bf34b65779aef018ce89a7d8c3f33690c1e8a216d329447c737"} Mar 14 08:49:09 crc kubenswrapper[4886]: I0314 08:49:09.991524 4886 generic.go:334] "Generic (PLEG): container finished" podID="eda99471-604e-402e-b068-82d6c2269f2b" containerID="ff4d8d568b7cf3b962d01a45193d39fa29d6735102596fcc22da158ec0ffdd03" exitCode=0 Mar 14 08:49:09 crc kubenswrapper[4886]: I0314 08:49:09.991609 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-daf3-account-create-update-zh48j" event={"ID":"eda99471-604e-402e-b068-82d6c2269f2b","Type":"ContainerDied","Data":"ff4d8d568b7cf3b962d01a45193d39fa29d6735102596fcc22da158ec0ffdd03"} Mar 14 08:49:09 crc kubenswrapper[4886]: 
I0314 08:49:09.995521 4886 generic.go:334] "Generic (PLEG): container finished" podID="f59672dc-f49b-4bd3-aa93-f0161ac73cfd" containerID="5559e862a0e46514ce1527a1f0373e06468fa73279022e23a727ac1ba07c5de3" exitCode=0 Mar 14 08:49:09 crc kubenswrapper[4886]: I0314 08:49:09.995603 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-27cd-account-create-update-hpqtv" event={"ID":"f59672dc-f49b-4bd3-aa93-f0161ac73cfd","Type":"ContainerDied","Data":"5559e862a0e46514ce1527a1f0373e06468fa73279022e23a727ac1ba07c5de3"} Mar 14 08:49:09 crc kubenswrapper[4886]: I0314 08:49:09.998632 4886 generic.go:334] "Generic (PLEG): container finished" podID="34853a4a-c21e-4d80-ad6a-b2af27041d14" containerID="dd729342a3a7ad6f9543afe850b665d6ec185d8c478f4e2acbf0738c9d477c7c" exitCode=0 Mar 14 08:49:09 crc kubenswrapper[4886]: I0314 08:49:09.998692 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2a02-account-create-update-ltf9w" event={"ID":"34853a4a-c21e-4d80-ad6a-b2af27041d14","Type":"ContainerDied","Data":"dd729342a3a7ad6f9543afe850b665d6ec185d8c478f4e2acbf0738c9d477c7c"} Mar 14 08:49:10 crc kubenswrapper[4886]: E0314 08:49:10.208333 4886 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc" Mar 14 08:49:10 crc kubenswrapper[4886]: I0314 08:49:10.375342 4886 scope.go:117] "RemoveContainer" containerID="214f7f9e287f12c439f337966b7749b18b9e2881205eb7dfcab7487ad391bde6" Mar 14 08:49:13 crc kubenswrapper[4886]: I0314 08:49:13.037939 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf4e51e9-c5ec-41ee-83f5-3b031c20c877","Type":"ContainerStarted","Data":"ca53866a18c29a479218672cd44e44157f683bc6a1b3b954cf62e976e738cfd3"} Mar 14 08:49:13 crc kubenswrapper[4886]: I0314 08:49:13.521509 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ovn-controller-9k99l" Mar 14 08:49:18 crc kubenswrapper[4886]: I0314 08:49:18.096102 4886 generic.go:334] "Generic (PLEG): container finished" podID="2be4cce3-ae51-4d07-a9d9-ccc6152774b5" containerID="5eb01f83f18094ce0660e54bf4941eb3644f84e36a881c2d3515923b80734bef" exitCode=0 Mar 14 08:49:18 crc kubenswrapper[4886]: I0314 08:49:18.096183 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-w889t" event={"ID":"2be4cce3-ae51-4d07-a9d9-ccc6152774b5","Type":"ContainerDied","Data":"5eb01f83f18094ce0660e54bf4941eb3644f84e36a881c2d3515923b80734bef"} Mar 14 08:49:19 crc kubenswrapper[4886]: I0314 08:49:19.853176 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-27cd-account-create-update-hpqtv" Mar 14 08:49:19 crc kubenswrapper[4886]: I0314 08:49:19.861099 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-daf3-account-create-update-zh48j" Mar 14 08:49:19 crc kubenswrapper[4886]: I0314 08:49:19.873466 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-shtsx" Mar 14 08:49:19 crc kubenswrapper[4886]: I0314 08:49:19.890822 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2a02-account-create-update-ltf9w" Mar 14 08:49:19 crc kubenswrapper[4886]: I0314 08:49:19.904073 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-ddvrp" Mar 14 08:49:19 crc kubenswrapper[4886]: I0314 08:49:19.905469 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9k99l-config-5cl58" Mar 14 08:49:19 crc kubenswrapper[4886]: I0314 08:49:19.917762 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-r8f78" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.016877 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhcpx\" (UniqueName: \"kubernetes.io/projected/8c653576-755d-4704-a875-5782eab04795-kube-api-access-bhcpx\") pod \"8c653576-755d-4704-a875-5782eab04795\" (UID: \"8c653576-755d-4704-a875-5782eab04795\") " Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.016923 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phzvk\" (UniqueName: \"kubernetes.io/projected/3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5-kube-api-access-phzvk\") pod \"3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5\" (UID: \"3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5\") " Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.016949 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34853a4a-c21e-4d80-ad6a-b2af27041d14-operator-scripts\") pod \"34853a4a-c21e-4d80-ad6a-b2af27041d14\" (UID: \"34853a4a-c21e-4d80-ad6a-b2af27041d14\") " Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.016968 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mfjs\" (UniqueName: \"kubernetes.io/projected/34853a4a-c21e-4d80-ad6a-b2af27041d14-kube-api-access-9mfjs\") pod \"34853a4a-c21e-4d80-ad6a-b2af27041d14\" (UID: \"34853a4a-c21e-4d80-ad6a-b2af27041d14\") " Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.016991 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c653576-755d-4704-a875-5782eab04795-var-run-ovn\") pod \"8c653576-755d-4704-a875-5782eab04795\" (UID: \"8c653576-755d-4704-a875-5782eab04795\") " Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.017014 4886 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8c653576-755d-4704-a875-5782eab04795-additional-scripts\") pod \"8c653576-755d-4704-a875-5782eab04795\" (UID: \"8c653576-755d-4704-a875-5782eab04795\") " Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.017043 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99c6cc9d-015a-4e33-8ded-c912cb52dde2-operator-scripts\") pod \"99c6cc9d-015a-4e33-8ded-c912cb52dde2\" (UID: \"99c6cc9d-015a-4e33-8ded-c912cb52dde2\") " Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.017076 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c653576-755d-4704-a875-5782eab04795-var-run\") pod \"8c653576-755d-4704-a875-5782eab04795\" (UID: \"8c653576-755d-4704-a875-5782eab04795\") " Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.017140 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24sq9\" (UniqueName: \"kubernetes.io/projected/99c6cc9d-015a-4e33-8ded-c912cb52dde2-kube-api-access-24sq9\") pod \"99c6cc9d-015a-4e33-8ded-c912cb52dde2\" (UID: \"99c6cc9d-015a-4e33-8ded-c912cb52dde2\") " Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.017163 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eda99471-604e-402e-b068-82d6c2269f2b-operator-scripts\") pod \"eda99471-604e-402e-b068-82d6c2269f2b\" (UID: \"eda99471-604e-402e-b068-82d6c2269f2b\") " Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.017194 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8c653576-755d-4704-a875-5782eab04795-var-log-ovn\") pod \"8c653576-755d-4704-a875-5782eab04795\" (UID: 
\"8c653576-755d-4704-a875-5782eab04795\") " Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.017210 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnnlg\" (UniqueName: \"kubernetes.io/projected/eda99471-604e-402e-b068-82d6c2269f2b-kube-api-access-pnnlg\") pod \"eda99471-604e-402e-b068-82d6c2269f2b\" (UID: \"eda99471-604e-402e-b068-82d6c2269f2b\") " Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.017240 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c653576-755d-4704-a875-5782eab04795-scripts\") pod \"8c653576-755d-4704-a875-5782eab04795\" (UID: \"8c653576-755d-4704-a875-5782eab04795\") " Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.017293 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5-operator-scripts\") pod \"3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5\" (UID: \"3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5\") " Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.017319 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4vwv\" (UniqueName: \"kubernetes.io/projected/f59672dc-f49b-4bd3-aa93-f0161ac73cfd-kube-api-access-d4vwv\") pod \"f59672dc-f49b-4bd3-aa93-f0161ac73cfd\" (UID: \"f59672dc-f49b-4bd3-aa93-f0161ac73cfd\") " Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.017335 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f59672dc-f49b-4bd3-aa93-f0161ac73cfd-operator-scripts\") pod \"f59672dc-f49b-4bd3-aa93-f0161ac73cfd\" (UID: \"f59672dc-f49b-4bd3-aa93-f0161ac73cfd\") " Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.017942 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/8c653576-755d-4704-a875-5782eab04795-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "8c653576-755d-4704-a875-5782eab04795" (UID: "8c653576-755d-4704-a875-5782eab04795"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.018014 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8c653576-755d-4704-a875-5782eab04795-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "8c653576-755d-4704-a875-5782eab04795" (UID: "8c653576-755d-4704-a875-5782eab04795"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.018264 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f59672dc-f49b-4bd3-aa93-f0161ac73cfd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f59672dc-f49b-4bd3-aa93-f0161ac73cfd" (UID: "f59672dc-f49b-4bd3-aa93-f0161ac73cfd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.018340 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5" (UID: "3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.018393 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34853a4a-c21e-4d80-ad6a-b2af27041d14-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "34853a4a-c21e-4d80-ad6a-b2af27041d14" (UID: "34853a4a-c21e-4d80-ad6a-b2af27041d14"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.018970 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eda99471-604e-402e-b068-82d6c2269f2b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eda99471-604e-402e-b068-82d6c2269f2b" (UID: "eda99471-604e-402e-b068-82d6c2269f2b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.019246 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8c653576-755d-4704-a875-5782eab04795-var-run" (OuterVolumeSpecName: "var-run") pod "8c653576-755d-4704-a875-5782eab04795" (UID: "8c653576-755d-4704-a875-5782eab04795"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.019483 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99c6cc9d-015a-4e33-8ded-c912cb52dde2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "99c6cc9d-015a-4e33-8ded-c912cb52dde2" (UID: "99c6cc9d-015a-4e33-8ded-c912cb52dde2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.019614 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c653576-755d-4704-a875-5782eab04795-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "8c653576-755d-4704-a875-5782eab04795" (UID: "8c653576-755d-4704-a875-5782eab04795"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.019777 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c653576-755d-4704-a875-5782eab04795-scripts" (OuterVolumeSpecName: "scripts") pod "8c653576-755d-4704-a875-5782eab04795" (UID: "8c653576-755d-4704-a875-5782eab04795"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.023273 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eda99471-604e-402e-b068-82d6c2269f2b-kube-api-access-pnnlg" (OuterVolumeSpecName: "kube-api-access-pnnlg") pod "eda99471-604e-402e-b068-82d6c2269f2b" (UID: "eda99471-604e-402e-b068-82d6c2269f2b"). InnerVolumeSpecName "kube-api-access-pnnlg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.023557 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c653576-755d-4704-a875-5782eab04795-kube-api-access-bhcpx" (OuterVolumeSpecName: "kube-api-access-bhcpx") pod "8c653576-755d-4704-a875-5782eab04795" (UID: "8c653576-755d-4704-a875-5782eab04795"). InnerVolumeSpecName "kube-api-access-bhcpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.023708 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f59672dc-f49b-4bd3-aa93-f0161ac73cfd-kube-api-access-d4vwv" (OuterVolumeSpecName: "kube-api-access-d4vwv") pod "f59672dc-f49b-4bd3-aa93-f0161ac73cfd" (UID: "f59672dc-f49b-4bd3-aa93-f0161ac73cfd"). InnerVolumeSpecName "kube-api-access-d4vwv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.023934 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5-kube-api-access-phzvk" (OuterVolumeSpecName: "kube-api-access-phzvk") pod "3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5" (UID: "3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5"). InnerVolumeSpecName "kube-api-access-phzvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.024048 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99c6cc9d-015a-4e33-8ded-c912cb52dde2-kube-api-access-24sq9" (OuterVolumeSpecName: "kube-api-access-24sq9") pod "99c6cc9d-015a-4e33-8ded-c912cb52dde2" (UID: "99c6cc9d-015a-4e33-8ded-c912cb52dde2"). InnerVolumeSpecName "kube-api-access-24sq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.024239 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34853a4a-c21e-4d80-ad6a-b2af27041d14-kube-api-access-9mfjs" (OuterVolumeSpecName: "kube-api-access-9mfjs") pod "34853a4a-c21e-4d80-ad6a-b2af27041d14" (UID: "34853a4a-c21e-4d80-ad6a-b2af27041d14"). InnerVolumeSpecName "kube-api-access-9mfjs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.117022 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-ddvrp" event={"ID":"3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5","Type":"ContainerDied","Data":"f269438784c49f21110e9b72fbcde833b39094fb85c7a4323dde178e9d8131bf"} Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.117062 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f269438784c49f21110e9b72fbcde833b39094fb85c7a4323dde178e9d8131bf" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.117062 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-ddvrp" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.118531 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/542f6299-441b-4884-8ffb-2ea8b3c89e73-operator-scripts\") pod \"542f6299-441b-4884-8ffb-2ea8b3c89e73\" (UID: \"542f6299-441b-4884-8ffb-2ea8b3c89e73\") " Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.118686 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t56j9\" (UniqueName: \"kubernetes.io/projected/542f6299-441b-4884-8ffb-2ea8b3c89e73-kube-api-access-t56j9\") pod \"542f6299-441b-4884-8ffb-2ea8b3c89e73\" (UID: \"542f6299-441b-4884-8ffb-2ea8b3c89e73\") " Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.119115 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhcpx\" (UniqueName: \"kubernetes.io/projected/8c653576-755d-4704-a875-5782eab04795-kube-api-access-bhcpx\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.119139 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phzvk\" (UniqueName: 
\"kubernetes.io/projected/3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5-kube-api-access-phzvk\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.119150 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34853a4a-c21e-4d80-ad6a-b2af27041d14-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.119159 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mfjs\" (UniqueName: \"kubernetes.io/projected/34853a4a-c21e-4d80-ad6a-b2af27041d14-kube-api-access-9mfjs\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.119169 4886 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c653576-755d-4704-a875-5782eab04795-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.119177 4886 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8c653576-755d-4704-a875-5782eab04795-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.119186 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99c6cc9d-015a-4e33-8ded-c912cb52dde2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.119194 4886 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c653576-755d-4704-a875-5782eab04795-var-run\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.119204 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24sq9\" (UniqueName: \"kubernetes.io/projected/99c6cc9d-015a-4e33-8ded-c912cb52dde2-kube-api-access-24sq9\") on node 
\"crc\" DevicePath \"\"" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.119212 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eda99471-604e-402e-b068-82d6c2269f2b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.119220 4886 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8c653576-755d-4704-a875-5782eab04795-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.119230 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnnlg\" (UniqueName: \"kubernetes.io/projected/eda99471-604e-402e-b068-82d6c2269f2b-kube-api-access-pnnlg\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.119238 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c653576-755d-4704-a875-5782eab04795-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.119246 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.119255 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4vwv\" (UniqueName: \"kubernetes.io/projected/f59672dc-f49b-4bd3-aa93-f0161ac73cfd-kube-api-access-d4vwv\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.119263 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f59672dc-f49b-4bd3-aa93-f0161ac73cfd-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.119326 4886 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/542f6299-441b-4884-8ffb-2ea8b3c89e73-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "542f6299-441b-4884-8ffb-2ea8b3c89e73" (UID: "542f6299-441b-4884-8ffb-2ea8b3c89e73"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.120952 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-shtsx" event={"ID":"99c6cc9d-015a-4e33-8ded-c912cb52dde2","Type":"ContainerDied","Data":"d1efb40c193bf2b8c8a520c110f9f477a2b58232723dde9601bcb21e6a9473d9"} Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.120986 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1efb40c193bf2b8c8a520c110f9f477a2b58232723dde9601bcb21e6a9473d9" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.120987 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-shtsx" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.122348 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-r8f78" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.122358 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-r8f78" event={"ID":"542f6299-441b-4884-8ffb-2ea8b3c89e73","Type":"ContainerDied","Data":"e0ea0cac9e9c5bbbe071148e4af016f107cc528aa50da233fb73c25866692e15"} Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.122393 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0ea0cac9e9c5bbbe071148e4af016f107cc528aa50da233fb73c25866692e15" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.125143 4886 generic.go:334] "Generic (PLEG): container finished" podID="cf4e51e9-c5ec-41ee-83f5-3b031c20c877" containerID="ca53866a18c29a479218672cd44e44157f683bc6a1b3b954cf62e976e738cfd3" exitCode=0 Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.125193 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf4e51e9-c5ec-41ee-83f5-3b031c20c877","Type":"ContainerDied","Data":"ca53866a18c29a479218672cd44e44157f683bc6a1b3b954cf62e976e738cfd3"} Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.128429 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-daf3-account-create-update-zh48j" event={"ID":"eda99471-604e-402e-b068-82d6c2269f2b","Type":"ContainerDied","Data":"f1c037fcee08d772cee32d50901221aa6b80a9e4c2166a712ba2227d1a43bedb"} Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.128452 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1c037fcee08d772cee32d50901221aa6b80a9e4c2166a712ba2227d1a43bedb" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.128506 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-daf3-account-create-update-zh48j" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.130774 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/542f6299-441b-4884-8ffb-2ea8b3c89e73-kube-api-access-t56j9" (OuterVolumeSpecName: "kube-api-access-t56j9") pod "542f6299-441b-4884-8ffb-2ea8b3c89e73" (UID: "542f6299-441b-4884-8ffb-2ea8b3c89e73"). InnerVolumeSpecName "kube-api-access-t56j9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.134687 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-27cd-account-create-update-hpqtv" event={"ID":"f59672dc-f49b-4bd3-aa93-f0161ac73cfd","Type":"ContainerDied","Data":"04062710dabb844b73416be8062a979c36869ec7ddc66ac55210773367e55218"} Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.134725 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04062710dabb844b73416be8062a979c36869ec7ddc66ac55210773367e55218" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.134798 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-27cd-account-create-update-hpqtv" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.139054 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-2a02-account-create-update-ltf9w" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.141167 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2a02-account-create-update-ltf9w" event={"ID":"34853a4a-c21e-4d80-ad6a-b2af27041d14","Type":"ContainerDied","Data":"8302490805a6a59f0e0ef7fb603bafd94b348b7b90eecf29eb47d4096197ff78"} Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.141192 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8302490805a6a59f0e0ef7fb603bafd94b348b7b90eecf29eb47d4096197ff78" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.142100 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9k99l-config-5cl58" event={"ID":"8c653576-755d-4704-a875-5782eab04795","Type":"ContainerDied","Data":"267ed332e16b76f6fcadf318b9fcbebe02f47e86bbf44b87f0a79db7bf520744"} Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.142117 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="267ed332e16b76f6fcadf318b9fcbebe02f47e86bbf44b87f0a79db7bf520744" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.142169 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9k99l-config-5cl58" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.220709 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/542f6299-441b-4884-8ffb-2ea8b3c89e73-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:20 crc kubenswrapper[4886]: I0314 08:49:20.220741 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t56j9\" (UniqueName: \"kubernetes.io/projected/542f6299-441b-4884-8ffb-2ea8b3c89e73-kube-api-access-t56j9\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.076260 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9k99l-config-5cl58"] Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.086567 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9k99l-config-5cl58"] Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.125849 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-w889t" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.181256 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-w889t" event={"ID":"2be4cce3-ae51-4d07-a9d9-ccc6152774b5","Type":"ContainerDied","Data":"84857b6315310ce837133c33b1f54e9ea50abed181d62ac2452c687e835ceb4e"} Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.181304 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84857b6315310ce837133c33b1f54e9ea50abed181d62ac2452c687e835ceb4e" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.181375 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-w889t" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.221079 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9k99l-config-2kt44"] Mar 14 08:49:21 crc kubenswrapper[4886]: E0314 08:49:21.221431 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="542f6299-441b-4884-8ffb-2ea8b3c89e73" containerName="mariadb-database-create" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.221448 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="542f6299-441b-4884-8ffb-2ea8b3c89e73" containerName="mariadb-database-create" Mar 14 08:49:21 crc kubenswrapper[4886]: E0314 08:49:21.221456 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f59672dc-f49b-4bd3-aa93-f0161ac73cfd" containerName="mariadb-account-create-update" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.221462 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f59672dc-f49b-4bd3-aa93-f0161ac73cfd" containerName="mariadb-account-create-update" Mar 14 08:49:21 crc kubenswrapper[4886]: E0314 08:49:21.221473 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99c6cc9d-015a-4e33-8ded-c912cb52dde2" containerName="mariadb-database-create" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.221479 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c6cc9d-015a-4e33-8ded-c912cb52dde2" containerName="mariadb-database-create" Mar 14 08:49:21 crc kubenswrapper[4886]: E0314 08:49:21.221492 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5" containerName="mariadb-database-create" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.221498 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5" containerName="mariadb-database-create" Mar 14 08:49:21 crc kubenswrapper[4886]: E0314 08:49:21.221507 4886 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="34853a4a-c21e-4d80-ad6a-b2af27041d14" containerName="mariadb-account-create-update" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.221513 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="34853a4a-c21e-4d80-ad6a-b2af27041d14" containerName="mariadb-account-create-update" Mar 14 08:49:21 crc kubenswrapper[4886]: E0314 08:49:21.221526 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2be4cce3-ae51-4d07-a9d9-ccc6152774b5" containerName="glance-db-sync" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.221531 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="2be4cce3-ae51-4d07-a9d9-ccc6152774b5" containerName="glance-db-sync" Mar 14 08:49:21 crc kubenswrapper[4886]: E0314 08:49:21.221544 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c653576-755d-4704-a875-5782eab04795" containerName="ovn-config" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.221550 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c653576-755d-4704-a875-5782eab04795" containerName="ovn-config" Mar 14 08:49:21 crc kubenswrapper[4886]: E0314 08:49:21.221557 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eda99471-604e-402e-b068-82d6c2269f2b" containerName="mariadb-account-create-update" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.221563 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="eda99471-604e-402e-b068-82d6c2269f2b" containerName="mariadb-account-create-update" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.221719 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="2be4cce3-ae51-4d07-a9d9-ccc6152774b5" containerName="glance-db-sync" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.221730 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="99c6cc9d-015a-4e33-8ded-c912cb52dde2" containerName="mariadb-database-create" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 
08:49:21.221741 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c653576-755d-4704-a875-5782eab04795" containerName="ovn-config" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.221754 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5" containerName="mariadb-database-create" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.221762 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f59672dc-f49b-4bd3-aa93-f0161ac73cfd" containerName="mariadb-account-create-update" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.221770 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="eda99471-604e-402e-b068-82d6c2269f2b" containerName="mariadb-account-create-update" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.221777 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="542f6299-441b-4884-8ffb-2ea8b3c89e73" containerName="mariadb-database-create" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.221788 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="34853a4a-c21e-4d80-ad6a-b2af27041d14" containerName="mariadb-account-create-update" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.222361 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9k99l-config-2kt44" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.226085 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.239735 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9k99l-config-2kt44"] Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.252223 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qd56\" (UniqueName: \"kubernetes.io/projected/2be4cce3-ae51-4d07-a9d9-ccc6152774b5-kube-api-access-8qd56\") pod \"2be4cce3-ae51-4d07-a9d9-ccc6152774b5\" (UID: \"2be4cce3-ae51-4d07-a9d9-ccc6152774b5\") " Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.252263 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2be4cce3-ae51-4d07-a9d9-ccc6152774b5-db-sync-config-data\") pod \"2be4cce3-ae51-4d07-a9d9-ccc6152774b5\" (UID: \"2be4cce3-ae51-4d07-a9d9-ccc6152774b5\") " Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.252317 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2be4cce3-ae51-4d07-a9d9-ccc6152774b5-config-data\") pod \"2be4cce3-ae51-4d07-a9d9-ccc6152774b5\" (UID: \"2be4cce3-ae51-4d07-a9d9-ccc6152774b5\") " Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.252444 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2be4cce3-ae51-4d07-a9d9-ccc6152774b5-combined-ca-bundle\") pod \"2be4cce3-ae51-4d07-a9d9-ccc6152774b5\" (UID: \"2be4cce3-ae51-4d07-a9d9-ccc6152774b5\") " Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.252592 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/755aaf87-8a64-479b-8a25-c16584e2a327-additional-scripts\") pod \"ovn-controller-9k99l-config-2kt44\" (UID: \"755aaf87-8a64-479b-8a25-c16584e2a327\") " pod="openstack/ovn-controller-9k99l-config-2kt44" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.252630 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/755aaf87-8a64-479b-8a25-c16584e2a327-var-log-ovn\") pod \"ovn-controller-9k99l-config-2kt44\" (UID: \"755aaf87-8a64-479b-8a25-c16584e2a327\") " pod="openstack/ovn-controller-9k99l-config-2kt44" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.252702 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/755aaf87-8a64-479b-8a25-c16584e2a327-var-run\") pod \"ovn-controller-9k99l-config-2kt44\" (UID: \"755aaf87-8a64-479b-8a25-c16584e2a327\") " pod="openstack/ovn-controller-9k99l-config-2kt44" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.252753 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wcbw\" (UniqueName: \"kubernetes.io/projected/755aaf87-8a64-479b-8a25-c16584e2a327-kube-api-access-5wcbw\") pod \"ovn-controller-9k99l-config-2kt44\" (UID: \"755aaf87-8a64-479b-8a25-c16584e2a327\") " pod="openstack/ovn-controller-9k99l-config-2kt44" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.252821 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/755aaf87-8a64-479b-8a25-c16584e2a327-var-run-ovn\") pod \"ovn-controller-9k99l-config-2kt44\" (UID: \"755aaf87-8a64-479b-8a25-c16584e2a327\") " pod="openstack/ovn-controller-9k99l-config-2kt44" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 
08:49:21.252850 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/755aaf87-8a64-479b-8a25-c16584e2a327-scripts\") pod \"ovn-controller-9k99l-config-2kt44\" (UID: \"755aaf87-8a64-479b-8a25-c16584e2a327\") " pod="openstack/ovn-controller-9k99l-config-2kt44" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.259423 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2be4cce3-ae51-4d07-a9d9-ccc6152774b5-kube-api-access-8qd56" (OuterVolumeSpecName: "kube-api-access-8qd56") pod "2be4cce3-ae51-4d07-a9d9-ccc6152774b5" (UID: "2be4cce3-ae51-4d07-a9d9-ccc6152774b5"). InnerVolumeSpecName "kube-api-access-8qd56". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.259632 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2be4cce3-ae51-4d07-a9d9-ccc6152774b5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2be4cce3-ae51-4d07-a9d9-ccc6152774b5" (UID: "2be4cce3-ae51-4d07-a9d9-ccc6152774b5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.291316 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2be4cce3-ae51-4d07-a9d9-ccc6152774b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2be4cce3-ae51-4d07-a9d9-ccc6152774b5" (UID: "2be4cce3-ae51-4d07-a9d9-ccc6152774b5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.353722 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/755aaf87-8a64-479b-8a25-c16584e2a327-var-run\") pod \"ovn-controller-9k99l-config-2kt44\" (UID: \"755aaf87-8a64-479b-8a25-c16584e2a327\") " pod="openstack/ovn-controller-9k99l-config-2kt44" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.353785 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wcbw\" (UniqueName: \"kubernetes.io/projected/755aaf87-8a64-479b-8a25-c16584e2a327-kube-api-access-5wcbw\") pod \"ovn-controller-9k99l-config-2kt44\" (UID: \"755aaf87-8a64-479b-8a25-c16584e2a327\") " pod="openstack/ovn-controller-9k99l-config-2kt44" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.353837 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/755aaf87-8a64-479b-8a25-c16584e2a327-var-run-ovn\") pod \"ovn-controller-9k99l-config-2kt44\" (UID: \"755aaf87-8a64-479b-8a25-c16584e2a327\") " pod="openstack/ovn-controller-9k99l-config-2kt44" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.353861 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/755aaf87-8a64-479b-8a25-c16584e2a327-scripts\") pod \"ovn-controller-9k99l-config-2kt44\" (UID: \"755aaf87-8a64-479b-8a25-c16584e2a327\") " pod="openstack/ovn-controller-9k99l-config-2kt44" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.353891 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/755aaf87-8a64-479b-8a25-c16584e2a327-additional-scripts\") pod \"ovn-controller-9k99l-config-2kt44\" (UID: \"755aaf87-8a64-479b-8a25-c16584e2a327\") " 
pod="openstack/ovn-controller-9k99l-config-2kt44" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.353916 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/755aaf87-8a64-479b-8a25-c16584e2a327-var-log-ovn\") pod \"ovn-controller-9k99l-config-2kt44\" (UID: \"755aaf87-8a64-479b-8a25-c16584e2a327\") " pod="openstack/ovn-controller-9k99l-config-2kt44" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.353976 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2be4cce3-ae51-4d07-a9d9-ccc6152774b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.353987 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qd56\" (UniqueName: \"kubernetes.io/projected/2be4cce3-ae51-4d07-a9d9-ccc6152774b5-kube-api-access-8qd56\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.353997 4886 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2be4cce3-ae51-4d07-a9d9-ccc6152774b5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.354261 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/755aaf87-8a64-479b-8a25-c16584e2a327-var-log-ovn\") pod \"ovn-controller-9k99l-config-2kt44\" (UID: \"755aaf87-8a64-479b-8a25-c16584e2a327\") " pod="openstack/ovn-controller-9k99l-config-2kt44" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.354328 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/755aaf87-8a64-479b-8a25-c16584e2a327-var-run-ovn\") pod \"ovn-controller-9k99l-config-2kt44\" (UID: \"755aaf87-8a64-479b-8a25-c16584e2a327\") " 
pod="openstack/ovn-controller-9k99l-config-2kt44" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.354424 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/755aaf87-8a64-479b-8a25-c16584e2a327-var-run\") pod \"ovn-controller-9k99l-config-2kt44\" (UID: \"755aaf87-8a64-479b-8a25-c16584e2a327\") " pod="openstack/ovn-controller-9k99l-config-2kt44" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.355189 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/755aaf87-8a64-479b-8a25-c16584e2a327-additional-scripts\") pod \"ovn-controller-9k99l-config-2kt44\" (UID: \"755aaf87-8a64-479b-8a25-c16584e2a327\") " pod="openstack/ovn-controller-9k99l-config-2kt44" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.355976 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/755aaf87-8a64-479b-8a25-c16584e2a327-scripts\") pod \"ovn-controller-9k99l-config-2kt44\" (UID: \"755aaf87-8a64-479b-8a25-c16584e2a327\") " pod="openstack/ovn-controller-9k99l-config-2kt44" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.377660 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wcbw\" (UniqueName: \"kubernetes.io/projected/755aaf87-8a64-479b-8a25-c16584e2a327-kube-api-access-5wcbw\") pod \"ovn-controller-9k99l-config-2kt44\" (UID: \"755aaf87-8a64-479b-8a25-c16584e2a327\") " pod="openstack/ovn-controller-9k99l-config-2kt44" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.379742 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2be4cce3-ae51-4d07-a9d9-ccc6152774b5-config-data" (OuterVolumeSpecName: "config-data") pod "2be4cce3-ae51-4d07-a9d9-ccc6152774b5" (UID: "2be4cce3-ae51-4d07-a9d9-ccc6152774b5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.435204 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c653576-755d-4704-a875-5782eab04795" path="/var/lib/kubelet/pods/8c653576-755d-4704-a875-5782eab04795/volumes" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.455603 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2be4cce3-ae51-4d07-a9d9-ccc6152774b5-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:21 crc kubenswrapper[4886]: I0314 08:49:21.540295 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9k99l-config-2kt44" Mar 14 08:49:22 crc kubenswrapper[4886]: I0314 08:49:22.020246 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9k99l-config-2kt44"] Mar 14 08:49:22 crc kubenswrapper[4886]: I0314 08:49:22.193461 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-jgb85" event={"ID":"fb94c19d-031e-44b6-bdaa-39141d037b36","Type":"ContainerStarted","Data":"1e47481845c53f199577f7274a38447001a233e5b7cc38957657c9fb9acfa05a"} Mar 14 08:49:22 crc kubenswrapper[4886]: I0314 08:49:22.198313 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rc2gb" event={"ID":"9083ef8e-f321-4442-871b-c82f908bd073","Type":"ContainerStarted","Data":"3ba7363b5c1dda9754d6898345afb2bb662e9ecad34ee2f614d53bddb412141e"} Mar 14 08:49:22 crc kubenswrapper[4886]: I0314 08:49:22.199625 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9k99l-config-2kt44" event={"ID":"755aaf87-8a64-479b-8a25-c16584e2a327","Type":"ContainerStarted","Data":"61f237df9bc0ccbb244d59ab222269d74712d05b229eee1ffe73608d86e744ad"} Mar 14 08:49:22 crc kubenswrapper[4886]: I0314 08:49:22.202662 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"cf4e51e9-c5ec-41ee-83f5-3b031c20c877","Type":"ContainerStarted","Data":"38573258377a509e0a387421c64b2d3afe8291ce8182cc5132e20d24ba301499"} Mar 14 08:49:22 crc kubenswrapper[4886]: I0314 08:49:22.225261 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-jgb85" podStartSLOduration=10.657437179 podStartE2EDuration="23.22524502s" podCreationTimestamp="2026-03-14 08:48:59 +0000 UTC" firstStartedPulling="2026-03-14 08:49:08.275944027 +0000 UTC m=+1283.524395664" lastFinishedPulling="2026-03-14 08:49:20.843751868 +0000 UTC m=+1296.092203505" observedRunningTime="2026-03-14 08:49:22.210153151 +0000 UTC m=+1297.458604818" watchObservedRunningTime="2026-03-14 08:49:22.22524502 +0000 UTC m=+1297.473696657" Mar 14 08:49:22 crc kubenswrapper[4886]: I0314 08:49:22.236692 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-rc2gb" podStartSLOduration=9.720004797 podStartE2EDuration="22.236662477s" podCreationTimestamp="2026-03-14 08:49:00 +0000 UTC" firstStartedPulling="2026-03-14 08:49:08.276336387 +0000 UTC m=+1283.524788024" lastFinishedPulling="2026-03-14 08:49:20.792994067 +0000 UTC m=+1296.041445704" observedRunningTime="2026-03-14 08:49:22.225444866 +0000 UTC m=+1297.473896493" watchObservedRunningTime="2026-03-14 08:49:22.236662477 +0000 UTC m=+1297.485114114" Mar 14 08:49:22 crc kubenswrapper[4886]: I0314 08:49:22.461834 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-qnvnf"] Mar 14 08:49:22 crc kubenswrapper[4886]: I0314 08:49:22.464496 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" Mar 14 08:49:22 crc kubenswrapper[4886]: I0314 08:49:22.487939 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-qnvnf"] Mar 14 08:49:22 crc kubenswrapper[4886]: I0314 08:49:22.578582 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-qnvnf\" (UID: \"84cb3b87-9994-4458-9e79-397787759551\") " pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" Mar 14 08:49:22 crc kubenswrapper[4886]: I0314 08:49:22.578622 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-qnvnf\" (UID: \"84cb3b87-9994-4458-9e79-397787759551\") " pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" Mar 14 08:49:22 crc kubenswrapper[4886]: I0314 08:49:22.578641 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-dns-svc\") pod \"dnsmasq-dns-895cf5cf-qnvnf\" (UID: \"84cb3b87-9994-4458-9e79-397787759551\") " pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" Mar 14 08:49:22 crc kubenswrapper[4886]: I0314 08:49:22.578883 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcq6h\" (UniqueName: \"kubernetes.io/projected/84cb3b87-9994-4458-9e79-397787759551-kube-api-access-bcq6h\") pod \"dnsmasq-dns-895cf5cf-qnvnf\" (UID: \"84cb3b87-9994-4458-9e79-397787759551\") " pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" Mar 14 08:49:22 crc kubenswrapper[4886]: I0314 08:49:22.579069 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-config\") pod \"dnsmasq-dns-895cf5cf-qnvnf\" (UID: \"84cb3b87-9994-4458-9e79-397787759551\") " pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" Mar 14 08:49:22 crc kubenswrapper[4886]: I0314 08:49:22.579096 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-qnvnf\" (UID: \"84cb3b87-9994-4458-9e79-397787759551\") " pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" Mar 14 08:49:22 crc kubenswrapper[4886]: I0314 08:49:22.680993 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-qnvnf\" (UID: \"84cb3b87-9994-4458-9e79-397787759551\") " pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" Mar 14 08:49:22 crc kubenswrapper[4886]: I0314 08:49:22.681268 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-qnvnf\" (UID: \"84cb3b87-9994-4458-9e79-397787759551\") " pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" Mar 14 08:49:22 crc kubenswrapper[4886]: I0314 08:49:22.681292 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-dns-svc\") pod \"dnsmasq-dns-895cf5cf-qnvnf\" (UID: \"84cb3b87-9994-4458-9e79-397787759551\") " pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" Mar 14 08:49:22 crc kubenswrapper[4886]: I0314 08:49:22.681354 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcq6h\" (UniqueName: 
\"kubernetes.io/projected/84cb3b87-9994-4458-9e79-397787759551-kube-api-access-bcq6h\") pod \"dnsmasq-dns-895cf5cf-qnvnf\" (UID: \"84cb3b87-9994-4458-9e79-397787759551\") " pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" Mar 14 08:49:22 crc kubenswrapper[4886]: I0314 08:49:22.681394 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-config\") pod \"dnsmasq-dns-895cf5cf-qnvnf\" (UID: \"84cb3b87-9994-4458-9e79-397787759551\") " pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" Mar 14 08:49:22 crc kubenswrapper[4886]: I0314 08:49:22.681410 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-qnvnf\" (UID: \"84cb3b87-9994-4458-9e79-397787759551\") " pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" Mar 14 08:49:22 crc kubenswrapper[4886]: I0314 08:49:22.682090 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-qnvnf\" (UID: \"84cb3b87-9994-4458-9e79-397787759551\") " pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" Mar 14 08:49:22 crc kubenswrapper[4886]: I0314 08:49:22.682745 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-qnvnf\" (UID: \"84cb3b87-9994-4458-9e79-397787759551\") " pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" Mar 14 08:49:22 crc kubenswrapper[4886]: I0314 08:49:22.682765 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-dns-swift-storage-0\") pod 
\"dnsmasq-dns-895cf5cf-qnvnf\" (UID: \"84cb3b87-9994-4458-9e79-397787759551\") " pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" Mar 14 08:49:22 crc kubenswrapper[4886]: I0314 08:49:22.682957 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-dns-svc\") pod \"dnsmasq-dns-895cf5cf-qnvnf\" (UID: \"84cb3b87-9994-4458-9e79-397787759551\") " pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" Mar 14 08:49:22 crc kubenswrapper[4886]: I0314 08:49:22.683355 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-config\") pod \"dnsmasq-dns-895cf5cf-qnvnf\" (UID: \"84cb3b87-9994-4458-9e79-397787759551\") " pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" Mar 14 08:49:22 crc kubenswrapper[4886]: I0314 08:49:22.706921 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcq6h\" (UniqueName: \"kubernetes.io/projected/84cb3b87-9994-4458-9e79-397787759551-kube-api-access-bcq6h\") pod \"dnsmasq-dns-895cf5cf-qnvnf\" (UID: \"84cb3b87-9994-4458-9e79-397787759551\") " pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" Mar 14 08:49:22 crc kubenswrapper[4886]: I0314 08:49:22.785912 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" Mar 14 08:49:23 crc kubenswrapper[4886]: I0314 08:49:23.212178 4886 generic.go:334] "Generic (PLEG): container finished" podID="755aaf87-8a64-479b-8a25-c16584e2a327" containerID="00926642a717f74835da4263cf7f52ff53875af39363c75fb092c888a5b6727b" exitCode=0 Mar 14 08:49:23 crc kubenswrapper[4886]: I0314 08:49:23.212311 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9k99l-config-2kt44" event={"ID":"755aaf87-8a64-479b-8a25-c16584e2a327","Type":"ContainerDied","Data":"00926642a717f74835da4263cf7f52ff53875af39363c75fb092c888a5b6727b"} Mar 14 08:49:23 crc kubenswrapper[4886]: I0314 08:49:23.286607 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-qnvnf"] Mar 14 08:49:24 crc kubenswrapper[4886]: I0314 08:49:24.230342 4886 generic.go:334] "Generic (PLEG): container finished" podID="84cb3b87-9994-4458-9e79-397787759551" containerID="315bd4d9831d8fa31bcea59b65819ab3ba2a95c5901ba078ae0562995915f572" exitCode=0 Mar 14 08:49:24 crc kubenswrapper[4886]: I0314 08:49:24.230405 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" event={"ID":"84cb3b87-9994-4458-9e79-397787759551","Type":"ContainerDied","Data":"315bd4d9831d8fa31bcea59b65819ab3ba2a95c5901ba078ae0562995915f572"} Mar 14 08:49:24 crc kubenswrapper[4886]: I0314 08:49:24.230863 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" event={"ID":"84cb3b87-9994-4458-9e79-397787759551","Type":"ContainerStarted","Data":"a15966708334ca2fa5a3ea6aac2cb1b0eeca906bad7378fc41815c8845598a3e"} Mar 14 08:49:24 crc kubenswrapper[4886]: I0314 08:49:24.606498 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9k99l-config-2kt44" Mar 14 08:49:24 crc kubenswrapper[4886]: I0314 08:49:24.714792 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/755aaf87-8a64-479b-8a25-c16584e2a327-var-run-ovn\") pod \"755aaf87-8a64-479b-8a25-c16584e2a327\" (UID: \"755aaf87-8a64-479b-8a25-c16584e2a327\") " Mar 14 08:49:24 crc kubenswrapper[4886]: I0314 08:49:24.714887 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wcbw\" (UniqueName: \"kubernetes.io/projected/755aaf87-8a64-479b-8a25-c16584e2a327-kube-api-access-5wcbw\") pod \"755aaf87-8a64-479b-8a25-c16584e2a327\" (UID: \"755aaf87-8a64-479b-8a25-c16584e2a327\") " Mar 14 08:49:24 crc kubenswrapper[4886]: I0314 08:49:24.714944 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/755aaf87-8a64-479b-8a25-c16584e2a327-var-log-ovn\") pod \"755aaf87-8a64-479b-8a25-c16584e2a327\" (UID: \"755aaf87-8a64-479b-8a25-c16584e2a327\") " Mar 14 08:49:24 crc kubenswrapper[4886]: I0314 08:49:24.714931 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/755aaf87-8a64-479b-8a25-c16584e2a327-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "755aaf87-8a64-479b-8a25-c16584e2a327" (UID: "755aaf87-8a64-479b-8a25-c16584e2a327"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:49:24 crc kubenswrapper[4886]: I0314 08:49:24.715077 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/755aaf87-8a64-479b-8a25-c16584e2a327-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "755aaf87-8a64-479b-8a25-c16584e2a327" (UID: "755aaf87-8a64-479b-8a25-c16584e2a327"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:49:24 crc kubenswrapper[4886]: I0314 08:49:24.715101 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/755aaf87-8a64-479b-8a25-c16584e2a327-scripts\") pod \"755aaf87-8a64-479b-8a25-c16584e2a327\" (UID: \"755aaf87-8a64-479b-8a25-c16584e2a327\") " Mar 14 08:49:24 crc kubenswrapper[4886]: I0314 08:49:24.715191 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/755aaf87-8a64-479b-8a25-c16584e2a327-var-run\") pod \"755aaf87-8a64-479b-8a25-c16584e2a327\" (UID: \"755aaf87-8a64-479b-8a25-c16584e2a327\") " Mar 14 08:49:24 crc kubenswrapper[4886]: I0314 08:49:24.715232 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/755aaf87-8a64-479b-8a25-c16584e2a327-var-run" (OuterVolumeSpecName: "var-run") pod "755aaf87-8a64-479b-8a25-c16584e2a327" (UID: "755aaf87-8a64-479b-8a25-c16584e2a327"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:49:24 crc kubenswrapper[4886]: I0314 08:49:24.715250 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/755aaf87-8a64-479b-8a25-c16584e2a327-additional-scripts\") pod \"755aaf87-8a64-479b-8a25-c16584e2a327\" (UID: \"755aaf87-8a64-479b-8a25-c16584e2a327\") " Mar 14 08:49:24 crc kubenswrapper[4886]: I0314 08:49:24.715722 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/755aaf87-8a64-479b-8a25-c16584e2a327-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "755aaf87-8a64-479b-8a25-c16584e2a327" (UID: "755aaf87-8a64-479b-8a25-c16584e2a327"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:49:24 crc kubenswrapper[4886]: I0314 08:49:24.716073 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/755aaf87-8a64-479b-8a25-c16584e2a327-scripts" (OuterVolumeSpecName: "scripts") pod "755aaf87-8a64-479b-8a25-c16584e2a327" (UID: "755aaf87-8a64-479b-8a25-c16584e2a327"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:49:24 crc kubenswrapper[4886]: I0314 08:49:24.716090 4886 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/755aaf87-8a64-479b-8a25-c16584e2a327-var-run\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:24 crc kubenswrapper[4886]: I0314 08:49:24.716106 4886 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/755aaf87-8a64-479b-8a25-c16584e2a327-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:24 crc kubenswrapper[4886]: I0314 08:49:24.716128 4886 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/755aaf87-8a64-479b-8a25-c16584e2a327-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:24 crc kubenswrapper[4886]: I0314 08:49:24.716137 4886 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/755aaf87-8a64-479b-8a25-c16584e2a327-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:24 crc kubenswrapper[4886]: I0314 08:49:24.720904 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/755aaf87-8a64-479b-8a25-c16584e2a327-kube-api-access-5wcbw" (OuterVolumeSpecName: "kube-api-access-5wcbw") pod "755aaf87-8a64-479b-8a25-c16584e2a327" (UID: "755aaf87-8a64-479b-8a25-c16584e2a327"). InnerVolumeSpecName "kube-api-access-5wcbw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:49:24 crc kubenswrapper[4886]: I0314 08:49:24.818015 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wcbw\" (UniqueName: \"kubernetes.io/projected/755aaf87-8a64-479b-8a25-c16584e2a327-kube-api-access-5wcbw\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:24 crc kubenswrapper[4886]: I0314 08:49:24.818051 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/755aaf87-8a64-479b-8a25-c16584e2a327-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:25 crc kubenswrapper[4886]: I0314 08:49:25.240513 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9k99l-config-2kt44" event={"ID":"755aaf87-8a64-479b-8a25-c16584e2a327","Type":"ContainerDied","Data":"61f237df9bc0ccbb244d59ab222269d74712d05b229eee1ffe73608d86e744ad"} Mar 14 08:49:25 crc kubenswrapper[4886]: I0314 08:49:25.240587 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61f237df9bc0ccbb244d59ab222269d74712d05b229eee1ffe73608d86e744ad" Mar 14 08:49:25 crc kubenswrapper[4886]: I0314 08:49:25.240541 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9k99l-config-2kt44" Mar 14 08:49:25 crc kubenswrapper[4886]: I0314 08:49:25.243794 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf4e51e9-c5ec-41ee-83f5-3b031c20c877","Type":"ContainerStarted","Data":"bb678afc7bfbc641254b64f4aa04b60a3365d4a86fac8399cf5e5da745578fdb"} Mar 14 08:49:25 crc kubenswrapper[4886]: I0314 08:49:25.243841 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf4e51e9-c5ec-41ee-83f5-3b031c20c877","Type":"ContainerStarted","Data":"7395d1173216948ed510ba033503e5c5f55b3f699fdca45e95a301c0f6e67729"} Mar 14 08:49:25 crc kubenswrapper[4886]: I0314 08:49:25.245424 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" event={"ID":"84cb3b87-9994-4458-9e79-397787759551","Type":"ContainerStarted","Data":"6b7e2e3e7f424ce723a8646fb1fffb273858b0aa03254af8a008f06f8f86dc35"} Mar 14 08:49:25 crc kubenswrapper[4886]: I0314 08:49:25.245544 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" Mar 14 08:49:25 crc kubenswrapper[4886]: I0314 08:49:25.246515 4886 generic.go:334] "Generic (PLEG): container finished" podID="fb94c19d-031e-44b6-bdaa-39141d037b36" containerID="1e47481845c53f199577f7274a38447001a233e5b7cc38957657c9fb9acfa05a" exitCode=0 Mar 14 08:49:25 crc kubenswrapper[4886]: I0314 08:49:25.246576 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-jgb85" event={"ID":"fb94c19d-031e-44b6-bdaa-39141d037b36","Type":"ContainerDied","Data":"1e47481845c53f199577f7274a38447001a233e5b7cc38957657c9fb9acfa05a"} Mar 14 08:49:25 crc kubenswrapper[4886]: I0314 08:49:25.247631 4886 generic.go:334] "Generic (PLEG): container finished" podID="9083ef8e-f321-4442-871b-c82f908bd073" containerID="3ba7363b5c1dda9754d6898345afb2bb662e9ecad34ee2f614d53bddb412141e" 
exitCode=0 Mar 14 08:49:25 crc kubenswrapper[4886]: I0314 08:49:25.247656 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rc2gb" event={"ID":"9083ef8e-f321-4442-871b-c82f908bd073","Type":"ContainerDied","Data":"3ba7363b5c1dda9754d6898345afb2bb662e9ecad34ee2f614d53bddb412141e"} Mar 14 08:49:25 crc kubenswrapper[4886]: I0314 08:49:25.277869 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=18.277853133 podStartE2EDuration="18.277853133s" podCreationTimestamp="2026-03-14 08:49:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:49:25.272059182 +0000 UTC m=+1300.520510839" watchObservedRunningTime="2026-03-14 08:49:25.277853133 +0000 UTC m=+1300.526304770" Mar 14 08:49:25 crc kubenswrapper[4886]: I0314 08:49:25.300751 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" podStartSLOduration=3.300733288 podStartE2EDuration="3.300733288s" podCreationTimestamp="2026-03-14 08:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:49:25.291106341 +0000 UTC m=+1300.539557998" watchObservedRunningTime="2026-03-14 08:49:25.300733288 +0000 UTC m=+1300.549184925" Mar 14 08:49:25 crc kubenswrapper[4886]: I0314 08:49:25.701195 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9k99l-config-2kt44"] Mar 14 08:49:25 crc kubenswrapper[4886]: I0314 08:49:25.710031 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9k99l-config-2kt44"] Mar 14 08:49:26 crc kubenswrapper[4886]: I0314 08:49:26.065629 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:49:26 crc kubenswrapper[4886]: I0314 08:49:26.065940 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:49:26 crc kubenswrapper[4886]: I0314 08:49:26.641477 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-jgb85" Mar 14 08:49:26 crc kubenswrapper[4886]: I0314 08:49:26.751938 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb94c19d-031e-44b6-bdaa-39141d037b36-config-data\") pod \"fb94c19d-031e-44b6-bdaa-39141d037b36\" (UID: \"fb94c19d-031e-44b6-bdaa-39141d037b36\") " Mar 14 08:49:26 crc kubenswrapper[4886]: I0314 08:49:26.751998 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb94c19d-031e-44b6-bdaa-39141d037b36-combined-ca-bundle\") pod \"fb94c19d-031e-44b6-bdaa-39141d037b36\" (UID: \"fb94c19d-031e-44b6-bdaa-39141d037b36\") " Mar 14 08:49:26 crc kubenswrapper[4886]: I0314 08:49:26.752081 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fb94c19d-031e-44b6-bdaa-39141d037b36-db-sync-config-data\") pod \"fb94c19d-031e-44b6-bdaa-39141d037b36\" (UID: \"fb94c19d-031e-44b6-bdaa-39141d037b36\") " Mar 14 08:49:26 crc kubenswrapper[4886]: I0314 08:49:26.752168 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-4tzvj\" (UniqueName: \"kubernetes.io/projected/fb94c19d-031e-44b6-bdaa-39141d037b36-kube-api-access-4tzvj\") pod \"fb94c19d-031e-44b6-bdaa-39141d037b36\" (UID: \"fb94c19d-031e-44b6-bdaa-39141d037b36\") " Mar 14 08:49:26 crc kubenswrapper[4886]: I0314 08:49:26.758911 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb94c19d-031e-44b6-bdaa-39141d037b36-kube-api-access-4tzvj" (OuterVolumeSpecName: "kube-api-access-4tzvj") pod "fb94c19d-031e-44b6-bdaa-39141d037b36" (UID: "fb94c19d-031e-44b6-bdaa-39141d037b36"). InnerVolumeSpecName "kube-api-access-4tzvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:49:26 crc kubenswrapper[4886]: I0314 08:49:26.759469 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb94c19d-031e-44b6-bdaa-39141d037b36-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "fb94c19d-031e-44b6-bdaa-39141d037b36" (UID: "fb94c19d-031e-44b6-bdaa-39141d037b36"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:49:26 crc kubenswrapper[4886]: I0314 08:49:26.784456 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb94c19d-031e-44b6-bdaa-39141d037b36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb94c19d-031e-44b6-bdaa-39141d037b36" (UID: "fb94c19d-031e-44b6-bdaa-39141d037b36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:49:26 crc kubenswrapper[4886]: I0314 08:49:26.801256 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb94c19d-031e-44b6-bdaa-39141d037b36-config-data" (OuterVolumeSpecName: "config-data") pod "fb94c19d-031e-44b6-bdaa-39141d037b36" (UID: "fb94c19d-031e-44b6-bdaa-39141d037b36"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:49:26 crc kubenswrapper[4886]: I0314 08:49:26.853755 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb94c19d-031e-44b6-bdaa-39141d037b36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:26 crc kubenswrapper[4886]: I0314 08:49:26.853786 4886 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fb94c19d-031e-44b6-bdaa-39141d037b36-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:26 crc kubenswrapper[4886]: I0314 08:49:26.853797 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tzvj\" (UniqueName: \"kubernetes.io/projected/fb94c19d-031e-44b6-bdaa-39141d037b36-kube-api-access-4tzvj\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:26 crc kubenswrapper[4886]: I0314 08:49:26.853811 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb94c19d-031e-44b6-bdaa-39141d037b36-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:26 crc kubenswrapper[4886]: I0314 08:49:26.879474 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-rc2gb" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.057620 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9083ef8e-f321-4442-871b-c82f908bd073-combined-ca-bundle\") pod \"9083ef8e-f321-4442-871b-c82f908bd073\" (UID: \"9083ef8e-f321-4442-871b-c82f908bd073\") " Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.057706 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9083ef8e-f321-4442-871b-c82f908bd073-config-data\") pod \"9083ef8e-f321-4442-871b-c82f908bd073\" (UID: \"9083ef8e-f321-4442-871b-c82f908bd073\") " Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.057809 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7nql\" (UniqueName: \"kubernetes.io/projected/9083ef8e-f321-4442-871b-c82f908bd073-kube-api-access-x7nql\") pod \"9083ef8e-f321-4442-871b-c82f908bd073\" (UID: \"9083ef8e-f321-4442-871b-c82f908bd073\") " Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.062001 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9083ef8e-f321-4442-871b-c82f908bd073-kube-api-access-x7nql" (OuterVolumeSpecName: "kube-api-access-x7nql") pod "9083ef8e-f321-4442-871b-c82f908bd073" (UID: "9083ef8e-f321-4442-871b-c82f908bd073"). InnerVolumeSpecName "kube-api-access-x7nql". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.084014 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9083ef8e-f321-4442-871b-c82f908bd073-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9083ef8e-f321-4442-871b-c82f908bd073" (UID: "9083ef8e-f321-4442-871b-c82f908bd073"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.112304 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9083ef8e-f321-4442-871b-c82f908bd073-config-data" (OuterVolumeSpecName: "config-data") pod "9083ef8e-f321-4442-871b-c82f908bd073" (UID: "9083ef8e-f321-4442-871b-c82f908bd073"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.160279 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7nql\" (UniqueName: \"kubernetes.io/projected/9083ef8e-f321-4442-871b-c82f908bd073-kube-api-access-x7nql\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.160334 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9083ef8e-f321-4442-871b-c82f908bd073-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.160345 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9083ef8e-f321-4442-871b-c82f908bd073-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.269589 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-jgb85" event={"ID":"fb94c19d-031e-44b6-bdaa-39141d037b36","Type":"ContainerDied","Data":"48700777e7fe3282706a49000785bf06acf488c5369aaf2b662b1e1364001082"} Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.269692 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48700777e7fe3282706a49000785bf06acf488c5369aaf2b662b1e1364001082" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.269626 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-jgb85" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.272197 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rc2gb" event={"ID":"9083ef8e-f321-4442-871b-c82f908bd073","Type":"ContainerDied","Data":"a7d4d09edb457682720b0f611fbec19b50efbc7d37f946b3f3922d5cb0abbcec"} Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.272279 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7d4d09edb457682720b0f611fbec19b50efbc7d37f946b3f3922d5cb0abbcec" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.272290 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-rc2gb" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.437380 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="755aaf87-8a64-479b-8a25-c16584e2a327" path="/var/lib/kubelet/pods/755aaf87-8a64-479b-8a25-c16584e2a327/volumes" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.544602 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-qnvnf"] Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.545238 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" podUID="84cb3b87-9994-4458-9e79-397787759551" containerName="dnsmasq-dns" containerID="cri-o://6b7e2e3e7f424ce723a8646fb1fffb273858b0aa03254af8a008f06f8f86dc35" gracePeriod=10 Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.613005 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-dvv52"] Mar 14 08:49:27 crc kubenswrapper[4886]: E0314 08:49:27.613495 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9083ef8e-f321-4442-871b-c82f908bd073" containerName="keystone-db-sync" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.613518 4886 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="9083ef8e-f321-4442-871b-c82f908bd073" containerName="keystone-db-sync" Mar 14 08:49:27 crc kubenswrapper[4886]: E0314 08:49:27.613535 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="755aaf87-8a64-479b-8a25-c16584e2a327" containerName="ovn-config" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.613544 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="755aaf87-8a64-479b-8a25-c16584e2a327" containerName="ovn-config" Mar 14 08:49:27 crc kubenswrapper[4886]: E0314 08:49:27.613567 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb94c19d-031e-44b6-bdaa-39141d037b36" containerName="watcher-db-sync" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.613576 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb94c19d-031e-44b6-bdaa-39141d037b36" containerName="watcher-db-sync" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.613809 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="755aaf87-8a64-479b-8a25-c16584e2a327" containerName="ovn-config" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.613840 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="9083ef8e-f321-4442-871b-c82f908bd073" containerName="keystone-db-sync" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.613854 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb94c19d-031e-44b6-bdaa-39141d037b36" containerName="watcher-db-sync" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.614947 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-dvv52" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.631998 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-dvv52"] Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.680294 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-4csbp"] Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.682666 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4csbp" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.700478 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.701811 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4csbp"] Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.704802 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.704956 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.705057 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.707719 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2prkb" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.782022 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-config\") pod \"dnsmasq-dns-6c9c9f998c-dvv52\" (UID: \"0630eb24-b78a-4a41-9f22-92361fd4cf45\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dvv52" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.782077 
4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-dvv52\" (UID: \"0630eb24-b78a-4a41-9f22-92361fd4cf45\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dvv52" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.782110 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-dvv52\" (UID: \"0630eb24-b78a-4a41-9f22-92361fd4cf45\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dvv52" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.782202 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwqgh\" (UniqueName: \"kubernetes.io/projected/0630eb24-b78a-4a41-9f22-92361fd4cf45-kube-api-access-cwqgh\") pod \"dnsmasq-dns-6c9c9f998c-dvv52\" (UID: \"0630eb24-b78a-4a41-9f22-92361fd4cf45\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dvv52" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.782328 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-dvv52\" (UID: \"0630eb24-b78a-4a41-9f22-92361fd4cf45\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dvv52" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.782398 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-dvv52\" (UID: \"0630eb24-b78a-4a41-9f22-92361fd4cf45\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dvv52" Mar 14 
08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.825194 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.842675 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.849576 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-tn7ss" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.854692 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.867918 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.884155 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-fernet-keys\") pod \"keystone-bootstrap-4csbp\" (UID: \"050cc40a-837e-4bc3-9274-726e234207a4\") " pod="openstack/keystone-bootstrap-4csbp" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.884210 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-combined-ca-bundle\") pod \"keystone-bootstrap-4csbp\" (UID: \"050cc40a-837e-4bc3-9274-726e234207a4\") " pod="openstack/keystone-bootstrap-4csbp" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.884259 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-config\") pod \"dnsmasq-dns-6c9c9f998c-dvv52\" (UID: \"0630eb24-b78a-4a41-9f22-92361fd4cf45\") " 
pod="openstack/dnsmasq-dns-6c9c9f998c-dvv52" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.884280 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-dvv52\" (UID: \"0630eb24-b78a-4a41-9f22-92361fd4cf45\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dvv52" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.884304 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-dvv52\" (UID: \"0630eb24-b78a-4a41-9f22-92361fd4cf45\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dvv52" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.884333 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-config-data\") pod \"keystone-bootstrap-4csbp\" (UID: \"050cc40a-837e-4bc3-9274-726e234207a4\") " pod="openstack/keystone-bootstrap-4csbp" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.884358 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwqgh\" (UniqueName: \"kubernetes.io/projected/0630eb24-b78a-4a41-9f22-92361fd4cf45-kube-api-access-cwqgh\") pod \"dnsmasq-dns-6c9c9f998c-dvv52\" (UID: \"0630eb24-b78a-4a41-9f22-92361fd4cf45\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dvv52" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.884382 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-credential-keys\") pod \"keystone-bootstrap-4csbp\" (UID: \"050cc40a-837e-4bc3-9274-726e234207a4\") " 
pod="openstack/keystone-bootstrap-4csbp" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.884404 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhncp\" (UniqueName: \"kubernetes.io/projected/050cc40a-837e-4bc3-9274-726e234207a4-kube-api-access-lhncp\") pod \"keystone-bootstrap-4csbp\" (UID: \"050cc40a-837e-4bc3-9274-726e234207a4\") " pod="openstack/keystone-bootstrap-4csbp" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.884446 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-dvv52\" (UID: \"0630eb24-b78a-4a41-9f22-92361fd4cf45\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dvv52" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.884490 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-dvv52\" (UID: \"0630eb24-b78a-4a41-9f22-92361fd4cf45\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dvv52" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.884517 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-scripts\") pod \"keystone-bootstrap-4csbp\" (UID: \"050cc40a-837e-4bc3-9274-726e234207a4\") " pod="openstack/keystone-bootstrap-4csbp" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.885437 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-config\") pod \"dnsmasq-dns-6c9c9f998c-dvv52\" (UID: \"0630eb24-b78a-4a41-9f22-92361fd4cf45\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dvv52" Mar 14 08:49:27 crc 
kubenswrapper[4886]: I0314 08:49:27.885994 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-dvv52\" (UID: \"0630eb24-b78a-4a41-9f22-92361fd4cf45\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dvv52" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.886522 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-dvv52\" (UID: \"0630eb24-b78a-4a41-9f22-92361fd4cf45\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dvv52" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.899789 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.900058 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-dvv52\" (UID: \"0630eb24-b78a-4a41-9f22-92361fd4cf45\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dvv52" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.900515 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-dvv52\" (UID: \"0630eb24-b78a-4a41-9f22-92361fd4cf45\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dvv52" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.902485 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.914117 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.921708 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.961708 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwqgh\" (UniqueName: \"kubernetes.io/projected/0630eb24-b78a-4a41-9f22-92361fd4cf45-kube-api-access-cwqgh\") pod \"dnsmasq-dns-6c9c9f998c-dvv52\" (UID: \"0630eb24-b78a-4a41-9f22-92361fd4cf45\") " pod="openstack/dnsmasq-dns-6c9c9f998c-dvv52" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.994058 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-fernet-keys\") pod \"keystone-bootstrap-4csbp\" (UID: \"050cc40a-837e-4bc3-9274-726e234207a4\") " pod="openstack/keystone-bootstrap-4csbp" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.994110 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54e651d7-c7c1-46d6-9097-f769d9e64d4e-config-data\") pod \"watcher-applier-0\" (UID: \"54e651d7-c7c1-46d6-9097-f769d9e64d4e\") " pod="openstack/watcher-applier-0" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.994146 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-combined-ca-bundle\") pod \"keystone-bootstrap-4csbp\" (UID: \"050cc40a-837e-4bc3-9274-726e234207a4\") " pod="openstack/keystone-bootstrap-4csbp" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.994180 4886 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e651d7-c7c1-46d6-9097-f769d9e64d4e-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"54e651d7-c7c1-46d6-9097-f769d9e64d4e\") " pod="openstack/watcher-applier-0" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.994204 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-logs\") pod \"watcher-decision-engine-0\" (UID: \"b41275dd-03d8-40b8-9f06-0dc67ecb12e6\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.994227 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7lx7\" (UniqueName: \"kubernetes.io/projected/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-kube-api-access-l7lx7\") pod \"watcher-decision-engine-0\" (UID: \"b41275dd-03d8-40b8-9f06-0dc67ecb12e6\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.994263 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-config-data\") pod \"keystone-bootstrap-4csbp\" (UID: \"050cc40a-837e-4bc3-9274-726e234207a4\") " pod="openstack/keystone-bootstrap-4csbp" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.994289 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54e651d7-c7c1-46d6-9097-f769d9e64d4e-logs\") pod \"watcher-applier-0\" (UID: \"54e651d7-c7c1-46d6-9097-f769d9e64d4e\") " pod="openstack/watcher-applier-0" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.994305 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-credential-keys\") pod \"keystone-bootstrap-4csbp\" (UID: \"050cc40a-837e-4bc3-9274-726e234207a4\") " pod="openstack/keystone-bootstrap-4csbp" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.994321 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhncp\" (UniqueName: \"kubernetes.io/projected/050cc40a-837e-4bc3-9274-726e234207a4-kube-api-access-lhncp\") pod \"keystone-bootstrap-4csbp\" (UID: \"050cc40a-837e-4bc3-9274-726e234207a4\") " pod="openstack/keystone-bootstrap-4csbp" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.994346 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k9wg\" (UniqueName: \"kubernetes.io/projected/54e651d7-c7c1-46d6-9097-f769d9e64d4e-kube-api-access-5k9wg\") pod \"watcher-applier-0\" (UID: \"54e651d7-c7c1-46d6-9097-f769d9e64d4e\") " pod="openstack/watcher-applier-0" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.994370 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"b41275dd-03d8-40b8-9f06-0dc67ecb12e6\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.994417 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-scripts\") pod \"keystone-bootstrap-4csbp\" (UID: \"050cc40a-837e-4bc3-9274-726e234207a4\") " pod="openstack/keystone-bootstrap-4csbp" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.994433 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-config-data\") pod \"watcher-decision-engine-0\" (UID: \"b41275dd-03d8-40b8-9f06-0dc67ecb12e6\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:49:27 crc kubenswrapper[4886]: I0314 08:49:27.994454 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"b41275dd-03d8-40b8-9f06-0dc67ecb12e6\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.014301 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-config-data\") pod \"keystone-bootstrap-4csbp\" (UID: \"050cc40a-837e-4bc3-9274-726e234207a4\") " pod="openstack/keystone-bootstrap-4csbp" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.014767 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-fernet-keys\") pod \"keystone-bootstrap-4csbp\" (UID: \"050cc40a-837e-4bc3-9274-726e234207a4\") " pod="openstack/keystone-bootstrap-4csbp" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.016013 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-sbjqr"] Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.016103 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-combined-ca-bundle\") pod \"keystone-bootstrap-4csbp\" (UID: \"050cc40a-837e-4bc3-9274-726e234207a4\") " pod="openstack/keystone-bootstrap-4csbp" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.017690 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-sbjqr" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.022623 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-credential-keys\") pod \"keystone-bootstrap-4csbp\" (UID: \"050cc40a-837e-4bc3-9274-726e234207a4\") " pod="openstack/keystone-bootstrap-4csbp" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.040290 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-57f8bb66ff-scjrm"] Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.043647 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57f8bb66ff-scjrm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.051304 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-scripts\") pod \"keystone-bootstrap-4csbp\" (UID: \"050cc40a-837e-4bc3-9274-726e234207a4\") " pod="openstack/keystone-bootstrap-4csbp" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.052569 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.052935 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.053193 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-sx7r8" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.053497 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.060643 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-sbjqr"] Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 
08:49:28.061065 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhncp\" (UniqueName: \"kubernetes.io/projected/050cc40a-837e-4bc3-9274-726e234207a4-kube-api-access-lhncp\") pod \"keystone-bootstrap-4csbp\" (UID: \"050cc40a-837e-4bc3-9274-726e234207a4\") " pod="openstack/keystone-bootstrap-4csbp" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.064249 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4csbp" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.068688 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-wxshv" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.068996 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.073421 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.081066 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-57f8bb66ff-scjrm"] Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.101045 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k9wg\" (UniqueName: \"kubernetes.io/projected/54e651d7-c7c1-46d6-9097-f769d9e64d4e-kube-api-access-5k9wg\") pod \"watcher-applier-0\" (UID: \"54e651d7-c7c1-46d6-9097-f769d9e64d4e\") " pod="openstack/watcher-applier-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.101103 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"b41275dd-03d8-40b8-9f06-0dc67ecb12e6\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 
08:49:28.101167 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-config-data\") pod \"watcher-decision-engine-0\" (UID: \"b41275dd-03d8-40b8-9f06-0dc67ecb12e6\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.101188 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"b41275dd-03d8-40b8-9f06-0dc67ecb12e6\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.101216 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54e651d7-c7c1-46d6-9097-f769d9e64d4e-config-data\") pod \"watcher-applier-0\" (UID: \"54e651d7-c7c1-46d6-9097-f769d9e64d4e\") " pod="openstack/watcher-applier-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.101241 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e651d7-c7c1-46d6-9097-f769d9e64d4e-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"54e651d7-c7c1-46d6-9097-f769d9e64d4e\") " pod="openstack/watcher-applier-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.101264 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-logs\") pod \"watcher-decision-engine-0\" (UID: \"b41275dd-03d8-40b8-9f06-0dc67ecb12e6\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.101289 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7lx7\" (UniqueName: 
\"kubernetes.io/projected/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-kube-api-access-l7lx7\") pod \"watcher-decision-engine-0\" (UID: \"b41275dd-03d8-40b8-9f06-0dc67ecb12e6\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.101331 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54e651d7-c7c1-46d6-9097-f769d9e64d4e-logs\") pod \"watcher-applier-0\" (UID: \"54e651d7-c7c1-46d6-9097-f769d9e64d4e\") " pod="openstack/watcher-applier-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.101692 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54e651d7-c7c1-46d6-9097-f769d9e64d4e-logs\") pod \"watcher-applier-0\" (UID: \"54e651d7-c7c1-46d6-9097-f769d9e64d4e\") " pod="openstack/watcher-applier-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.103988 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-logs\") pod \"watcher-decision-engine-0\" (UID: \"b41275dd-03d8-40b8-9f06-0dc67ecb12e6\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.105650 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"b41275dd-03d8-40b8-9f06-0dc67ecb12e6\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.111495 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e651d7-c7c1-46d6-9097-f769d9e64d4e-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"54e651d7-c7c1-46d6-9097-f769d9e64d4e\") " pod="openstack/watcher-applier-0" 
Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.126414 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.128511 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-config-data\") pod \"watcher-decision-engine-0\" (UID: \"b41275dd-03d8-40b8-9f06-0dc67ecb12e6\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.129564 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.130361 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"b41275dd-03d8-40b8-9f06-0dc67ecb12e6\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.131883 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54e651d7-c7c1-46d6-9097-f769d9e64d4e-config-data\") pod \"watcher-applier-0\" (UID: \"54e651d7-c7c1-46d6-9097-f769d9e64d4e\") " pod="openstack/watcher-applier-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.149607 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-6frh2"] Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.156043 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-6frh2" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.156603 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.164706 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.165104 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-wkpjr" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.168472 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.176250 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7lx7\" (UniqueName: \"kubernetes.io/projected/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-kube-api-access-l7lx7\") pod \"watcher-decision-engine-0\" (UID: \"b41275dd-03d8-40b8-9f06-0dc67ecb12e6\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.181544 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.187806 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k9wg\" (UniqueName: \"kubernetes.io/projected/54e651d7-c7c1-46d6-9097-f769d9e64d4e-kube-api-access-5k9wg\") pod \"watcher-applier-0\" (UID: \"54e651d7-c7c1-46d6-9097-f769d9e64d4e\") " pod="openstack/watcher-applier-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.203075 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/386f7c41-cb62-4ff1-bef7-11e4e8b14707-config\") pod \"neutron-db-sync-sbjqr\" (UID: \"386f7c41-cb62-4ff1-bef7-11e4e8b14707\") " pod="openstack/neutron-db-sync-sbjqr" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.203137 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3b9e706-7b47-409b-91a9-8457dfa315f1-config-data\") pod \"horizon-57f8bb66ff-scjrm\" (UID: \"a3b9e706-7b47-409b-91a9-8457dfa315f1\") " pod="openstack/horizon-57f8bb66ff-scjrm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.203187 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msfm5\" (UniqueName: \"kubernetes.io/projected/a3b9e706-7b47-409b-91a9-8457dfa315f1-kube-api-access-msfm5\") pod \"horizon-57f8bb66ff-scjrm\" (UID: \"a3b9e706-7b47-409b-91a9-8457dfa315f1\") " pod="openstack/horizon-57f8bb66ff-scjrm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.203254 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/386f7c41-cb62-4ff1-bef7-11e4e8b14707-combined-ca-bundle\") pod \"neutron-db-sync-sbjqr\" (UID: \"386f7c41-cb62-4ff1-bef7-11e4e8b14707\") " 
pod="openstack/neutron-db-sync-sbjqr" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.203278 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a3b9e706-7b47-409b-91a9-8457dfa315f1-horizon-secret-key\") pod \"horizon-57f8bb66ff-scjrm\" (UID: \"a3b9e706-7b47-409b-91a9-8457dfa315f1\") " pod="openstack/horizon-57f8bb66ff-scjrm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.203300 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlsqs\" (UniqueName: \"kubernetes.io/projected/386f7c41-cb62-4ff1-bef7-11e4e8b14707-kube-api-access-jlsqs\") pod \"neutron-db-sync-sbjqr\" (UID: \"386f7c41-cb62-4ff1-bef7-11e4e8b14707\") " pod="openstack/neutron-db-sync-sbjqr" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.203325 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3b9e706-7b47-409b-91a9-8457dfa315f1-scripts\") pod \"horizon-57f8bb66ff-scjrm\" (UID: \"a3b9e706-7b47-409b-91a9-8457dfa315f1\") " pod="openstack/horizon-57f8bb66ff-scjrm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.203357 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3b9e706-7b47-409b-91a9-8457dfa315f1-logs\") pod \"horizon-57f8bb66ff-scjrm\" (UID: \"a3b9e706-7b47-409b-91a9-8457dfa315f1\") " pod="openstack/horizon-57f8bb66ff-scjrm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.238676 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.243727 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.244170 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-dvv52" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.274187 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.305885 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3b9e706-7b47-409b-91a9-8457dfa315f1-config-data\") pod \"horizon-57f8bb66ff-scjrm\" (UID: \"a3b9e706-7b47-409b-91a9-8457dfa315f1\") " pod="openstack/horizon-57f8bb66ff-scjrm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.305943 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14718224-eaad-4caf-b13b-a60a9c2a9460-config-data\") pod \"cinder-db-sync-6frh2\" (UID: \"14718224-eaad-4caf-b13b-a60a9c2a9460\") " pod="openstack/cinder-db-sync-6frh2" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.305983 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msfm5\" (UniqueName: \"kubernetes.io/projected/a3b9e706-7b47-409b-91a9-8457dfa315f1-kube-api-access-msfm5\") pod \"horizon-57f8bb66ff-scjrm\" (UID: \"a3b9e706-7b47-409b-91a9-8457dfa315f1\") " pod="openstack/horizon-57f8bb66ff-scjrm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.306021 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4c90616-a536-4fab-911b-2fd02c52ef9d-config-data\") pod 
\"watcher-api-0\" (UID: \"e4c90616-a536-4fab-911b-2fd02c52ef9d\") " pod="openstack/watcher-api-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.306042 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c90616-a536-4fab-911b-2fd02c52ef9d-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"e4c90616-a536-4fab-911b-2fd02c52ef9d\") " pod="openstack/watcher-api-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.306062 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4c90616-a536-4fab-911b-2fd02c52ef9d-logs\") pod \"watcher-api-0\" (UID: \"e4c90616-a536-4fab-911b-2fd02c52ef9d\") " pod="openstack/watcher-api-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.306086 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4cjt\" (UniqueName: \"kubernetes.io/projected/14718224-eaad-4caf-b13b-a60a9c2a9460-kube-api-access-d4cjt\") pod \"cinder-db-sync-6frh2\" (UID: \"14718224-eaad-4caf-b13b-a60a9c2a9460\") " pod="openstack/cinder-db-sync-6frh2" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.306111 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/386f7c41-cb62-4ff1-bef7-11e4e8b14707-combined-ca-bundle\") pod \"neutron-db-sync-sbjqr\" (UID: \"386f7c41-cb62-4ff1-bef7-11e4e8b14707\") " pod="openstack/neutron-db-sync-sbjqr" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.306169 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e4c90616-a536-4fab-911b-2fd02c52ef9d-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"e4c90616-a536-4fab-911b-2fd02c52ef9d\") " 
pod="openstack/watcher-api-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.306188 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14718224-eaad-4caf-b13b-a60a9c2a9460-scripts\") pod \"cinder-db-sync-6frh2\" (UID: \"14718224-eaad-4caf-b13b-a60a9c2a9460\") " pod="openstack/cinder-db-sync-6frh2" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.306220 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a3b9e706-7b47-409b-91a9-8457dfa315f1-horizon-secret-key\") pod \"horizon-57f8bb66ff-scjrm\" (UID: \"a3b9e706-7b47-409b-91a9-8457dfa315f1\") " pod="openstack/horizon-57f8bb66ff-scjrm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.306245 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzdsp\" (UniqueName: \"kubernetes.io/projected/e4c90616-a536-4fab-911b-2fd02c52ef9d-kube-api-access-zzdsp\") pod \"watcher-api-0\" (UID: \"e4c90616-a536-4fab-911b-2fd02c52ef9d\") " pod="openstack/watcher-api-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.306263 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlsqs\" (UniqueName: \"kubernetes.io/projected/386f7c41-cb62-4ff1-bef7-11e4e8b14707-kube-api-access-jlsqs\") pod \"neutron-db-sync-sbjqr\" (UID: \"386f7c41-cb62-4ff1-bef7-11e4e8b14707\") " pod="openstack/neutron-db-sync-sbjqr" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.306282 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14718224-eaad-4caf-b13b-a60a9c2a9460-etc-machine-id\") pod \"cinder-db-sync-6frh2\" (UID: \"14718224-eaad-4caf-b13b-a60a9c2a9460\") " pod="openstack/cinder-db-sync-6frh2" Mar 14 08:49:28 crc 
kubenswrapper[4886]: I0314 08:49:28.306493 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3b9e706-7b47-409b-91a9-8457dfa315f1-scripts\") pod \"horizon-57f8bb66ff-scjrm\" (UID: \"a3b9e706-7b47-409b-91a9-8457dfa315f1\") " pod="openstack/horizon-57f8bb66ff-scjrm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.306522 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3b9e706-7b47-409b-91a9-8457dfa315f1-logs\") pod \"horizon-57f8bb66ff-scjrm\" (UID: \"a3b9e706-7b47-409b-91a9-8457dfa315f1\") " pod="openstack/horizon-57f8bb66ff-scjrm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.306545 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14718224-eaad-4caf-b13b-a60a9c2a9460-db-sync-config-data\") pod \"cinder-db-sync-6frh2\" (UID: \"14718224-eaad-4caf-b13b-a60a9c2a9460\") " pod="openstack/cinder-db-sync-6frh2" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.306567 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14718224-eaad-4caf-b13b-a60a9c2a9460-combined-ca-bundle\") pod \"cinder-db-sync-6frh2\" (UID: \"14718224-eaad-4caf-b13b-a60a9c2a9460\") " pod="openstack/cinder-db-sync-6frh2" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.306583 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/386f7c41-cb62-4ff1-bef7-11e4e8b14707-config\") pod \"neutron-db-sync-sbjqr\" (UID: \"386f7c41-cb62-4ff1-bef7-11e4e8b14707\") " pod="openstack/neutron-db-sync-sbjqr" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.309679 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/a3b9e706-7b47-409b-91a9-8457dfa315f1-config-data\") pod \"horizon-57f8bb66ff-scjrm\" (UID: \"a3b9e706-7b47-409b-91a9-8457dfa315f1\") " pod="openstack/horizon-57f8bb66ff-scjrm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.311955 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3b9e706-7b47-409b-91a9-8457dfa315f1-logs\") pod \"horizon-57f8bb66ff-scjrm\" (UID: \"a3b9e706-7b47-409b-91a9-8457dfa315f1\") " pod="openstack/horizon-57f8bb66ff-scjrm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.314195 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/386f7c41-cb62-4ff1-bef7-11e4e8b14707-config\") pod \"neutron-db-sync-sbjqr\" (UID: \"386f7c41-cb62-4ff1-bef7-11e4e8b14707\") " pod="openstack/neutron-db-sync-sbjqr" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.314958 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a3b9e706-7b47-409b-91a9-8457dfa315f1-horizon-secret-key\") pod \"horizon-57f8bb66ff-scjrm\" (UID: \"a3b9e706-7b47-409b-91a9-8457dfa315f1\") " pod="openstack/horizon-57f8bb66ff-scjrm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.315539 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/386f7c41-cb62-4ff1-bef7-11e4e8b14707-combined-ca-bundle\") pod \"neutron-db-sync-sbjqr\" (UID: \"386f7c41-cb62-4ff1-bef7-11e4e8b14707\") " pod="openstack/neutron-db-sync-sbjqr" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.315758 4886 generic.go:334] "Generic (PLEG): container finished" podID="84cb3b87-9994-4458-9e79-397787759551" containerID="6b7e2e3e7f424ce723a8646fb1fffb273858b0aa03254af8a008f06f8f86dc35" exitCode=0 Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 
08:49:28.315789 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" event={"ID":"84cb3b87-9994-4458-9e79-397787759551","Type":"ContainerDied","Data":"6b7e2e3e7f424ce723a8646fb1fffb273858b0aa03254af8a008f06f8f86dc35"} Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.322769 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3b9e706-7b47-409b-91a9-8457dfa315f1-scripts\") pod \"horizon-57f8bb66ff-scjrm\" (UID: \"a3b9e706-7b47-409b-91a9-8457dfa315f1\") " pod="openstack/horizon-57f8bb66ff-scjrm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.350951 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlsqs\" (UniqueName: \"kubernetes.io/projected/386f7c41-cb62-4ff1-bef7-11e4e8b14707-kube-api-access-jlsqs\") pod \"neutron-db-sync-sbjqr\" (UID: \"386f7c41-cb62-4ff1-bef7-11e4e8b14707\") " pod="openstack/neutron-db-sync-sbjqr" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.359688 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msfm5\" (UniqueName: \"kubernetes.io/projected/a3b9e706-7b47-409b-91a9-8457dfa315f1-kube-api-access-msfm5\") pod \"horizon-57f8bb66ff-scjrm\" (UID: \"a3b9e706-7b47-409b-91a9-8457dfa315f1\") " pod="openstack/horizon-57f8bb66ff-scjrm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.379179 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.381918 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.406509 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.406708 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.408615 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4c90616-a536-4fab-911b-2fd02c52ef9d-config-data\") pod \"watcher-api-0\" (UID: \"e4c90616-a536-4fab-911b-2fd02c52ef9d\") " pod="openstack/watcher-api-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.408671 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c90616-a536-4fab-911b-2fd02c52ef9d-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"e4c90616-a536-4fab-911b-2fd02c52ef9d\") " pod="openstack/watcher-api-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.408691 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4c90616-a536-4fab-911b-2fd02c52ef9d-logs\") pod \"watcher-api-0\" (UID: \"e4c90616-a536-4fab-911b-2fd02c52ef9d\") " pod="openstack/watcher-api-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.408716 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4cjt\" (UniqueName: \"kubernetes.io/projected/14718224-eaad-4caf-b13b-a60a9c2a9460-kube-api-access-d4cjt\") pod \"cinder-db-sync-6frh2\" (UID: \"14718224-eaad-4caf-b13b-a60a9c2a9460\") " pod="openstack/cinder-db-sync-6frh2" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.408753 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" 
(UniqueName: \"kubernetes.io/secret/e4c90616-a536-4fab-911b-2fd02c52ef9d-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"e4c90616-a536-4fab-911b-2fd02c52ef9d\") " pod="openstack/watcher-api-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.408769 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14718224-eaad-4caf-b13b-a60a9c2a9460-scripts\") pod \"cinder-db-sync-6frh2\" (UID: \"14718224-eaad-4caf-b13b-a60a9c2a9460\") " pod="openstack/cinder-db-sync-6frh2" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.408802 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzdsp\" (UniqueName: \"kubernetes.io/projected/e4c90616-a536-4fab-911b-2fd02c52ef9d-kube-api-access-zzdsp\") pod \"watcher-api-0\" (UID: \"e4c90616-a536-4fab-911b-2fd02c52ef9d\") " pod="openstack/watcher-api-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.408822 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14718224-eaad-4caf-b13b-a60a9c2a9460-etc-machine-id\") pod \"cinder-db-sync-6frh2\" (UID: \"14718224-eaad-4caf-b13b-a60a9c2a9460\") " pod="openstack/cinder-db-sync-6frh2" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.408915 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14718224-eaad-4caf-b13b-a60a9c2a9460-db-sync-config-data\") pod \"cinder-db-sync-6frh2\" (UID: \"14718224-eaad-4caf-b13b-a60a9c2a9460\") " pod="openstack/cinder-db-sync-6frh2" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.408944 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14718224-eaad-4caf-b13b-a60a9c2a9460-combined-ca-bundle\") pod \"cinder-db-sync-6frh2\" (UID: 
\"14718224-eaad-4caf-b13b-a60a9c2a9460\") " pod="openstack/cinder-db-sync-6frh2" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.408985 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14718224-eaad-4caf-b13b-a60a9c2a9460-config-data\") pod \"cinder-db-sync-6frh2\" (UID: \"14718224-eaad-4caf-b13b-a60a9c2a9460\") " pod="openstack/cinder-db-sync-6frh2" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.411557 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4c90616-a536-4fab-911b-2fd02c52ef9d-logs\") pod \"watcher-api-0\" (UID: \"e4c90616-a536-4fab-911b-2fd02c52ef9d\") " pod="openstack/watcher-api-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.414974 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14718224-eaad-4caf-b13b-a60a9c2a9460-etc-machine-id\") pod \"cinder-db-sync-6frh2\" (UID: \"14718224-eaad-4caf-b13b-a60a9c2a9460\") " pod="openstack/cinder-db-sync-6frh2" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.421013 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14718224-eaad-4caf-b13b-a60a9c2a9460-combined-ca-bundle\") pod \"cinder-db-sync-6frh2\" (UID: \"14718224-eaad-4caf-b13b-a60a9c2a9460\") " pod="openstack/cinder-db-sync-6frh2" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.433816 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4cjt\" (UniqueName: \"kubernetes.io/projected/14718224-eaad-4caf-b13b-a60a9c2a9460-kube-api-access-d4cjt\") pod \"cinder-db-sync-6frh2\" (UID: \"14718224-eaad-4caf-b13b-a60a9c2a9460\") " pod="openstack/cinder-db-sync-6frh2" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.439285 4886 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4c90616-a536-4fab-911b-2fd02c52ef9d-config-data\") pod \"watcher-api-0\" (UID: \"e4c90616-a536-4fab-911b-2fd02c52ef9d\") " pod="openstack/watcher-api-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.442104 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e4c90616-a536-4fab-911b-2fd02c52ef9d-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"e4c90616-a536-4fab-911b-2fd02c52ef9d\") " pod="openstack/watcher-api-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.447725 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6frh2"] Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.451787 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14718224-eaad-4caf-b13b-a60a9c2a9460-scripts\") pod \"cinder-db-sync-6frh2\" (UID: \"14718224-eaad-4caf-b13b-a60a9c2a9460\") " pod="openstack/cinder-db-sync-6frh2" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.451937 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14718224-eaad-4caf-b13b-a60a9c2a9460-db-sync-config-data\") pod \"cinder-db-sync-6frh2\" (UID: \"14718224-eaad-4caf-b13b-a60a9c2a9460\") " pod="openstack/cinder-db-sync-6frh2" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.453602 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzdsp\" (UniqueName: \"kubernetes.io/projected/e4c90616-a536-4fab-911b-2fd02c52ef9d-kube-api-access-zzdsp\") pod \"watcher-api-0\" (UID: \"e4c90616-a536-4fab-911b-2fd02c52ef9d\") " pod="openstack/watcher-api-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.457769 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/14718224-eaad-4caf-b13b-a60a9c2a9460-config-data\") pod \"cinder-db-sync-6frh2\" (UID: \"14718224-eaad-4caf-b13b-a60a9c2a9460\") " pod="openstack/cinder-db-sync-6frh2" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.463968 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c90616-a536-4fab-911b-2fd02c52ef9d-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"e4c90616-a536-4fab-911b-2fd02c52ef9d\") " pod="openstack/watcher-api-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.467591 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-sbjqr" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.494955 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.510293 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9eb9137-a021-4ea6-a4a4-871cf81af732-scripts\") pod \"ceilometer-0\" (UID: \"a9eb9137-a021-4ea6-a4a4-871cf81af732\") " pod="openstack/ceilometer-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.510334 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9eb9137-a021-4ea6-a4a4-871cf81af732-run-httpd\") pod \"ceilometer-0\" (UID: \"a9eb9137-a021-4ea6-a4a4-871cf81af732\") " pod="openstack/ceilometer-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.510370 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9eb9137-a021-4ea6-a4a4-871cf81af732-config-data\") pod \"ceilometer-0\" (UID: \"a9eb9137-a021-4ea6-a4a4-871cf81af732\") " pod="openstack/ceilometer-0" Mar 14 08:49:28 crc 
kubenswrapper[4886]: I0314 08:49:28.510395 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9eb9137-a021-4ea6-a4a4-871cf81af732-log-httpd\") pod \"ceilometer-0\" (UID: \"a9eb9137-a021-4ea6-a4a4-871cf81af732\") " pod="openstack/ceilometer-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.510428 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a9eb9137-a021-4ea6-a4a4-871cf81af732-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a9eb9137-a021-4ea6-a4a4-871cf81af732\") " pod="openstack/ceilometer-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.510452 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj2b4\" (UniqueName: \"kubernetes.io/projected/a9eb9137-a021-4ea6-a4a4-871cf81af732-kube-api-access-nj2b4\") pod \"ceilometer-0\" (UID: \"a9eb9137-a021-4ea6-a4a4-871cf81af732\") " pod="openstack/ceilometer-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.510473 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9eb9137-a021-4ea6-a4a4-871cf81af732-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a9eb9137-a021-4ea6-a4a4-871cf81af732\") " pod="openstack/ceilometer-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.515272 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57f8bb66ff-scjrm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.586078 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.587972 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-8wtzc"] Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.598379 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8wtzc" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.600076 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6frh2" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.612308 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9eb9137-a021-4ea6-a4a4-871cf81af732-scripts\") pod \"ceilometer-0\" (UID: \"a9eb9137-a021-4ea6-a4a4-871cf81af732\") " pod="openstack/ceilometer-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.612346 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9eb9137-a021-4ea6-a4a4-871cf81af732-run-httpd\") pod \"ceilometer-0\" (UID: \"a9eb9137-a021-4ea6-a4a4-871cf81af732\") " pod="openstack/ceilometer-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.612381 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9eb9137-a021-4ea6-a4a4-871cf81af732-config-data\") pod \"ceilometer-0\" (UID: \"a9eb9137-a021-4ea6-a4a4-871cf81af732\") " pod="openstack/ceilometer-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.612407 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9eb9137-a021-4ea6-a4a4-871cf81af732-log-httpd\") pod \"ceilometer-0\" (UID: \"a9eb9137-a021-4ea6-a4a4-871cf81af732\") " pod="openstack/ceilometer-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 
08:49:28.612436 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a9eb9137-a021-4ea6-a4a4-871cf81af732-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a9eb9137-a021-4ea6-a4a4-871cf81af732\") " pod="openstack/ceilometer-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.612459 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj2b4\" (UniqueName: \"kubernetes.io/projected/a9eb9137-a021-4ea6-a4a4-871cf81af732-kube-api-access-nj2b4\") pod \"ceilometer-0\" (UID: \"a9eb9137-a021-4ea6-a4a4-871cf81af732\") " pod="openstack/ceilometer-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.612481 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9eb9137-a021-4ea6-a4a4-871cf81af732-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a9eb9137-a021-4ea6-a4a4-871cf81af732\") " pod="openstack/ceilometer-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.613723 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9eb9137-a021-4ea6-a4a4-871cf81af732-run-httpd\") pod \"ceilometer-0\" (UID: \"a9eb9137-a021-4ea6-a4a4-871cf81af732\") " pod="openstack/ceilometer-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.614200 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9eb9137-a021-4ea6-a4a4-871cf81af732-log-httpd\") pod \"ceilometer-0\" (UID: \"a9eb9137-a021-4ea6-a4a4-871cf81af732\") " pod="openstack/ceilometer-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.619226 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9eb9137-a021-4ea6-a4a4-871cf81af732-combined-ca-bundle\") pod \"ceilometer-0\" 
(UID: \"a9eb9137-a021-4ea6-a4a4-871cf81af732\") " pod="openstack/ceilometer-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.619746 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a9eb9137-a021-4ea6-a4a4-871cf81af732-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a9eb9137-a021-4ea6-a4a4-871cf81af732\") " pod="openstack/ceilometer-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.626647 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.627349 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-lqdcq" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.628622 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9eb9137-a021-4ea6-a4a4-871cf81af732-scripts\") pod \"ceilometer-0\" (UID: \"a9eb9137-a021-4ea6-a4a4-871cf81af732\") " pod="openstack/ceilometer-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.629650 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-dvv52"] Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.631660 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9eb9137-a021-4ea6-a4a4-871cf81af732-config-data\") pod \"ceilometer-0\" (UID: \"a9eb9137-a021-4ea6-a4a4-871cf81af732\") " pod="openstack/ceilometer-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.679201 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj2b4\" (UniqueName: \"kubernetes.io/projected/a9eb9137-a021-4ea6-a4a4-871cf81af732-kube-api-access-nj2b4\") pod \"ceilometer-0\" (UID: \"a9eb9137-a021-4ea6-a4a4-871cf81af732\") " pod="openstack/ceilometer-0" Mar 14 08:49:28 crc 
kubenswrapper[4886]: I0314 08:49:28.685289 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-8wtzc"] Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.725935 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.727466 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqh2p\" (UniqueName: \"kubernetes.io/projected/e521aeb3-adb2-4042-ad11-33d749d5506b-kube-api-access-bqh2p\") pod \"barbican-db-sync-8wtzc\" (UID: \"e521aeb3-adb2-4042-ad11-33d749d5506b\") " pod="openstack/barbican-db-sync-8wtzc" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.727543 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e521aeb3-adb2-4042-ad11-33d749d5506b-combined-ca-bundle\") pod \"barbican-db-sync-8wtzc\" (UID: \"e521aeb3-adb2-4042-ad11-33d749d5506b\") " pod="openstack/barbican-db-sync-8wtzc" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.727564 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e521aeb3-adb2-4042-ad11-33d749d5506b-db-sync-config-data\") pod \"barbican-db-sync-8wtzc\" (UID: \"e521aeb3-adb2-4042-ad11-33d749d5506b\") " pod="openstack/barbican-db-sync-8wtzc" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.771258 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6dfdc8544f-8k2lm"] Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.772656 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6dfdc8544f-8k2lm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.804136 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-ctl2l"] Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.806137 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ctl2l" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.810639 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-xpzz5" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.810871 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.810992 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.830815 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e521aeb3-adb2-4042-ad11-33d749d5506b-combined-ca-bundle\") pod \"barbican-db-sync-8wtzc\" (UID: \"e521aeb3-adb2-4042-ad11-33d749d5506b\") " pod="openstack/barbican-db-sync-8wtzc" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.830867 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e521aeb3-adb2-4042-ad11-33d749d5506b-db-sync-config-data\") pod \"barbican-db-sync-8wtzc\" (UID: \"e521aeb3-adb2-4042-ad11-33d749d5506b\") " pod="openstack/barbican-db-sync-8wtzc" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.831059 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/34551f6d-235f-4fee-939a-450195242f3e-config-data\") pod \"horizon-6dfdc8544f-8k2lm\" (UID: 
\"34551f6d-235f-4fee-939a-450195242f3e\") " pod="openstack/horizon-6dfdc8544f-8k2lm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.832250 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25srx\" (UniqueName: \"kubernetes.io/projected/34551f6d-235f-4fee-939a-450195242f3e-kube-api-access-25srx\") pod \"horizon-6dfdc8544f-8k2lm\" (UID: \"34551f6d-235f-4fee-939a-450195242f3e\") " pod="openstack/horizon-6dfdc8544f-8k2lm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.832325 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/34551f6d-235f-4fee-939a-450195242f3e-scripts\") pod \"horizon-6dfdc8544f-8k2lm\" (UID: \"34551f6d-235f-4fee-939a-450195242f3e\") " pod="openstack/horizon-6dfdc8544f-8k2lm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.832360 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqh2p\" (UniqueName: \"kubernetes.io/projected/e521aeb3-adb2-4042-ad11-33d749d5506b-kube-api-access-bqh2p\") pod \"barbican-db-sync-8wtzc\" (UID: \"e521aeb3-adb2-4042-ad11-33d749d5506b\") " pod="openstack/barbican-db-sync-8wtzc" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.832437 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34551f6d-235f-4fee-939a-450195242f3e-logs\") pod \"horizon-6dfdc8544f-8k2lm\" (UID: \"34551f6d-235f-4fee-939a-450195242f3e\") " pod="openstack/horizon-6dfdc8544f-8k2lm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.832466 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/34551f6d-235f-4fee-939a-450195242f3e-horizon-secret-key\") pod \"horizon-6dfdc8544f-8k2lm\" (UID: 
\"34551f6d-235f-4fee-939a-450195242f3e\") " pod="openstack/horizon-6dfdc8544f-8k2lm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.865692 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e521aeb3-adb2-4042-ad11-33d749d5506b-combined-ca-bundle\") pod \"barbican-db-sync-8wtzc\" (UID: \"e521aeb3-adb2-4042-ad11-33d749d5506b\") " pod="openstack/barbican-db-sync-8wtzc" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.868302 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e521aeb3-adb2-4042-ad11-33d749d5506b-db-sync-config-data\") pod \"barbican-db-sync-8wtzc\" (UID: \"e521aeb3-adb2-4042-ad11-33d749d5506b\") " pod="openstack/barbican-db-sync-8wtzc" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.889148 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqh2p\" (UniqueName: \"kubernetes.io/projected/e521aeb3-adb2-4042-ad11-33d749d5506b-kube-api-access-bqh2p\") pod \"barbican-db-sync-8wtzc\" (UID: \"e521aeb3-adb2-4042-ad11-33d749d5506b\") " pod="openstack/barbican-db-sync-8wtzc" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.896517 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6dfdc8544f-8k2lm"] Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.920423 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-8wtzc" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.943536 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29f258e4-6012-4807-95a6-cce9ee5af3d8-config-data\") pod \"placement-db-sync-ctl2l\" (UID: \"29f258e4-6012-4807-95a6-cce9ee5af3d8\") " pod="openstack/placement-db-sync-ctl2l" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.943602 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29f258e4-6012-4807-95a6-cce9ee5af3d8-scripts\") pod \"placement-db-sync-ctl2l\" (UID: \"29f258e4-6012-4807-95a6-cce9ee5af3d8\") " pod="openstack/placement-db-sync-ctl2l" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.943632 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34551f6d-235f-4fee-939a-450195242f3e-logs\") pod \"horizon-6dfdc8544f-8k2lm\" (UID: \"34551f6d-235f-4fee-939a-450195242f3e\") " pod="openstack/horizon-6dfdc8544f-8k2lm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.943661 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/34551f6d-235f-4fee-939a-450195242f3e-horizon-secret-key\") pod \"horizon-6dfdc8544f-8k2lm\" (UID: \"34551f6d-235f-4fee-939a-450195242f3e\") " pod="openstack/horizon-6dfdc8544f-8k2lm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.943770 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/34551f6d-235f-4fee-939a-450195242f3e-config-data\") pod \"horizon-6dfdc8544f-8k2lm\" (UID: \"34551f6d-235f-4fee-939a-450195242f3e\") " pod="openstack/horizon-6dfdc8544f-8k2lm" Mar 14 08:49:28 crc kubenswrapper[4886]: 
I0314 08:49:28.943803 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29f258e4-6012-4807-95a6-cce9ee5af3d8-combined-ca-bundle\") pod \"placement-db-sync-ctl2l\" (UID: \"29f258e4-6012-4807-95a6-cce9ee5af3d8\") " pod="openstack/placement-db-sync-ctl2l" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.943832 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25srx\" (UniqueName: \"kubernetes.io/projected/34551f6d-235f-4fee-939a-450195242f3e-kube-api-access-25srx\") pod \"horizon-6dfdc8544f-8k2lm\" (UID: \"34551f6d-235f-4fee-939a-450195242f3e\") " pod="openstack/horizon-6dfdc8544f-8k2lm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.943864 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7kzj\" (UniqueName: \"kubernetes.io/projected/29f258e4-6012-4807-95a6-cce9ee5af3d8-kube-api-access-t7kzj\") pod \"placement-db-sync-ctl2l\" (UID: \"29f258e4-6012-4807-95a6-cce9ee5af3d8\") " pod="openstack/placement-db-sync-ctl2l" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.943904 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29f258e4-6012-4807-95a6-cce9ee5af3d8-logs\") pod \"placement-db-sync-ctl2l\" (UID: \"29f258e4-6012-4807-95a6-cce9ee5af3d8\") " pod="openstack/placement-db-sync-ctl2l" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.943935 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/34551f6d-235f-4fee-939a-450195242f3e-scripts\") pod \"horizon-6dfdc8544f-8k2lm\" (UID: \"34551f6d-235f-4fee-939a-450195242f3e\") " pod="openstack/horizon-6dfdc8544f-8k2lm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.944981 4886 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/34551f6d-235f-4fee-939a-450195242f3e-scripts\") pod \"horizon-6dfdc8544f-8k2lm\" (UID: \"34551f6d-235f-4fee-939a-450195242f3e\") " pod="openstack/horizon-6dfdc8544f-8k2lm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.945332 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34551f6d-235f-4fee-939a-450195242f3e-logs\") pod \"horizon-6dfdc8544f-8k2lm\" (UID: \"34551f6d-235f-4fee-939a-450195242f3e\") " pod="openstack/horizon-6dfdc8544f-8k2lm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.948503 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/34551f6d-235f-4fee-939a-450195242f3e-config-data\") pod \"horizon-6dfdc8544f-8k2lm\" (UID: \"34551f6d-235f-4fee-939a-450195242f3e\") " pod="openstack/horizon-6dfdc8544f-8k2lm" Mar 14 08:49:28 crc kubenswrapper[4886]: I0314 08:49:28.962653 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/34551f6d-235f-4fee-939a-450195242f3e-horizon-secret-key\") pod \"horizon-6dfdc8544f-8k2lm\" (UID: \"34551f6d-235f-4fee-939a-450195242f3e\") " pod="openstack/horizon-6dfdc8544f-8k2lm" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.005816 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-m899h"] Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.007527 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-m899h" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.008601 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25srx\" (UniqueName: \"kubernetes.io/projected/34551f6d-235f-4fee-939a-450195242f3e-kube-api-access-25srx\") pod \"horizon-6dfdc8544f-8k2lm\" (UID: \"34551f6d-235f-4fee-939a-450195242f3e\") " pod="openstack/horizon-6dfdc8544f-8k2lm" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.009102 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.025255 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-ctl2l"] Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.140562 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-config\") pod \"84cb3b87-9994-4458-9e79-397787759551\" (UID: \"84cb3b87-9994-4458-9e79-397787759551\") " Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.140983 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-dns-swift-storage-0\") pod \"84cb3b87-9994-4458-9e79-397787759551\" (UID: \"84cb3b87-9994-4458-9e79-397787759551\") " Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.141033 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-ovsdbserver-sb\") pod \"84cb3b87-9994-4458-9e79-397787759551\" (UID: \"84cb3b87-9994-4458-9e79-397787759551\") " Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.141066 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-ovsdbserver-nb\") pod \"84cb3b87-9994-4458-9e79-397787759551\" (UID: \"84cb3b87-9994-4458-9e79-397787759551\") " Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.141177 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcq6h\" (UniqueName: \"kubernetes.io/projected/84cb3b87-9994-4458-9e79-397787759551-kube-api-access-bcq6h\") pod \"84cb3b87-9994-4458-9e79-397787759551\" (UID: \"84cb3b87-9994-4458-9e79-397787759551\") " Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.141256 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-dns-svc\") pod \"84cb3b87-9994-4458-9e79-397787759551\" (UID: \"84cb3b87-9994-4458-9e79-397787759551\") " Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.169200 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29f258e4-6012-4807-95a6-cce9ee5af3d8-scripts\") pod \"placement-db-sync-ctl2l\" (UID: \"29f258e4-6012-4807-95a6-cce9ee5af3d8\") " pod="openstack/placement-db-sync-ctl2l" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.169272 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-m899h\" (UID: \"9a3ebe21-0432-4b40-8e80-5369de346831\") " pod="openstack/dnsmasq-dns-57c957c4ff-m899h" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.169348 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-m899h\" (UID: 
\"9a3ebe21-0432-4b40-8e80-5369de346831\") " pod="openstack/dnsmasq-dns-57c957c4ff-m899h" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.169383 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-m899h\" (UID: \"9a3ebe21-0432-4b40-8e80-5369de346831\") " pod="openstack/dnsmasq-dns-57c957c4ff-m899h" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.169521 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-config\") pod \"dnsmasq-dns-57c957c4ff-m899h\" (UID: \"9a3ebe21-0432-4b40-8e80-5369de346831\") " pod="openstack/dnsmasq-dns-57c957c4ff-m899h" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.169760 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pvbh\" (UniqueName: \"kubernetes.io/projected/9a3ebe21-0432-4b40-8e80-5369de346831-kube-api-access-8pvbh\") pod \"dnsmasq-dns-57c957c4ff-m899h\" (UID: \"9a3ebe21-0432-4b40-8e80-5369de346831\") " pod="openstack/dnsmasq-dns-57c957c4ff-m899h" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.169921 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29f258e4-6012-4807-95a6-cce9ee5af3d8-combined-ca-bundle\") pod \"placement-db-sync-ctl2l\" (UID: \"29f258e4-6012-4807-95a6-cce9ee5af3d8\") " pod="openstack/placement-db-sync-ctl2l" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.170023 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7kzj\" (UniqueName: \"kubernetes.io/projected/29f258e4-6012-4807-95a6-cce9ee5af3d8-kube-api-access-t7kzj\") pod \"placement-db-sync-ctl2l\" 
(UID: \"29f258e4-6012-4807-95a6-cce9ee5af3d8\") " pod="openstack/placement-db-sync-ctl2l" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.170060 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29f258e4-6012-4807-95a6-cce9ee5af3d8-logs\") pod \"placement-db-sync-ctl2l\" (UID: \"29f258e4-6012-4807-95a6-cce9ee5af3d8\") " pod="openstack/placement-db-sync-ctl2l" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.170080 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-m899h\" (UID: \"9a3ebe21-0432-4b40-8e80-5369de346831\") " pod="openstack/dnsmasq-dns-57c957c4ff-m899h" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.170314 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29f258e4-6012-4807-95a6-cce9ee5af3d8-config-data\") pod \"placement-db-sync-ctl2l\" (UID: \"29f258e4-6012-4807-95a6-cce9ee5af3d8\") " pod="openstack/placement-db-sync-ctl2l" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.194188 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29f258e4-6012-4807-95a6-cce9ee5af3d8-logs\") pod \"placement-db-sync-ctl2l\" (UID: \"29f258e4-6012-4807-95a6-cce9ee5af3d8\") " pod="openstack/placement-db-sync-ctl2l" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.194719 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-m899h"] Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.196997 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29f258e4-6012-4807-95a6-cce9ee5af3d8-scripts\") pod 
\"placement-db-sync-ctl2l\" (UID: \"29f258e4-6012-4807-95a6-cce9ee5af3d8\") " pod="openstack/placement-db-sync-ctl2l" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.205057 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29f258e4-6012-4807-95a6-cce9ee5af3d8-config-data\") pod \"placement-db-sync-ctl2l\" (UID: \"29f258e4-6012-4807-95a6-cce9ee5af3d8\") " pod="openstack/placement-db-sync-ctl2l" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.211534 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7kzj\" (UniqueName: \"kubernetes.io/projected/29f258e4-6012-4807-95a6-cce9ee5af3d8-kube-api-access-t7kzj\") pod \"placement-db-sync-ctl2l\" (UID: \"29f258e4-6012-4807-95a6-cce9ee5af3d8\") " pod="openstack/placement-db-sync-ctl2l" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.212182 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29f258e4-6012-4807-95a6-cce9ee5af3d8-combined-ca-bundle\") pod \"placement-db-sync-ctl2l\" (UID: \"29f258e4-6012-4807-95a6-cce9ee5af3d8\") " pod="openstack/placement-db-sync-ctl2l" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.217061 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ctl2l" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.232347 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84cb3b87-9994-4458-9e79-397787759551-kube-api-access-bcq6h" (OuterVolumeSpecName: "kube-api-access-bcq6h") pod "84cb3b87-9994-4458-9e79-397787759551" (UID: "84cb3b87-9994-4458-9e79-397787759551"). InnerVolumeSpecName "kube-api-access-bcq6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.273435 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-m899h\" (UID: \"9a3ebe21-0432-4b40-8e80-5369de346831\") " pod="openstack/dnsmasq-dns-57c957c4ff-m899h" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.273756 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-m899h\" (UID: \"9a3ebe21-0432-4b40-8e80-5369de346831\") " pod="openstack/dnsmasq-dns-57c957c4ff-m899h" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.273804 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-m899h\" (UID: \"9a3ebe21-0432-4b40-8e80-5369de346831\") " pod="openstack/dnsmasq-dns-57c957c4ff-m899h" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.274401 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-m899h\" (UID: \"9a3ebe21-0432-4b40-8e80-5369de346831\") " pod="openstack/dnsmasq-dns-57c957c4ff-m899h" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.274862 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-m899h\" (UID: \"9a3ebe21-0432-4b40-8e80-5369de346831\") " pod="openstack/dnsmasq-dns-57c957c4ff-m899h" 
Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.275172 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-m899h\" (UID: \"9a3ebe21-0432-4b40-8e80-5369de346831\") " pod="openstack/dnsmasq-dns-57c957c4ff-m899h" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.275291 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-m899h\" (UID: \"9a3ebe21-0432-4b40-8e80-5369de346831\") " pod="openstack/dnsmasq-dns-57c957c4ff-m899h" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.275355 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-config\") pod \"dnsmasq-dns-57c957c4ff-m899h\" (UID: \"9a3ebe21-0432-4b40-8e80-5369de346831\") " pod="openstack/dnsmasq-dns-57c957c4ff-m899h" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.275480 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pvbh\" (UniqueName: \"kubernetes.io/projected/9a3ebe21-0432-4b40-8e80-5369de346831-kube-api-access-8pvbh\") pod \"dnsmasq-dns-57c957c4ff-m899h\" (UID: \"9a3ebe21-0432-4b40-8e80-5369de346831\") " pod="openstack/dnsmasq-dns-57c957c4ff-m899h" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.275711 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcq6h\" (UniqueName: \"kubernetes.io/projected/84cb3b87-9994-4458-9e79-397787759551-kube-api-access-bcq6h\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.276148 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-config\") pod \"dnsmasq-dns-57c957c4ff-m899h\" (UID: \"9a3ebe21-0432-4b40-8e80-5369de346831\") " pod="openstack/dnsmasq-dns-57c957c4ff-m899h" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.279598 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-m899h\" (UID: \"9a3ebe21-0432-4b40-8e80-5369de346831\") " pod="openstack/dnsmasq-dns-57c957c4ff-m899h" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.286960 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6dfdc8544f-8k2lm" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.294232 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 08:49:29 crc kubenswrapper[4886]: E0314 08:49:29.294661 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84cb3b87-9994-4458-9e79-397787759551" containerName="init" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.294674 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="84cb3b87-9994-4458-9e79-397787759551" containerName="init" Mar 14 08:49:29 crc kubenswrapper[4886]: E0314 08:49:29.294717 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84cb3b87-9994-4458-9e79-397787759551" containerName="dnsmasq-dns" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.294724 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="84cb3b87-9994-4458-9e79-397787759551" containerName="dnsmasq-dns" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.294890 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="84cb3b87-9994-4458-9e79-397787759551" containerName="dnsmasq-dns" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.296996 4886 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.302188 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.302420 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.302484 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.305212 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-msq2v" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.314228 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pvbh\" (UniqueName: \"kubernetes.io/projected/9a3ebe21-0432-4b40-8e80-5369de346831-kube-api-access-8pvbh\") pod \"dnsmasq-dns-57c957c4ff-m899h\" (UID: \"9a3ebe21-0432-4b40-8e80-5369de346831\") " pod="openstack/dnsmasq-dns-57c957c4ff-m899h" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.314304 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.355684 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "84cb3b87-9994-4458-9e79-397787759551" (UID: "84cb3b87-9994-4458-9e79-397787759551"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.362577 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4csbp" event={"ID":"050cc40a-837e-4bc3-9274-726e234207a4","Type":"ContainerStarted","Data":"01f852b6a81dcc510fbcfe68f61eaac2dc9a0567dbf909c361abfb15d6bea59c"} Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.364188 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-config" (OuterVolumeSpecName: "config") pod "84cb3b87-9994-4458-9e79-397787759551" (UID: "84cb3b87-9994-4458-9e79-397787759551"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.377881 4886 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.377937 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.401765 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" event={"ID":"84cb3b87-9994-4458-9e79-397787759551","Type":"ContainerDied","Data":"a15966708334ca2fa5a3ea6aac2cb1b0eeca906bad7378fc41815c8845598a3e"} Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.401836 4886 scope.go:117] "RemoveContainer" containerID="6b7e2e3e7f424ce723a8646fb1fffb273858b0aa03254af8a008f06f8f86dc35" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.402150 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-qnvnf" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.412817 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-m899h" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.434821 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "84cb3b87-9994-4458-9e79-397787759551" (UID: "84cb3b87-9994-4458-9e79-397787759551"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.446376 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "84cb3b87-9994-4458-9e79-397787759551" (UID: "84cb3b87-9994-4458-9e79-397787759551"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.463132 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "84cb3b87-9994-4458-9e79-397787759551" (UID: "84cb3b87-9994-4458-9e79-397787759551"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.479162 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") " pod="openstack/glance-default-external-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.479277 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-scripts\") pod \"glance-default-external-api-0\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") " pod="openstack/glance-default-external-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.479340 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rwdp\" (UniqueName: \"kubernetes.io/projected/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-kube-api-access-8rwdp\") pod \"glance-default-external-api-0\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") " pod="openstack/glance-default-external-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.479380 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") " pod="openstack/glance-default-external-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.479420 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") " pod="openstack/glance-default-external-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.479471 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") " pod="openstack/glance-default-external-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.479490 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-logs\") pod \"glance-default-external-api-0\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") " pod="openstack/glance-default-external-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.479518 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") " pod="openstack/glance-default-external-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.479590 4886 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.479602 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.479611 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/84cb3b87-9994-4458-9e79-397787759551-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.499675 4886 scope.go:117] "RemoveContainer" containerID="315bd4d9831d8fa31bcea59b65819ab3ba2a95c5901ba078ae0562995915f572" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.507469 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4csbp"] Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.535236 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.536800 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.541308 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.541479 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.547270 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.612445 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-scripts\") pod \"glance-default-external-api-0\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") " pod="openstack/glance-default-external-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.612750 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rwdp\" (UniqueName: \"kubernetes.io/projected/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-kube-api-access-8rwdp\") pod 
\"glance-default-external-api-0\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") " pod="openstack/glance-default-external-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.612784 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") " pod="openstack/glance-default-external-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.612813 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-config-data\") pod \"glance-default-external-api-0\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") " pod="openstack/glance-default-external-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.612885 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") " pod="openstack/glance-default-external-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.612903 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-logs\") pod \"glance-default-external-api-0\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") " pod="openstack/glance-default-external-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.612930 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") " pod="openstack/glance-default-external-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.612979 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") " pod="openstack/glance-default-external-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.613377 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.614394 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-logs\") pod \"glance-default-external-api-0\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") " pod="openstack/glance-default-external-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.615007 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") " pod="openstack/glance-default-external-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.622932 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-scripts\") pod \"glance-default-external-api-0\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") " pod="openstack/glance-default-external-api-0" Mar 14 08:49:29 crc 
kubenswrapper[4886]: I0314 08:49:29.624155 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") " pod="openstack/glance-default-external-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.639903 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") " pod="openstack/glance-default-external-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.642715 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-config-data\") pod \"glance-default-external-api-0\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") " pod="openstack/glance-default-external-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.643584 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rwdp\" (UniqueName: \"kubernetes.io/projected/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-kube-api-access-8rwdp\") pod \"glance-default-external-api-0\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") " pod="openstack/glance-default-external-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.667947 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.700341 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") " 
pod="openstack/glance-default-external-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.704757 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-dvv52"] Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.714853 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4ede23dd-82b8-42ef-bdcd-d4be5637e457-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.715013 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ede23dd-82b8-42ef-bdcd-d4be5637e457-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.715077 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ede23dd-82b8-42ef-bdcd-d4be5637e457-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.715097 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ede23dd-82b8-42ef-bdcd-d4be5637e457-logs\") pod \"glance-default-internal-api-0\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.715249 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ede23dd-82b8-42ef-bdcd-d4be5637e457-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.715375 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.715403 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ede23dd-82b8-42ef-bdcd-d4be5637e457-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.715483 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvhf8\" (UniqueName: \"kubernetes.io/projected/4ede23dd-82b8-42ef-bdcd-d4be5637e457-kube-api-access-hvhf8\") pod \"glance-default-internal-api-0\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.817405 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ede23dd-82b8-42ef-bdcd-d4be5637e457-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.817929 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/4ede23dd-82b8-42ef-bdcd-d4be5637e457-logs\") pod \"glance-default-internal-api-0\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.817995 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ede23dd-82b8-42ef-bdcd-d4be5637e457-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.818052 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.818075 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ede23dd-82b8-42ef-bdcd-d4be5637e457-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.818144 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvhf8\" (UniqueName: \"kubernetes.io/projected/4ede23dd-82b8-42ef-bdcd-d4be5637e457-kube-api-access-hvhf8\") pod \"glance-default-internal-api-0\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.818224 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/4ede23dd-82b8-42ef-bdcd-d4be5637e457-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.818248 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ede23dd-82b8-42ef-bdcd-d4be5637e457-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.821650 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.822045 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ede23dd-82b8-42ef-bdcd-d4be5637e457-logs\") pod \"glance-default-internal-api-0\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.827598 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4ede23dd-82b8-42ef-bdcd-d4be5637e457-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.827801 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ede23dd-82b8-42ef-bdcd-d4be5637e457-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.828385 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ede23dd-82b8-42ef-bdcd-d4be5637e457-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.828792 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ede23dd-82b8-42ef-bdcd-d4be5637e457-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.829487 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ede23dd-82b8-42ef-bdcd-d4be5637e457-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.852578 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvhf8\" (UniqueName: \"kubernetes.io/projected/4ede23dd-82b8-42ef-bdcd-d4be5637e457-kube-api-access-hvhf8\") pod \"glance-default-internal-api-0\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.901191 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-57f8bb66ff-scjrm"] Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.905856 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: W0314 08:49:29.926703 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3b9e706_7b47_409b_91a9_8457dfa315f1.slice/crio-6b637c83a1f57e6337fbd21ae4f22a62d76cc971600acdeadb84b9246fcaa69e WatchSource:0}: Error finding container 6b637c83a1f57e6337fbd21ae4f22a62d76cc971600acdeadb84b9246fcaa69e: Status 404 returned error can't find the container with id 6b637c83a1f57e6337fbd21ae4f22a62d76cc971600acdeadb84b9246fcaa69e Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.927230 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.930647 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.953636 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-qnvnf"] Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.963747 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 08:49:29 crc kubenswrapper[4886]: I0314 08:49:29.973809 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-qnvnf"] Mar 14 08:49:30 crc kubenswrapper[4886]: I0314 08:49:30.424763 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57f8bb66ff-scjrm" event={"ID":"a3b9e706-7b47-409b-91a9-8457dfa315f1","Type":"ContainerStarted","Data":"6b637c83a1f57e6337fbd21ae4f22a62d76cc971600acdeadb84b9246fcaa69e"} Mar 14 08:49:30 crc kubenswrapper[4886]: I0314 08:49:30.430278 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"b41275dd-03d8-40b8-9f06-0dc67ecb12e6","Type":"ContainerStarted","Data":"3218e5ea026ad53abd6ba2970afb27d44ad4d20400d9589d591ae4d8fdac89a6"} Mar 14 08:49:30 crc kubenswrapper[4886]: I0314 08:49:30.443045 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"54e651d7-c7c1-46d6-9097-f769d9e64d4e","Type":"ContainerStarted","Data":"feac57f6227b7148b34b7e0c10ad442d30e4d6ae539c07b02ae2013658df4031"} Mar 14 08:49:30 crc kubenswrapper[4886]: I0314 08:49:30.446618 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4csbp" event={"ID":"050cc40a-837e-4bc3-9274-726e234207a4","Type":"ContainerStarted","Data":"e4e3d8542c0ff153c71f271b40046075bbe9c3157333be6695b54597867dafee"} Mar 14 08:49:30 crc kubenswrapper[4886]: I0314 08:49:30.451060 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-sbjqr"] Mar 14 08:49:30 crc kubenswrapper[4886]: I0314 08:49:30.480426 4886 generic.go:334] "Generic (PLEG): container finished" podID="0630eb24-b78a-4a41-9f22-92361fd4cf45" containerID="3de65d46058987ae458b6fdee9e29a449d24eed75c94d580ea3433b66c530d48" exitCode=0 Mar 14 08:49:30 crc kubenswrapper[4886]: I0314 08:49:30.480462 4886 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/dnsmasq-dns-6c9c9f998c-dvv52" event={"ID":"0630eb24-b78a-4a41-9f22-92361fd4cf45","Type":"ContainerDied","Data":"3de65d46058987ae458b6fdee9e29a449d24eed75c94d580ea3433b66c530d48"} Mar 14 08:49:30 crc kubenswrapper[4886]: I0314 08:49:30.480485 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-dvv52" event={"ID":"0630eb24-b78a-4a41-9f22-92361fd4cf45","Type":"ContainerStarted","Data":"699e0222667b8cd7a4c095292e43b0a093021d6132fcbd120a432e5f82d70b2d"} Mar 14 08:49:30 crc kubenswrapper[4886]: I0314 08:49:30.489457 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-8wtzc"] Mar 14 08:49:30 crc kubenswrapper[4886]: I0314 08:49:30.650074 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6frh2"] Mar 14 08:49:30 crc kubenswrapper[4886]: I0314 08:49:30.741166 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Mar 14 08:49:30 crc kubenswrapper[4886]: I0314 08:49:30.756692 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-ctl2l"] Mar 14 08:49:30 crc kubenswrapper[4886]: I0314 08:49:30.776667 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:49:30 crc kubenswrapper[4886]: I0314 08:49:30.785318 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-4csbp" podStartSLOduration=3.785295161 podStartE2EDuration="3.785295161s" podCreationTimestamp="2026-03-14 08:49:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:49:30.475074161 +0000 UTC m=+1305.723525808" watchObservedRunningTime="2026-03-14 08:49:30.785295161 +0000 UTC m=+1306.033746798" Mar 14 08:49:30 crc kubenswrapper[4886]: I0314 08:49:30.818385 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/horizon-6dfdc8544f-8k2lm"] Mar 14 08:49:30 crc kubenswrapper[4886]: I0314 08:49:30.862417 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-m899h"] Mar 14 08:49:30 crc kubenswrapper[4886]: I0314 08:49:30.873927 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 08:49:30 crc kubenswrapper[4886]: I0314 08:49:30.986985 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.218293 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.243613 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-dvv52" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.284001 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-57f8bb66ff-scjrm"] Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.306071 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-config\") pod \"0630eb24-b78a-4a41-9f22-92361fd4cf45\" (UID: \"0630eb24-b78a-4a41-9f22-92361fd4cf45\") " Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.307608 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-ovsdbserver-nb\") pod \"0630eb24-b78a-4a41-9f22-92361fd4cf45\" (UID: \"0630eb24-b78a-4a41-9f22-92361fd4cf45\") " Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.307690 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwqgh\" (UniqueName: 
\"kubernetes.io/projected/0630eb24-b78a-4a41-9f22-92361fd4cf45-kube-api-access-cwqgh\") pod \"0630eb24-b78a-4a41-9f22-92361fd4cf45\" (UID: \"0630eb24-b78a-4a41-9f22-92361fd4cf45\") " Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.307810 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-dns-svc\") pod \"0630eb24-b78a-4a41-9f22-92361fd4cf45\" (UID: \"0630eb24-b78a-4a41-9f22-92361fd4cf45\") " Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.308005 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-dns-swift-storage-0\") pod \"0630eb24-b78a-4a41-9f22-92361fd4cf45\" (UID: \"0630eb24-b78a-4a41-9f22-92361fd4cf45\") " Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.308173 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-ovsdbserver-sb\") pod \"0630eb24-b78a-4a41-9f22-92361fd4cf45\" (UID: \"0630eb24-b78a-4a41-9f22-92361fd4cf45\") " Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.323875 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.361291 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0630eb24-b78a-4a41-9f22-92361fd4cf45-kube-api-access-cwqgh" (OuterVolumeSpecName: "kube-api-access-cwqgh") pod "0630eb24-b78a-4a41-9f22-92361fd4cf45" (UID: "0630eb24-b78a-4a41-9f22-92361fd4cf45"). InnerVolumeSpecName "kube-api-access-cwqgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.377888 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0630eb24-b78a-4a41-9f22-92361fd4cf45" (UID: "0630eb24-b78a-4a41-9f22-92361fd4cf45"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.412565 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwqgh\" (UniqueName: \"kubernetes.io/projected/0630eb24-b78a-4a41-9f22-92361fd4cf45-kube-api-access-cwqgh\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.412763 4886 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.434620 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0630eb24-b78a-4a41-9f22-92361fd4cf45" (UID: "0630eb24-b78a-4a41-9f22-92361fd4cf45"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.488361 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0630eb24-b78a-4a41-9f22-92361fd4cf45" (UID: "0630eb24-b78a-4a41-9f22-92361fd4cf45"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.518855 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.518892 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.547324 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-config" (OuterVolumeSpecName: "config") pod "0630eb24-b78a-4a41-9f22-92361fd4cf45" (UID: "0630eb24-b78a-4a41-9f22-92361fd4cf45"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.620312 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.622664 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0630eb24-b78a-4a41-9f22-92361fd4cf45" (UID: "0630eb24-b78a-4a41-9f22-92361fd4cf45"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.721661 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84cb3b87-9994-4458-9e79-397787759551" path="/var/lib/kubelet/pods/84cb3b87-9994-4458-9e79-397787759551/volumes" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.721920 4886 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0630eb24-b78a-4a41-9f22-92361fd4cf45-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.722564 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7496c4d65c-dg8pn"] Mar 14 08:49:31 crc kubenswrapper[4886]: E0314 08:49:31.722868 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0630eb24-b78a-4a41-9f22-92361fd4cf45" containerName="init" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.722880 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="0630eb24-b78a-4a41-9f22-92361fd4cf45" containerName="init" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.723035 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="0630eb24-b78a-4a41-9f22-92361fd4cf45" containerName="init" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.724226 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7496c4d65c-dg8pn"] Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.724243 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.724256 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.724330 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7496c4d65c-dg8pn" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.811603 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6dfdc8544f-8k2lm" event={"ID":"34551f6d-235f-4fee-939a-450195242f3e","Type":"ContainerStarted","Data":"8288528be5d08cd63b86dbed38f8189a24de0c59c4b509abb10d876e480a094e"} Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.828236 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6624fe29-e15e-4474-a2d9-37489c04e1b6-horizon-secret-key\") pod \"horizon-7496c4d65c-dg8pn\" (UID: \"6624fe29-e15e-4474-a2d9-37489c04e1b6\") " pod="openstack/horizon-7496c4d65c-dg8pn" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.828294 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6624fe29-e15e-4474-a2d9-37489c04e1b6-config-data\") pod \"horizon-7496c4d65c-dg8pn\" (UID: \"6624fe29-e15e-4474-a2d9-37489c04e1b6\") " pod="openstack/horizon-7496c4d65c-dg8pn" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.828508 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85kdb\" (UniqueName: \"kubernetes.io/projected/6624fe29-e15e-4474-a2d9-37489c04e1b6-kube-api-access-85kdb\") pod \"horizon-7496c4d65c-dg8pn\" (UID: \"6624fe29-e15e-4474-a2d9-37489c04e1b6\") " pod="openstack/horizon-7496c4d65c-dg8pn" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.828539 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6624fe29-e15e-4474-a2d9-37489c04e1b6-logs\") pod \"horizon-7496c4d65c-dg8pn\" (UID: \"6624fe29-e15e-4474-a2d9-37489c04e1b6\") " pod="openstack/horizon-7496c4d65c-dg8pn" Mar 14 08:49:31 crc 
kubenswrapper[4886]: I0314 08:49:31.828584 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6624fe29-e15e-4474-a2d9-37489c04e1b6-scripts\") pod \"horizon-7496c4d65c-dg8pn\" (UID: \"6624fe29-e15e-4474-a2d9-37489c04e1b6\") " pod="openstack/horizon-7496c4d65c-dg8pn" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.833195 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9eb9137-a021-4ea6-a4a4-871cf81af732","Type":"ContainerStarted","Data":"e5049349eabc89c825bd59285fb1f4d53dce6f041339804f18c3a3d3c39ca606"} Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.893610 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-dvv52" event={"ID":"0630eb24-b78a-4a41-9f22-92361fd4cf45","Type":"ContainerDied","Data":"699e0222667b8cd7a4c095292e43b0a093021d6132fcbd120a432e5f82d70b2d"} Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.893668 4886 scope.go:117] "RemoveContainer" containerID="3de65d46058987ae458b6fdee9e29a449d24eed75c94d580ea3433b66c530d48" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.893836 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-dvv52" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.929140 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"e4c90616-a536-4fab-911b-2fd02c52ef9d","Type":"ContainerStarted","Data":"dabf869885aa506944203f0c77528af1e2ee5750948366f2981c803617bca2e0"} Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.929211 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"e4c90616-a536-4fab-911b-2fd02c52ef9d","Type":"ContainerStarted","Data":"c94b649e1f586fba8e8066b671afc9d627c4903cdf4bcdaae10f4ef90f57688a"} Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.939966 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6624fe29-e15e-4474-a2d9-37489c04e1b6-horizon-secret-key\") pod \"horizon-7496c4d65c-dg8pn\" (UID: \"6624fe29-e15e-4474-a2d9-37489c04e1b6\") " pod="openstack/horizon-7496c4d65c-dg8pn" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.940018 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6624fe29-e15e-4474-a2d9-37489c04e1b6-config-data\") pod \"horizon-7496c4d65c-dg8pn\" (UID: \"6624fe29-e15e-4474-a2d9-37489c04e1b6\") " pod="openstack/horizon-7496c4d65c-dg8pn" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.940166 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85kdb\" (UniqueName: \"kubernetes.io/projected/6624fe29-e15e-4474-a2d9-37489c04e1b6-kube-api-access-85kdb\") pod \"horizon-7496c4d65c-dg8pn\" (UID: \"6624fe29-e15e-4474-a2d9-37489c04e1b6\") " pod="openstack/horizon-7496c4d65c-dg8pn" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.940207 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/6624fe29-e15e-4474-a2d9-37489c04e1b6-logs\") pod \"horizon-7496c4d65c-dg8pn\" (UID: \"6624fe29-e15e-4474-a2d9-37489c04e1b6\") " pod="openstack/horizon-7496c4d65c-dg8pn" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.940244 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6624fe29-e15e-4474-a2d9-37489c04e1b6-scripts\") pod \"horizon-7496c4d65c-dg8pn\" (UID: \"6624fe29-e15e-4474-a2d9-37489c04e1b6\") " pod="openstack/horizon-7496c4d65c-dg8pn" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.940922 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6624fe29-e15e-4474-a2d9-37489c04e1b6-logs\") pod \"horizon-7496c4d65c-dg8pn\" (UID: \"6624fe29-e15e-4474-a2d9-37489c04e1b6\") " pod="openstack/horizon-7496c4d65c-dg8pn" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.941095 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6624fe29-e15e-4474-a2d9-37489c04e1b6-scripts\") pod \"horizon-7496c4d65c-dg8pn\" (UID: \"6624fe29-e15e-4474-a2d9-37489c04e1b6\") " pod="openstack/horizon-7496c4d65c-dg8pn" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.941563 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6624fe29-e15e-4474-a2d9-37489c04e1b6-config-data\") pod \"horizon-7496c4d65c-dg8pn\" (UID: \"6624fe29-e15e-4474-a2d9-37489c04e1b6\") " pod="openstack/horizon-7496c4d65c-dg8pn" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.948953 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6624fe29-e15e-4474-a2d9-37489c04e1b6-horizon-secret-key\") pod \"horizon-7496c4d65c-dg8pn\" (UID: \"6624fe29-e15e-4474-a2d9-37489c04e1b6\") " 
pod="openstack/horizon-7496c4d65c-dg8pn" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.978745 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85kdb\" (UniqueName: \"kubernetes.io/projected/6624fe29-e15e-4474-a2d9-37489c04e1b6-kube-api-access-85kdb\") pod \"horizon-7496c4d65c-dg8pn\" (UID: \"6624fe29-e15e-4474-a2d9-37489c04e1b6\") " pod="openstack/horizon-7496c4d65c-dg8pn" Mar 14 08:49:31 crc kubenswrapper[4886]: I0314 08:49:31.982976 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e8a040e4-11ae-4ffb-94dd-53e5f9962d53","Type":"ContainerStarted","Data":"fa25d003d5f7a360ae3344d569bb0943d575c486ed5fbea5939a1f6e63b4c4fa"} Mar 14 08:49:32 crc kubenswrapper[4886]: I0314 08:49:32.018844 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-sbjqr" event={"ID":"386f7c41-cb62-4ff1-bef7-11e4e8b14707","Type":"ContainerStarted","Data":"91110068c4fd7023df65e5b9e10f4ea94405c81cb5051853db689bc531c52836"} Mar 14 08:49:32 crc kubenswrapper[4886]: I0314 08:49:32.018910 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-sbjqr" event={"ID":"386f7c41-cb62-4ff1-bef7-11e4e8b14707","Type":"ContainerStarted","Data":"21d7519fcfc54ce2b294bb7c245165c5cdcb7b29f1ac60ba9b0ce9511baef2c4"} Mar 14 08:49:32 crc kubenswrapper[4886]: I0314 08:49:32.029902 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8wtzc" event={"ID":"e521aeb3-adb2-4042-ad11-33d749d5506b","Type":"ContainerStarted","Data":"14532d4740a3a1638ccea934054fdead6a2fb3350d314eda5705f6f49cc838f9"} Mar 14 08:49:32 crc kubenswrapper[4886]: I0314 08:49:32.042377 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-sbjqr" podStartSLOduration=5.042343875 podStartE2EDuration="5.042343875s" podCreationTimestamp="2026-03-14 08:49:27 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:49:32.041480501 +0000 UTC m=+1307.289932138" watchObservedRunningTime="2026-03-14 08:49:32.042343875 +0000 UTC m=+1307.290795512" Mar 14 08:49:32 crc kubenswrapper[4886]: I0314 08:49:32.045198 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ctl2l" event={"ID":"29f258e4-6012-4807-95a6-cce9ee5af3d8","Type":"ContainerStarted","Data":"3b56229fbbd846fd6cf20640a98455cc0efe1826ecc0aee478b1ea5b165809d3"} Mar 14 08:49:32 crc kubenswrapper[4886]: I0314 08:49:32.088652 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-m899h" event={"ID":"9a3ebe21-0432-4b40-8e80-5369de346831","Type":"ContainerStarted","Data":"aaf1e8077c9835a1b104157a36cf8544d0fb0152ff1f2c03566426e11c98b7f5"} Mar 14 08:49:32 crc kubenswrapper[4886]: I0314 08:49:32.117682 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7496c4d65c-dg8pn" Mar 14 08:49:32 crc kubenswrapper[4886]: I0314 08:49:32.149597 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6frh2" event={"ID":"14718224-eaad-4caf-b13b-a60a9c2a9460","Type":"ContainerStarted","Data":"ff144da160e7ce05c94bd894f4fbbd4be1875b6704b538dee35c710ce46e4f39"} Mar 14 08:49:32 crc kubenswrapper[4886]: I0314 08:49:32.168960 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4ede23dd-82b8-42ef-bdcd-d4be5637e457","Type":"ContainerStarted","Data":"bfca6d207ff3da65f25d2a68af19eb38af712746dd8b0fe2cc883e0759410c6c"} Mar 14 08:49:32 crc kubenswrapper[4886]: I0314 08:49:32.190270 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-dvv52"] Mar 14 08:49:32 crc kubenswrapper[4886]: I0314 08:49:32.213054 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-dvv52"] Mar 14 
08:49:32 crc kubenswrapper[4886]: I0314 08:49:32.711523 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7496c4d65c-dg8pn"] Mar 14 08:49:33 crc kubenswrapper[4886]: I0314 08:49:33.191721 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e8a040e4-11ae-4ffb-94dd-53e5f9962d53","Type":"ContainerStarted","Data":"ea4cbf651a05652941e687ad176520ffc9e5ebb127476b8d7e47edfee59ef5d9"} Mar 14 08:49:33 crc kubenswrapper[4886]: I0314 08:49:33.195105 4886 generic.go:334] "Generic (PLEG): container finished" podID="9a3ebe21-0432-4b40-8e80-5369de346831" containerID="86486c9f958fd6f5cf344f52aa0002d88494ee76f66bcff3448ce6ec92deec01" exitCode=0 Mar 14 08:49:33 crc kubenswrapper[4886]: I0314 08:49:33.195162 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-m899h" event={"ID":"9a3ebe21-0432-4b40-8e80-5369de346831","Type":"ContainerDied","Data":"86486c9f958fd6f5cf344f52aa0002d88494ee76f66bcff3448ce6ec92deec01"} Mar 14 08:49:33 crc kubenswrapper[4886]: I0314 08:49:33.198428 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4ede23dd-82b8-42ef-bdcd-d4be5637e457","Type":"ContainerStarted","Data":"5d4f33cc733c4c780428838b0c1aa93986d723691a5fb2a33eb68f42aa298bee"} Mar 14 08:49:33 crc kubenswrapper[4886]: I0314 08:49:33.201860 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"e4c90616-a536-4fab-911b-2fd02c52ef9d","Type":"ContainerStarted","Data":"b7605021f0f98c16c58d134e47b74276c8417dbb6ed58cc20dce2d6d34263450"} Mar 14 08:49:33 crc kubenswrapper[4886]: I0314 08:49:33.201967 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="e4c90616-a536-4fab-911b-2fd02c52ef9d" containerName="watcher-api-log" containerID="cri-o://dabf869885aa506944203f0c77528af1e2ee5750948366f2981c803617bca2e0" 
gracePeriod=30 Mar 14 08:49:33 crc kubenswrapper[4886]: I0314 08:49:33.202067 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="e4c90616-a536-4fab-911b-2fd02c52ef9d" containerName="watcher-api" containerID="cri-o://b7605021f0f98c16c58d134e47b74276c8417dbb6ed58cc20dce2d6d34263450" gracePeriod=30 Mar 14 08:49:33 crc kubenswrapper[4886]: I0314 08:49:33.202330 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Mar 14 08:49:33 crc kubenswrapper[4886]: I0314 08:49:33.216556 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="e4c90616-a536-4fab-911b-2fd02c52ef9d" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.158:9322/\": EOF" Mar 14 08:49:33 crc kubenswrapper[4886]: I0314 08:49:33.238444 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=6.238425917 podStartE2EDuration="6.238425917s" podCreationTimestamp="2026-03-14 08:49:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:49:33.2374311 +0000 UTC m=+1308.485882737" watchObservedRunningTime="2026-03-14 08:49:33.238425917 +0000 UTC m=+1308.486877554" Mar 14 08:49:33 crc kubenswrapper[4886]: I0314 08:49:33.434597 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0630eb24-b78a-4a41-9f22-92361fd4cf45" path="/var/lib/kubelet/pods/0630eb24-b78a-4a41-9f22-92361fd4cf45/volumes" Mar 14 08:49:33 crc kubenswrapper[4886]: I0314 08:49:33.586677 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Mar 14 08:49:34 crc kubenswrapper[4886]: I0314 08:49:34.222547 4886 generic.go:334] "Generic (PLEG): container finished" podID="e4c90616-a536-4fab-911b-2fd02c52ef9d" 
containerID="dabf869885aa506944203f0c77528af1e2ee5750948366f2981c803617bca2e0" exitCode=143 Mar 14 08:49:34 crc kubenswrapper[4886]: I0314 08:49:34.222705 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"e4c90616-a536-4fab-911b-2fd02c52ef9d","Type":"ContainerDied","Data":"dabf869885aa506944203f0c77528af1e2ee5750948366f2981c803617bca2e0"} Mar 14 08:49:35 crc kubenswrapper[4886]: I0314 08:49:35.253833 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-m899h" event={"ID":"9a3ebe21-0432-4b40-8e80-5369de346831","Type":"ContainerStarted","Data":"92840dcb3f9dd224bfb38b1476258236f280a5f4b9633eed06d3b2e17caeddc9"} Mar 14 08:49:35 crc kubenswrapper[4886]: I0314 08:49:35.254632 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57c957c4ff-m899h" Mar 14 08:49:35 crc kubenswrapper[4886]: I0314 08:49:35.469939 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57c957c4ff-m899h" podStartSLOduration=7.469915077 podStartE2EDuration="7.469915077s" podCreationTimestamp="2026-03-14 08:49:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:49:35.283320933 +0000 UTC m=+1310.531772570" watchObservedRunningTime="2026-03-14 08:49:35.469915077 +0000 UTC m=+1310.718366714" Mar 14 08:49:35 crc kubenswrapper[4886]: W0314 08:49:35.949502 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6624fe29_e15e_4474_a2d9_37489c04e1b6.slice/crio-a11d66d46dc65e4e38510e21078f86e2daf413f71c76931aa4b7cc33c8704854 WatchSource:0}: Error finding container a11d66d46dc65e4e38510e21078f86e2daf413f71c76931aa4b7cc33c8704854: Status 404 returned error can't find the container with id a11d66d46dc65e4e38510e21078f86e2daf413f71c76931aa4b7cc33c8704854 Mar 14 
08:49:36 crc kubenswrapper[4886]: I0314 08:49:36.266723 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7496c4d65c-dg8pn" event={"ID":"6624fe29-e15e-4474-a2d9-37489c04e1b6","Type":"ContainerStarted","Data":"a11d66d46dc65e4e38510e21078f86e2daf413f71c76931aa4b7cc33c8704854"} Mar 14 08:49:37 crc kubenswrapper[4886]: I0314 08:49:37.288879 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4csbp" event={"ID":"050cc40a-837e-4bc3-9274-726e234207a4","Type":"ContainerDied","Data":"e4e3d8542c0ff153c71f271b40046075bbe9c3157333be6695b54597867dafee"} Mar 14 08:49:37 crc kubenswrapper[4886]: I0314 08:49:37.288831 4886 generic.go:334] "Generic (PLEG): container finished" podID="050cc40a-837e-4bc3-9274-726e234207a4" containerID="e4e3d8542c0ff153c71f271b40046075bbe9c3157333be6695b54597867dafee" exitCode=0 Mar 14 08:49:37 crc kubenswrapper[4886]: I0314 08:49:37.772430 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6dfdc8544f-8k2lm"] Mar 14 08:49:37 crc kubenswrapper[4886]: I0314 08:49:37.823778 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-66c6bc56b6-25jn4"] Mar 14 08:49:37 crc kubenswrapper[4886]: I0314 08:49:37.825823 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-66c6bc56b6-25jn4" Mar 14 08:49:37 crc kubenswrapper[4886]: I0314 08:49:37.833229 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 14 08:49:37 crc kubenswrapper[4886]: I0314 08:49:37.836681 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-66c6bc56b6-25jn4"] Mar 14 08:49:37 crc kubenswrapper[4886]: I0314 08:49:37.902251 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7496c4d65c-dg8pn"] Mar 14 08:49:37 crc kubenswrapper[4886]: I0314 08:49:37.929132 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3f8100ac-c606-4eb3-afd6-07be9de44f42-horizon-secret-key\") pod \"horizon-66c6bc56b6-25jn4\" (UID: \"3f8100ac-c606-4eb3-afd6-07be9de44f42\") " pod="openstack/horizon-66c6bc56b6-25jn4" Mar 14 08:49:37 crc kubenswrapper[4886]: I0314 08:49:37.929193 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvwmh\" (UniqueName: \"kubernetes.io/projected/3f8100ac-c606-4eb3-afd6-07be9de44f42-kube-api-access-xvwmh\") pod \"horizon-66c6bc56b6-25jn4\" (UID: \"3f8100ac-c606-4eb3-afd6-07be9de44f42\") " pod="openstack/horizon-66c6bc56b6-25jn4" Mar 14 08:49:37 crc kubenswrapper[4886]: I0314 08:49:37.929319 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f8100ac-c606-4eb3-afd6-07be9de44f42-horizon-tls-certs\") pod \"horizon-66c6bc56b6-25jn4\" (UID: \"3f8100ac-c606-4eb3-afd6-07be9de44f42\") " pod="openstack/horizon-66c6bc56b6-25jn4" Mar 14 08:49:37 crc kubenswrapper[4886]: I0314 08:49:37.929374 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/3f8100ac-c606-4eb3-afd6-07be9de44f42-scripts\") pod \"horizon-66c6bc56b6-25jn4\" (UID: \"3f8100ac-c606-4eb3-afd6-07be9de44f42\") " pod="openstack/horizon-66c6bc56b6-25jn4" Mar 14 08:49:37 crc kubenswrapper[4886]: I0314 08:49:37.929417 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f8100ac-c606-4eb3-afd6-07be9de44f42-logs\") pod \"horizon-66c6bc56b6-25jn4\" (UID: \"3f8100ac-c606-4eb3-afd6-07be9de44f42\") " pod="openstack/horizon-66c6bc56b6-25jn4" Mar 14 08:49:37 crc kubenswrapper[4886]: I0314 08:49:37.929458 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8100ac-c606-4eb3-afd6-07be9de44f42-combined-ca-bundle\") pod \"horizon-66c6bc56b6-25jn4\" (UID: \"3f8100ac-c606-4eb3-afd6-07be9de44f42\") " pod="openstack/horizon-66c6bc56b6-25jn4" Mar 14 08:49:37 crc kubenswrapper[4886]: I0314 08:49:37.929496 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3f8100ac-c606-4eb3-afd6-07be9de44f42-config-data\") pod \"horizon-66c6bc56b6-25jn4\" (UID: \"3f8100ac-c606-4eb3-afd6-07be9de44f42\") " pod="openstack/horizon-66c6bc56b6-25jn4" Mar 14 08:49:37 crc kubenswrapper[4886]: I0314 08:49:37.937813 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7769c88f5b-8gr9x"] Mar 14 08:49:37 crc kubenswrapper[4886]: I0314 08:49:37.939925 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7769c88f5b-8gr9x" Mar 14 08:49:37 crc kubenswrapper[4886]: I0314 08:49:37.961091 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7769c88f5b-8gr9x"] Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.032923 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3f8100ac-c606-4eb3-afd6-07be9de44f42-horizon-secret-key\") pod \"horizon-66c6bc56b6-25jn4\" (UID: \"3f8100ac-c606-4eb3-afd6-07be9de44f42\") " pod="openstack/horizon-66c6bc56b6-25jn4" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.032974 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvwmh\" (UniqueName: \"kubernetes.io/projected/3f8100ac-c606-4eb3-afd6-07be9de44f42-kube-api-access-xvwmh\") pod \"horizon-66c6bc56b6-25jn4\" (UID: \"3f8100ac-c606-4eb3-afd6-07be9de44f42\") " pod="openstack/horizon-66c6bc56b6-25jn4" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.033003 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgnw5\" (UniqueName: \"kubernetes.io/projected/46272ed5-a9f5-45eb-b9ba-58289ed822a7-kube-api-access-bgnw5\") pod \"horizon-7769c88f5b-8gr9x\" (UID: \"46272ed5-a9f5-45eb-b9ba-58289ed822a7\") " pod="openstack/horizon-7769c88f5b-8gr9x" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.033029 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46272ed5-a9f5-45eb-b9ba-58289ed822a7-config-data\") pod \"horizon-7769c88f5b-8gr9x\" (UID: \"46272ed5-a9f5-45eb-b9ba-58289ed822a7\") " pod="openstack/horizon-7769c88f5b-8gr9x" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.033049 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/46272ed5-a9f5-45eb-b9ba-58289ed822a7-scripts\") pod \"horizon-7769c88f5b-8gr9x\" (UID: \"46272ed5-a9f5-45eb-b9ba-58289ed822a7\") " pod="openstack/horizon-7769c88f5b-8gr9x" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.033091 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f8100ac-c606-4eb3-afd6-07be9de44f42-horizon-tls-certs\") pod \"horizon-66c6bc56b6-25jn4\" (UID: \"3f8100ac-c606-4eb3-afd6-07be9de44f42\") " pod="openstack/horizon-66c6bc56b6-25jn4" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.033126 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/46272ed5-a9f5-45eb-b9ba-58289ed822a7-horizon-secret-key\") pod \"horizon-7769c88f5b-8gr9x\" (UID: \"46272ed5-a9f5-45eb-b9ba-58289ed822a7\") " pod="openstack/horizon-7769c88f5b-8gr9x" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.033158 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f8100ac-c606-4eb3-afd6-07be9de44f42-scripts\") pod \"horizon-66c6bc56b6-25jn4\" (UID: \"3f8100ac-c606-4eb3-afd6-07be9de44f42\") " pod="openstack/horizon-66c6bc56b6-25jn4" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.033182 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f8100ac-c606-4eb3-afd6-07be9de44f42-logs\") pod \"horizon-66c6bc56b6-25jn4\" (UID: \"3f8100ac-c606-4eb3-afd6-07be9de44f42\") " pod="openstack/horizon-66c6bc56b6-25jn4" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.033209 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8100ac-c606-4eb3-afd6-07be9de44f42-combined-ca-bundle\") pod 
\"horizon-66c6bc56b6-25jn4\" (UID: \"3f8100ac-c606-4eb3-afd6-07be9de44f42\") " pod="openstack/horizon-66c6bc56b6-25jn4" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.033227 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/46272ed5-a9f5-45eb-b9ba-58289ed822a7-horizon-tls-certs\") pod \"horizon-7769c88f5b-8gr9x\" (UID: \"46272ed5-a9f5-45eb-b9ba-58289ed822a7\") " pod="openstack/horizon-7769c88f5b-8gr9x" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.033248 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3f8100ac-c606-4eb3-afd6-07be9de44f42-config-data\") pod \"horizon-66c6bc56b6-25jn4\" (UID: \"3f8100ac-c606-4eb3-afd6-07be9de44f42\") " pod="openstack/horizon-66c6bc56b6-25jn4" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.033267 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46272ed5-a9f5-45eb-b9ba-58289ed822a7-combined-ca-bundle\") pod \"horizon-7769c88f5b-8gr9x\" (UID: \"46272ed5-a9f5-45eb-b9ba-58289ed822a7\") " pod="openstack/horizon-7769c88f5b-8gr9x" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.033297 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46272ed5-a9f5-45eb-b9ba-58289ed822a7-logs\") pod \"horizon-7769c88f5b-8gr9x\" (UID: \"46272ed5-a9f5-45eb-b9ba-58289ed822a7\") " pod="openstack/horizon-7769c88f5b-8gr9x" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.035165 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3f8100ac-c606-4eb3-afd6-07be9de44f42-config-data\") pod \"horizon-66c6bc56b6-25jn4\" (UID: 
\"3f8100ac-c606-4eb3-afd6-07be9de44f42\") " pod="openstack/horizon-66c6bc56b6-25jn4" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.035747 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f8100ac-c606-4eb3-afd6-07be9de44f42-scripts\") pod \"horizon-66c6bc56b6-25jn4\" (UID: \"3f8100ac-c606-4eb3-afd6-07be9de44f42\") " pod="openstack/horizon-66c6bc56b6-25jn4" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.036033 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f8100ac-c606-4eb3-afd6-07be9de44f42-logs\") pod \"horizon-66c6bc56b6-25jn4\" (UID: \"3f8100ac-c606-4eb3-afd6-07be9de44f42\") " pod="openstack/horizon-66c6bc56b6-25jn4" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.039470 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3f8100ac-c606-4eb3-afd6-07be9de44f42-horizon-secret-key\") pod \"horizon-66c6bc56b6-25jn4\" (UID: \"3f8100ac-c606-4eb3-afd6-07be9de44f42\") " pod="openstack/horizon-66c6bc56b6-25jn4" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.040295 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8100ac-c606-4eb3-afd6-07be9de44f42-combined-ca-bundle\") pod \"horizon-66c6bc56b6-25jn4\" (UID: \"3f8100ac-c606-4eb3-afd6-07be9de44f42\") " pod="openstack/horizon-66c6bc56b6-25jn4" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.041744 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f8100ac-c606-4eb3-afd6-07be9de44f42-horizon-tls-certs\") pod \"horizon-66c6bc56b6-25jn4\" (UID: \"3f8100ac-c606-4eb3-afd6-07be9de44f42\") " pod="openstack/horizon-66c6bc56b6-25jn4" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.072662 4886 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvwmh\" (UniqueName: \"kubernetes.io/projected/3f8100ac-c606-4eb3-afd6-07be9de44f42-kube-api-access-xvwmh\") pod \"horizon-66c6bc56b6-25jn4\" (UID: \"3f8100ac-c606-4eb3-afd6-07be9de44f42\") " pod="openstack/horizon-66c6bc56b6-25jn4" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.134888 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgnw5\" (UniqueName: \"kubernetes.io/projected/46272ed5-a9f5-45eb-b9ba-58289ed822a7-kube-api-access-bgnw5\") pod \"horizon-7769c88f5b-8gr9x\" (UID: \"46272ed5-a9f5-45eb-b9ba-58289ed822a7\") " pod="openstack/horizon-7769c88f5b-8gr9x" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.134961 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46272ed5-a9f5-45eb-b9ba-58289ed822a7-config-data\") pod \"horizon-7769c88f5b-8gr9x\" (UID: \"46272ed5-a9f5-45eb-b9ba-58289ed822a7\") " pod="openstack/horizon-7769c88f5b-8gr9x" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.134994 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46272ed5-a9f5-45eb-b9ba-58289ed822a7-scripts\") pod \"horizon-7769c88f5b-8gr9x\" (UID: \"46272ed5-a9f5-45eb-b9ba-58289ed822a7\") " pod="openstack/horizon-7769c88f5b-8gr9x" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.135057 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/46272ed5-a9f5-45eb-b9ba-58289ed822a7-horizon-secret-key\") pod \"horizon-7769c88f5b-8gr9x\" (UID: \"46272ed5-a9f5-45eb-b9ba-58289ed822a7\") " pod="openstack/horizon-7769c88f5b-8gr9x" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.135172 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/46272ed5-a9f5-45eb-b9ba-58289ed822a7-horizon-tls-certs\") pod \"horizon-7769c88f5b-8gr9x\" (UID: \"46272ed5-a9f5-45eb-b9ba-58289ed822a7\") " pod="openstack/horizon-7769c88f5b-8gr9x" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.135579 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46272ed5-a9f5-45eb-b9ba-58289ed822a7-combined-ca-bundle\") pod \"horizon-7769c88f5b-8gr9x\" (UID: \"46272ed5-a9f5-45eb-b9ba-58289ed822a7\") " pod="openstack/horizon-7769c88f5b-8gr9x" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.135754 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46272ed5-a9f5-45eb-b9ba-58289ed822a7-logs\") pod \"horizon-7769c88f5b-8gr9x\" (UID: \"46272ed5-a9f5-45eb-b9ba-58289ed822a7\") " pod="openstack/horizon-7769c88f5b-8gr9x" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.136542 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46272ed5-a9f5-45eb-b9ba-58289ed822a7-logs\") pod \"horizon-7769c88f5b-8gr9x\" (UID: \"46272ed5-a9f5-45eb-b9ba-58289ed822a7\") " pod="openstack/horizon-7769c88f5b-8gr9x" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.136551 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/46272ed5-a9f5-45eb-b9ba-58289ed822a7-config-data\") pod \"horizon-7769c88f5b-8gr9x\" (UID: \"46272ed5-a9f5-45eb-b9ba-58289ed822a7\") " pod="openstack/horizon-7769c88f5b-8gr9x" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.137868 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46272ed5-a9f5-45eb-b9ba-58289ed822a7-scripts\") pod \"horizon-7769c88f5b-8gr9x\" (UID: 
\"46272ed5-a9f5-45eb-b9ba-58289ed822a7\") " pod="openstack/horizon-7769c88f5b-8gr9x" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.140641 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/46272ed5-a9f5-45eb-b9ba-58289ed822a7-horizon-tls-certs\") pod \"horizon-7769c88f5b-8gr9x\" (UID: \"46272ed5-a9f5-45eb-b9ba-58289ed822a7\") " pod="openstack/horizon-7769c88f5b-8gr9x" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.141333 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/46272ed5-a9f5-45eb-b9ba-58289ed822a7-horizon-secret-key\") pod \"horizon-7769c88f5b-8gr9x\" (UID: \"46272ed5-a9f5-45eb-b9ba-58289ed822a7\") " pod="openstack/horizon-7769c88f5b-8gr9x" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.141874 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46272ed5-a9f5-45eb-b9ba-58289ed822a7-combined-ca-bundle\") pod \"horizon-7769c88f5b-8gr9x\" (UID: \"46272ed5-a9f5-45eb-b9ba-58289ed822a7\") " pod="openstack/horizon-7769c88f5b-8gr9x" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.143873 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-66c6bc56b6-25jn4" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.158031 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgnw5\" (UniqueName: \"kubernetes.io/projected/46272ed5-a9f5-45eb-b9ba-58289ed822a7-kube-api-access-bgnw5\") pod \"horizon-7769c88f5b-8gr9x\" (UID: \"46272ed5-a9f5-45eb-b9ba-58289ed822a7\") " pod="openstack/horizon-7769c88f5b-8gr9x" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.265810 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7769c88f5b-8gr9x" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.271812 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.280345 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.310160 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 14 08:49:38 crc kubenswrapper[4886]: I0314 08:49:38.633531 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="e4c90616-a536-4fab-911b-2fd02c52ef9d" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.158:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 08:49:39 crc kubenswrapper[4886]: I0314 08:49:39.432877 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57c957c4ff-m899h" Mar 14 08:49:39 crc kubenswrapper[4886]: I0314 08:49:39.514389 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-kz2x6"] Mar 14 08:49:39 crc kubenswrapper[4886]: I0314 08:49:39.514601 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" podUID="1b496ccf-4298-4759-aacf-e115101cb90d" containerName="dnsmasq-dns" containerID="cri-o://1b86dd8377d7bbc4a602b82fc6e615137da41ed100266616b558bb02b1da6146" gracePeriod=10 Mar 14 08:49:39 crc kubenswrapper[4886]: I0314 08:49:39.881133 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="e4c90616-a536-4fab-911b-2fd02c52ef9d" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.158:9322/\": read tcp 
10.217.0.2:33590->10.217.0.158:9322: read: connection reset by peer" Mar 14 08:49:40 crc kubenswrapper[4886]: I0314 08:49:40.331897 4886 generic.go:334] "Generic (PLEG): container finished" podID="1b496ccf-4298-4759-aacf-e115101cb90d" containerID="1b86dd8377d7bbc4a602b82fc6e615137da41ed100266616b558bb02b1da6146" exitCode=0 Mar 14 08:49:40 crc kubenswrapper[4886]: I0314 08:49:40.331975 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" event={"ID":"1b496ccf-4298-4759-aacf-e115101cb90d","Type":"ContainerDied","Data":"1b86dd8377d7bbc4a602b82fc6e615137da41ed100266616b558bb02b1da6146"} Mar 14 08:49:40 crc kubenswrapper[4886]: I0314 08:49:40.336722 4886 generic.go:334] "Generic (PLEG): container finished" podID="e4c90616-a536-4fab-911b-2fd02c52ef9d" containerID="b7605021f0f98c16c58d134e47b74276c8417dbb6ed58cc20dce2d6d34263450" exitCode=0 Mar 14 08:49:40 crc kubenswrapper[4886]: I0314 08:49:40.336768 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"e4c90616-a536-4fab-911b-2fd02c52ef9d","Type":"ContainerDied","Data":"b7605021f0f98c16c58d134e47b74276c8417dbb6ed58cc20dce2d6d34263450"} Mar 14 08:49:42 crc kubenswrapper[4886]: I0314 08:49:42.676184 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4csbp" Mar 14 08:49:42 crc kubenswrapper[4886]: I0314 08:49:42.865642 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-combined-ca-bundle\") pod \"050cc40a-837e-4bc3-9274-726e234207a4\" (UID: \"050cc40a-837e-4bc3-9274-726e234207a4\") " Mar 14 08:49:42 crc kubenswrapper[4886]: I0314 08:49:42.865817 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhncp\" (UniqueName: \"kubernetes.io/projected/050cc40a-837e-4bc3-9274-726e234207a4-kube-api-access-lhncp\") pod \"050cc40a-837e-4bc3-9274-726e234207a4\" (UID: \"050cc40a-837e-4bc3-9274-726e234207a4\") " Mar 14 08:49:42 crc kubenswrapper[4886]: I0314 08:49:42.865849 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-credential-keys\") pod \"050cc40a-837e-4bc3-9274-726e234207a4\" (UID: \"050cc40a-837e-4bc3-9274-726e234207a4\") " Mar 14 08:49:42 crc kubenswrapper[4886]: I0314 08:49:42.865906 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-config-data\") pod \"050cc40a-837e-4bc3-9274-726e234207a4\" (UID: \"050cc40a-837e-4bc3-9274-726e234207a4\") " Mar 14 08:49:42 crc kubenswrapper[4886]: I0314 08:49:42.865928 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-fernet-keys\") pod \"050cc40a-837e-4bc3-9274-726e234207a4\" (UID: \"050cc40a-837e-4bc3-9274-726e234207a4\") " Mar 14 08:49:42 crc kubenswrapper[4886]: I0314 08:49:42.866172 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-scripts\") pod \"050cc40a-837e-4bc3-9274-726e234207a4\" (UID: \"050cc40a-837e-4bc3-9274-726e234207a4\") " Mar 14 08:49:42 crc kubenswrapper[4886]: I0314 08:49:42.874401 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/050cc40a-837e-4bc3-9274-726e234207a4-kube-api-access-lhncp" (OuterVolumeSpecName: "kube-api-access-lhncp") pod "050cc40a-837e-4bc3-9274-726e234207a4" (UID: "050cc40a-837e-4bc3-9274-726e234207a4"). InnerVolumeSpecName "kube-api-access-lhncp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:49:42 crc kubenswrapper[4886]: I0314 08:49:42.875008 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "050cc40a-837e-4bc3-9274-726e234207a4" (UID: "050cc40a-837e-4bc3-9274-726e234207a4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:49:42 crc kubenswrapper[4886]: I0314 08:49:42.883056 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-scripts" (OuterVolumeSpecName: "scripts") pod "050cc40a-837e-4bc3-9274-726e234207a4" (UID: "050cc40a-837e-4bc3-9274-726e234207a4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:49:42 crc kubenswrapper[4886]: I0314 08:49:42.894268 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "050cc40a-837e-4bc3-9274-726e234207a4" (UID: "050cc40a-837e-4bc3-9274-726e234207a4"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:49:42 crc kubenswrapper[4886]: I0314 08:49:42.908295 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "050cc40a-837e-4bc3-9274-726e234207a4" (UID: "050cc40a-837e-4bc3-9274-726e234207a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:49:42 crc kubenswrapper[4886]: I0314 08:49:42.909203 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-config-data" (OuterVolumeSpecName: "config-data") pod "050cc40a-837e-4bc3-9274-726e234207a4" (UID: "050cc40a-837e-4bc3-9274-726e234207a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:49:42 crc kubenswrapper[4886]: I0314 08:49:42.968362 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:42 crc kubenswrapper[4886]: I0314 08:49:42.968398 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:42 crc kubenswrapper[4886]: I0314 08:49:42.968416 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhncp\" (UniqueName: \"kubernetes.io/projected/050cc40a-837e-4bc3-9274-726e234207a4-kube-api-access-lhncp\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:42 crc kubenswrapper[4886]: I0314 08:49:42.968430 4886 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-credential-keys\") on node \"crc\" DevicePath 
\"\"" Mar 14 08:49:42 crc kubenswrapper[4886]: I0314 08:49:42.968443 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:42 crc kubenswrapper[4886]: I0314 08:49:42.968529 4886 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/050cc40a-837e-4bc3-9274-726e234207a4-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:43 crc kubenswrapper[4886]: I0314 08:49:43.380649 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4csbp" event={"ID":"050cc40a-837e-4bc3-9274-726e234207a4","Type":"ContainerDied","Data":"01f852b6a81dcc510fbcfe68f61eaac2dc9a0567dbf909c361abfb15d6bea59c"} Mar 14 08:49:43 crc kubenswrapper[4886]: I0314 08:49:43.380712 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01f852b6a81dcc510fbcfe68f61eaac2dc9a0567dbf909c361abfb15d6bea59c" Mar 14 08:49:43 crc kubenswrapper[4886]: I0314 08:49:43.380736 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4csbp" Mar 14 08:49:43 crc kubenswrapper[4886]: I0314 08:49:43.592968 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="e4c90616-a536-4fab-911b-2fd02c52ef9d" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.158:9322/\": dial tcp 10.217.0.158:9322: connect: connection refused" Mar 14 08:49:43 crc kubenswrapper[4886]: I0314 08:49:43.864167 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-4csbp"] Mar 14 08:49:43 crc kubenswrapper[4886]: I0314 08:49:43.872238 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-4csbp"] Mar 14 08:49:43 crc kubenswrapper[4886]: I0314 08:49:43.974852 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-vwjh5"] Mar 14 08:49:43 crc kubenswrapper[4886]: E0314 08:49:43.975702 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="050cc40a-837e-4bc3-9274-726e234207a4" containerName="keystone-bootstrap" Mar 14 08:49:43 crc kubenswrapper[4886]: I0314 08:49:43.975738 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="050cc40a-837e-4bc3-9274-726e234207a4" containerName="keystone-bootstrap" Mar 14 08:49:43 crc kubenswrapper[4886]: I0314 08:49:43.976059 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="050cc40a-837e-4bc3-9274-726e234207a4" containerName="keystone-bootstrap" Mar 14 08:49:43 crc kubenswrapper[4886]: I0314 08:49:43.977274 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vwjh5" Mar 14 08:49:43 crc kubenswrapper[4886]: I0314 08:49:43.980316 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 14 08:49:43 crc kubenswrapper[4886]: I0314 08:49:43.980375 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 14 08:49:43 crc kubenswrapper[4886]: I0314 08:49:43.980579 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2prkb" Mar 14 08:49:43 crc kubenswrapper[4886]: I0314 08:49:43.980829 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 14 08:49:43 crc kubenswrapper[4886]: I0314 08:49:43.982815 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 14 08:49:43 crc kubenswrapper[4886]: I0314 08:49:43.989503 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vwjh5"] Mar 14 08:49:44 crc kubenswrapper[4886]: I0314 08:49:44.106453 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-combined-ca-bundle\") pod \"keystone-bootstrap-vwjh5\" (UID: \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\") " pod="openstack/keystone-bootstrap-vwjh5" Mar 14 08:49:44 crc kubenswrapper[4886]: I0314 08:49:44.106646 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-config-data\") pod \"keystone-bootstrap-vwjh5\" (UID: \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\") " pod="openstack/keystone-bootstrap-vwjh5" Mar 14 08:49:44 crc kubenswrapper[4886]: I0314 08:49:44.106754 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-fernet-keys\") pod \"keystone-bootstrap-vwjh5\" (UID: \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\") " pod="openstack/keystone-bootstrap-vwjh5" Mar 14 08:49:44 crc kubenswrapper[4886]: I0314 08:49:44.106806 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-scripts\") pod \"keystone-bootstrap-vwjh5\" (UID: \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\") " pod="openstack/keystone-bootstrap-vwjh5" Mar 14 08:49:44 crc kubenswrapper[4886]: I0314 08:49:44.106835 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwp58\" (UniqueName: \"kubernetes.io/projected/27d82e0e-40f0-45e0-b92f-df553a24bc5b-kube-api-access-rwp58\") pod \"keystone-bootstrap-vwjh5\" (UID: \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\") " pod="openstack/keystone-bootstrap-vwjh5" Mar 14 08:49:44 crc kubenswrapper[4886]: I0314 08:49:44.106863 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-credential-keys\") pod \"keystone-bootstrap-vwjh5\" (UID: \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\") " pod="openstack/keystone-bootstrap-vwjh5" Mar 14 08:49:44 crc kubenswrapper[4886]: I0314 08:49:44.208068 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-scripts\") pod \"keystone-bootstrap-vwjh5\" (UID: \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\") " pod="openstack/keystone-bootstrap-vwjh5" Mar 14 08:49:44 crc kubenswrapper[4886]: I0314 08:49:44.208159 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwp58\" (UniqueName: 
\"kubernetes.io/projected/27d82e0e-40f0-45e0-b92f-df553a24bc5b-kube-api-access-rwp58\") pod \"keystone-bootstrap-vwjh5\" (UID: \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\") " pod="openstack/keystone-bootstrap-vwjh5" Mar 14 08:49:44 crc kubenswrapper[4886]: I0314 08:49:44.208188 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-credential-keys\") pod \"keystone-bootstrap-vwjh5\" (UID: \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\") " pod="openstack/keystone-bootstrap-vwjh5" Mar 14 08:49:44 crc kubenswrapper[4886]: I0314 08:49:44.208239 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-combined-ca-bundle\") pod \"keystone-bootstrap-vwjh5\" (UID: \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\") " pod="openstack/keystone-bootstrap-vwjh5" Mar 14 08:49:44 crc kubenswrapper[4886]: I0314 08:49:44.208334 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-config-data\") pod \"keystone-bootstrap-vwjh5\" (UID: \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\") " pod="openstack/keystone-bootstrap-vwjh5" Mar 14 08:49:44 crc kubenswrapper[4886]: I0314 08:49:44.208396 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-fernet-keys\") pod \"keystone-bootstrap-vwjh5\" (UID: \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\") " pod="openstack/keystone-bootstrap-vwjh5" Mar 14 08:49:44 crc kubenswrapper[4886]: I0314 08:49:44.214882 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-credential-keys\") pod 
\"keystone-bootstrap-vwjh5\" (UID: \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\") " pod="openstack/keystone-bootstrap-vwjh5" Mar 14 08:49:44 crc kubenswrapper[4886]: I0314 08:49:44.215169 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-config-data\") pod \"keystone-bootstrap-vwjh5\" (UID: \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\") " pod="openstack/keystone-bootstrap-vwjh5" Mar 14 08:49:44 crc kubenswrapper[4886]: I0314 08:49:44.215986 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-fernet-keys\") pod \"keystone-bootstrap-vwjh5\" (UID: \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\") " pod="openstack/keystone-bootstrap-vwjh5" Mar 14 08:49:44 crc kubenswrapper[4886]: I0314 08:49:44.216267 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-combined-ca-bundle\") pod \"keystone-bootstrap-vwjh5\" (UID: \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\") " pod="openstack/keystone-bootstrap-vwjh5" Mar 14 08:49:44 crc kubenswrapper[4886]: I0314 08:49:44.216919 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-scripts\") pod \"keystone-bootstrap-vwjh5\" (UID: \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\") " pod="openstack/keystone-bootstrap-vwjh5" Mar 14 08:49:44 crc kubenswrapper[4886]: I0314 08:49:44.237194 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwp58\" (UniqueName: \"kubernetes.io/projected/27d82e0e-40f0-45e0-b92f-df553a24bc5b-kube-api-access-rwp58\") pod \"keystone-bootstrap-vwjh5\" (UID: \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\") " pod="openstack/keystone-bootstrap-vwjh5" Mar 14 08:49:44 crc 
kubenswrapper[4886]: I0314 08:49:44.303765 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" podUID="1b496ccf-4298-4759-aacf-e115101cb90d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.139:5353: connect: connection refused" Mar 14 08:49:44 crc kubenswrapper[4886]: I0314 08:49:44.306320 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vwjh5" Mar 14 08:49:45 crc kubenswrapper[4886]: I0314 08:49:45.439242 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="050cc40a-837e-4bc3-9274-726e234207a4" path="/var/lib/kubelet/pods/050cc40a-837e-4bc3-9274-726e234207a4/volumes" Mar 14 08:49:48 crc kubenswrapper[4886]: I0314 08:49:48.587584 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="e4c90616-a536-4fab-911b-2fd02c52ef9d" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.158:9322/\": dial tcp 10.217.0.158:9322: connect: connection refused" Mar 14 08:49:49 crc kubenswrapper[4886]: I0314 08:49:49.303639 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" podUID="1b496ccf-4298-4759-aacf-e115101cb90d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.139:5353: connect: connection refused" Mar 14 08:49:50 crc kubenswrapper[4886]: E0314 08:49:50.047506 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 14 08:49:50 crc kubenswrapper[4886]: E0314 08:49:50.047847 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n66h667h7h4h568h659h54dh77h569h577h578h79h7fh58ch96h6fh68h75h58ch649h59dh655h79h5c6h589h65h5b6hdh5b6h649h7h5f8q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-msfm5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-57f8bb66ff-scjrm_openstack(a3b9e706-7b47-409b-91a9-8457dfa315f1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 08:49:50 crc kubenswrapper[4886]: E0314 
08:49:50.051468 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-57f8bb66ff-scjrm" podUID="a3b9e706-7b47-409b-91a9-8457dfa315f1" Mar 14 08:49:50 crc kubenswrapper[4886]: E0314 08:49:50.106453 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 14 08:49:50 crc kubenswrapper[4886]: E0314 08:49:50.106762 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n85hfch689h8bh655h7fh78h549h55bh54h5d4h675h57h59ch655h5b9h57h577hbh8fh5ch5c9h685h68h695h58ch559h656h655h6ch5bh566q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-25srx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6dfdc8544f-8k2lm_openstack(34551f6d-235f-4fee-939a-450195242f3e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 08:49:50 crc kubenswrapper[4886]: E0314 
08:49:50.108953 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6dfdc8544f-8k2lm" podUID="34551f6d-235f-4fee-939a-450195242f3e" Mar 14 08:49:50 crc kubenswrapper[4886]: E0314 08:49:50.581901 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Mar 14 08:49:50 crc kubenswrapper[4886]: E0314 08:49:50.582548 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n546hd5h576h549h698h95hc8h5f9hd8h544h586h56bh645h5c7hd7h5fdh5dh574h5bh98h556h566h6h65fh59bhb5hbfh5bdh544h648h5cfh648q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nj2b4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(a9eb9137-a021-4ea6-a4a4-871cf81af732): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 08:49:52 crc kubenswrapper[4886]: I0314 08:49:52.510365 4886 generic.go:334] "Generic (PLEG): container finished" podID="386f7c41-cb62-4ff1-bef7-11e4e8b14707" containerID="91110068c4fd7023df65e5b9e10f4ea94405c81cb5051853db689bc531c52836" exitCode=0 Mar 14 08:49:52 crc kubenswrapper[4886]: I0314 08:49:52.510593 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-sbjqr" event={"ID":"386f7c41-cb62-4ff1-bef7-11e4e8b14707","Type":"ContainerDied","Data":"91110068c4fd7023df65e5b9e10f4ea94405c81cb5051853db689bc531c52836"} Mar 14 08:49:56 crc kubenswrapper[4886]: I0314 08:49:56.065669 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:49:56 crc kubenswrapper[4886]: I0314 08:49:56.066297 4886 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.568864 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57f8bb66ff-scjrm" event={"ID":"a3b9e706-7b47-409b-91a9-8457dfa315f1","Type":"ContainerDied","Data":"6b637c83a1f57e6337fbd21ae4f22a62d76cc971600acdeadb84b9246fcaa69e"} Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.569296 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b637c83a1f57e6337fbd21ae4f22a62d76cc971600acdeadb84b9246fcaa69e" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.571519 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-sbjqr" event={"ID":"386f7c41-cb62-4ff1-bef7-11e4e8b14707","Type":"ContainerDied","Data":"21d7519fcfc54ce2b294bb7c245165c5cdcb7b29f1ac60ba9b0ce9511baef2c4"} Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.571553 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21d7519fcfc54ce2b294bb7c245165c5cdcb7b29f1ac60ba9b0ce9511baef2c4" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.574378 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"e4c90616-a536-4fab-911b-2fd02c52ef9d","Type":"ContainerDied","Data":"c94b649e1f586fba8e8066b671afc9d627c4903cdf4bcdaae10f4ef90f57688a"} Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.574404 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c94b649e1f586fba8e8066b671afc9d627c4903cdf4bcdaae10f4ef90f57688a" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.575683 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-6dfdc8544f-8k2lm" event={"ID":"34551f6d-235f-4fee-939a-450195242f3e","Type":"ContainerDied","Data":"8288528be5d08cd63b86dbed38f8189a24de0c59c4b509abb10d876e480a094e"} Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.575708 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8288528be5d08cd63b86dbed38f8189a24de0c59c4b509abb10d876e480a094e" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.577694 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" event={"ID":"1b496ccf-4298-4759-aacf-e115101cb90d","Type":"ContainerDied","Data":"d5bca59bbdfa864a46d4b7c709a26483c8f0b99dd6683d68c6852891d1a93d37"} Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.577719 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5bca59bbdfa864a46d4b7c709a26483c8f0b99dd6683d68c6852891d1a93d37" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.588848 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="e4c90616-a536-4fab-911b-2fd02c52ef9d" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.158:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.663586 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.672053 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-dns-swift-storage-0\") pod \"1b496ccf-4298-4759-aacf-e115101cb90d\" (UID: \"1b496ccf-4298-4759-aacf-e115101cb90d\") " Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.672254 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-ovsdbserver-sb\") pod \"1b496ccf-4298-4759-aacf-e115101cb90d\" (UID: \"1b496ccf-4298-4759-aacf-e115101cb90d\") " Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.672322 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-dns-svc\") pod \"1b496ccf-4298-4759-aacf-e115101cb90d\" (UID: \"1b496ccf-4298-4759-aacf-e115101cb90d\") " Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.672350 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qszf8\" (UniqueName: \"kubernetes.io/projected/1b496ccf-4298-4759-aacf-e115101cb90d-kube-api-access-qszf8\") pod \"1b496ccf-4298-4759-aacf-e115101cb90d\" (UID: \"1b496ccf-4298-4759-aacf-e115101cb90d\") " Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.672370 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-config\") pod \"1b496ccf-4298-4759-aacf-e115101cb90d\" (UID: \"1b496ccf-4298-4759-aacf-e115101cb90d\") " Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.672445 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-ovsdbserver-nb\") pod \"1b496ccf-4298-4759-aacf-e115101cb90d\" (UID: \"1b496ccf-4298-4759-aacf-e115101cb90d\") " Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.673156 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.680792 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6dfdc8544f-8k2lm" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.681861 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b496ccf-4298-4759-aacf-e115101cb90d-kube-api-access-qszf8" (OuterVolumeSpecName: "kube-api-access-qszf8") pod "1b496ccf-4298-4759-aacf-e115101cb90d" (UID: "1b496ccf-4298-4759-aacf-e115101cb90d"). InnerVolumeSpecName "kube-api-access-qszf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.739737 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1b496ccf-4298-4759-aacf-e115101cb90d" (UID: "1b496ccf-4298-4759-aacf-e115101cb90d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.757764 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1b496ccf-4298-4759-aacf-e115101cb90d" (UID: "1b496ccf-4298-4759-aacf-e115101cb90d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.771702 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1b496ccf-4298-4759-aacf-e115101cb90d" (UID: "1b496ccf-4298-4759-aacf-e115101cb90d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.773362 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34551f6d-235f-4fee-939a-450195242f3e-logs\") pod \"34551f6d-235f-4fee-939a-450195242f3e\" (UID: \"34551f6d-235f-4fee-939a-450195242f3e\") " Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.773401 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4c90616-a536-4fab-911b-2fd02c52ef9d-logs\") pod \"e4c90616-a536-4fab-911b-2fd02c52ef9d\" (UID: \"e4c90616-a536-4fab-911b-2fd02c52ef9d\") " Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.773445 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4c90616-a536-4fab-911b-2fd02c52ef9d-config-data\") pod \"e4c90616-a536-4fab-911b-2fd02c52ef9d\" (UID: \"e4c90616-a536-4fab-911b-2fd02c52ef9d\") " Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.773464 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/34551f6d-235f-4fee-939a-450195242f3e-scripts\") pod \"34551f6d-235f-4fee-939a-450195242f3e\" (UID: \"34551f6d-235f-4fee-939a-450195242f3e\") " Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.773497 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e4c90616-a536-4fab-911b-2fd02c52ef9d-custom-prometheus-ca\") pod \"e4c90616-a536-4fab-911b-2fd02c52ef9d\" (UID: \"e4c90616-a536-4fab-911b-2fd02c52ef9d\") " Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.773530 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/34551f6d-235f-4fee-939a-450195242f3e-config-data\") pod \"34551f6d-235f-4fee-939a-450195242f3e\" (UID: \"34551f6d-235f-4fee-939a-450195242f3e\") " Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.773563 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/34551f6d-235f-4fee-939a-450195242f3e-horizon-secret-key\") pod \"34551f6d-235f-4fee-939a-450195242f3e\" (UID: \"34551f6d-235f-4fee-939a-450195242f3e\") " Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.773586 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c90616-a536-4fab-911b-2fd02c52ef9d-combined-ca-bundle\") pod \"e4c90616-a536-4fab-911b-2fd02c52ef9d\" (UID: \"e4c90616-a536-4fab-911b-2fd02c52ef9d\") " Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.773616 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25srx\" (UniqueName: \"kubernetes.io/projected/34551f6d-235f-4fee-939a-450195242f3e-kube-api-access-25srx\") pod \"34551f6d-235f-4fee-939a-450195242f3e\" (UID: \"34551f6d-235f-4fee-939a-450195242f3e\") " Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.773666 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzdsp\" (UniqueName: \"kubernetes.io/projected/e4c90616-a536-4fab-911b-2fd02c52ef9d-kube-api-access-zzdsp\") pod \"e4c90616-a536-4fab-911b-2fd02c52ef9d\" (UID: 
\"e4c90616-a536-4fab-911b-2fd02c52ef9d\") " Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.773704 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34551f6d-235f-4fee-939a-450195242f3e-logs" (OuterVolumeSpecName: "logs") pod "34551f6d-235f-4fee-939a-450195242f3e" (UID: "34551f6d-235f-4fee-939a-450195242f3e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.773750 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4c90616-a536-4fab-911b-2fd02c52ef9d-logs" (OuterVolumeSpecName: "logs") pod "e4c90616-a536-4fab-911b-2fd02c52ef9d" (UID: "e4c90616-a536-4fab-911b-2fd02c52ef9d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.773998 4886 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.774019 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34551f6d-235f-4fee-939a-450195242f3e-logs\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.774030 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4c90616-a536-4fab-911b-2fd02c52ef9d-logs\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.774042 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.774050 4886 
reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.774059 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qszf8\" (UniqueName: \"kubernetes.io/projected/1b496ccf-4298-4759-aacf-e115101cb90d-kube-api-access-qszf8\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.774901 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34551f6d-235f-4fee-939a-450195242f3e-config-data" (OuterVolumeSpecName: "config-data") pod "34551f6d-235f-4fee-939a-450195242f3e" (UID: "34551f6d-235f-4fee-939a-450195242f3e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.775104 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34551f6d-235f-4fee-939a-450195242f3e-scripts" (OuterVolumeSpecName: "scripts") pod "34551f6d-235f-4fee-939a-450195242f3e" (UID: "34551f6d-235f-4fee-939a-450195242f3e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.781272 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34551f6d-235f-4fee-939a-450195242f3e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "34551f6d-235f-4fee-939a-450195242f3e" (UID: "34551f6d-235f-4fee-939a-450195242f3e"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.782402 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34551f6d-235f-4fee-939a-450195242f3e-kube-api-access-25srx" (OuterVolumeSpecName: "kube-api-access-25srx") pod "34551f6d-235f-4fee-939a-450195242f3e" (UID: "34551f6d-235f-4fee-939a-450195242f3e"). InnerVolumeSpecName "kube-api-access-25srx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.783452 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4c90616-a536-4fab-911b-2fd02c52ef9d-kube-api-access-zzdsp" (OuterVolumeSpecName: "kube-api-access-zzdsp") pod "e4c90616-a536-4fab-911b-2fd02c52ef9d" (UID: "e4c90616-a536-4fab-911b-2fd02c52ef9d"). InnerVolumeSpecName "kube-api-access-zzdsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.793590 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-config" (OuterVolumeSpecName: "config") pod "1b496ccf-4298-4759-aacf-e115101cb90d" (UID: "1b496ccf-4298-4759-aacf-e115101cb90d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.793920 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1b496ccf-4298-4759-aacf-e115101cb90d" (UID: "1b496ccf-4298-4759-aacf-e115101cb90d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.800866 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c90616-a536-4fab-911b-2fd02c52ef9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4c90616-a536-4fab-911b-2fd02c52ef9d" (UID: "e4c90616-a536-4fab-911b-2fd02c52ef9d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.811833 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c90616-a536-4fab-911b-2fd02c52ef9d-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "e4c90616-a536-4fab-911b-2fd02c52ef9d" (UID: "e4c90616-a536-4fab-911b-2fd02c52ef9d"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.822145 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57f8bb66ff-scjrm" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.828821 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-sbjqr" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.833550 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c90616-a536-4fab-911b-2fd02c52ef9d-config-data" (OuterVolumeSpecName: "config-data") pod "e4c90616-a536-4fab-911b-2fd02c52ef9d" (UID: "e4c90616-a536-4fab-911b-2fd02c52ef9d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.874702 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3b9e706-7b47-409b-91a9-8457dfa315f1-config-data\") pod \"a3b9e706-7b47-409b-91a9-8457dfa315f1\" (UID: \"a3b9e706-7b47-409b-91a9-8457dfa315f1\") " Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.874766 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3b9e706-7b47-409b-91a9-8457dfa315f1-logs\") pod \"a3b9e706-7b47-409b-91a9-8457dfa315f1\" (UID: \"a3b9e706-7b47-409b-91a9-8457dfa315f1\") " Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.874816 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlsqs\" (UniqueName: \"kubernetes.io/projected/386f7c41-cb62-4ff1-bef7-11e4e8b14707-kube-api-access-jlsqs\") pod \"386f7c41-cb62-4ff1-bef7-11e4e8b14707\" (UID: \"386f7c41-cb62-4ff1-bef7-11e4e8b14707\") " Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.874873 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3b9e706-7b47-409b-91a9-8457dfa315f1-scripts\") pod \"a3b9e706-7b47-409b-91a9-8457dfa315f1\" (UID: \"a3b9e706-7b47-409b-91a9-8457dfa315f1\") " Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.874929 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msfm5\" (UniqueName: \"kubernetes.io/projected/a3b9e706-7b47-409b-91a9-8457dfa315f1-kube-api-access-msfm5\") pod \"a3b9e706-7b47-409b-91a9-8457dfa315f1\" (UID: \"a3b9e706-7b47-409b-91a9-8457dfa315f1\") " Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.875013 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/a3b9e706-7b47-409b-91a9-8457dfa315f1-horizon-secret-key\") pod \"a3b9e706-7b47-409b-91a9-8457dfa315f1\" (UID: \"a3b9e706-7b47-409b-91a9-8457dfa315f1\") " Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.875055 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/386f7c41-cb62-4ff1-bef7-11e4e8b14707-combined-ca-bundle\") pod \"386f7c41-cb62-4ff1-bef7-11e4e8b14707\" (UID: \"386f7c41-cb62-4ff1-bef7-11e4e8b14707\") " Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.875077 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/386f7c41-cb62-4ff1-bef7-11e4e8b14707-config\") pod \"386f7c41-cb62-4ff1-bef7-11e4e8b14707\" (UID: \"386f7c41-cb62-4ff1-bef7-11e4e8b14707\") " Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.875192 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3b9e706-7b47-409b-91a9-8457dfa315f1-logs" (OuterVolumeSpecName: "logs") pod "a3b9e706-7b47-409b-91a9-8457dfa315f1" (UID: "a3b9e706-7b47-409b-91a9-8457dfa315f1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.875406 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3b9e706-7b47-409b-91a9-8457dfa315f1-config-data" (OuterVolumeSpecName: "config-data") pod "a3b9e706-7b47-409b-91a9-8457dfa315f1" (UID: "a3b9e706-7b47-409b-91a9-8457dfa315f1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.875674 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.875694 4886 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/34551f6d-235f-4fee-939a-450195242f3e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.875705 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c90616-a536-4fab-911b-2fd02c52ef9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.875714 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3b9e706-7b47-409b-91a9-8457dfa315f1-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.875724 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25srx\" (UniqueName: \"kubernetes.io/projected/34551f6d-235f-4fee-939a-450195242f3e-kube-api-access-25srx\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.875732 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3b9e706-7b47-409b-91a9-8457dfa315f1-logs\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.875740 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b496ccf-4298-4759-aacf-e115101cb90d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.875748 4886 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzdsp\" (UniqueName: \"kubernetes.io/projected/e4c90616-a536-4fab-911b-2fd02c52ef9d-kube-api-access-zzdsp\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.875757 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4c90616-a536-4fab-911b-2fd02c52ef9d-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.875764 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/34551f6d-235f-4fee-939a-450195242f3e-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.875772 4886 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e4c90616-a536-4fab-911b-2fd02c52ef9d-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.875780 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/34551f6d-235f-4fee-939a-450195242f3e-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.875977 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3b9e706-7b47-409b-91a9-8457dfa315f1-scripts" (OuterVolumeSpecName: "scripts") pod "a3b9e706-7b47-409b-91a9-8457dfa315f1" (UID: "a3b9e706-7b47-409b-91a9-8457dfa315f1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.878380 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/386f7c41-cb62-4ff1-bef7-11e4e8b14707-kube-api-access-jlsqs" (OuterVolumeSpecName: "kube-api-access-jlsqs") pod "386f7c41-cb62-4ff1-bef7-11e4e8b14707" (UID: "386f7c41-cb62-4ff1-bef7-11e4e8b14707"). InnerVolumeSpecName "kube-api-access-jlsqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.878771 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3b9e706-7b47-409b-91a9-8457dfa315f1-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a3b9e706-7b47-409b-91a9-8457dfa315f1" (UID: "a3b9e706-7b47-409b-91a9-8457dfa315f1"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.879679 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3b9e706-7b47-409b-91a9-8457dfa315f1-kube-api-access-msfm5" (OuterVolumeSpecName: "kube-api-access-msfm5") pod "a3b9e706-7b47-409b-91a9-8457dfa315f1" (UID: "a3b9e706-7b47-409b-91a9-8457dfa315f1"). InnerVolumeSpecName "kube-api-access-msfm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.899313 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/386f7c41-cb62-4ff1-bef7-11e4e8b14707-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "386f7c41-cb62-4ff1-bef7-11e4e8b14707" (UID: "386f7c41-cb62-4ff1-bef7-11e4e8b14707"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.905418 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/386f7c41-cb62-4ff1-bef7-11e4e8b14707-config" (OuterVolumeSpecName: "config") pod "386f7c41-cb62-4ff1-bef7-11e4e8b14707" (UID: "386f7c41-cb62-4ff1-bef7-11e4e8b14707"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.976799 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlsqs\" (UniqueName: \"kubernetes.io/projected/386f7c41-cb62-4ff1-bef7-11e4e8b14707-kube-api-access-jlsqs\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.976841 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3b9e706-7b47-409b-91a9-8457dfa315f1-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.976852 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msfm5\" (UniqueName: \"kubernetes.io/projected/a3b9e706-7b47-409b-91a9-8457dfa315f1-kube-api-access-msfm5\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.976862 4886 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a3b9e706-7b47-409b-91a9-8457dfa315f1-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.976873 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/386f7c41-cb62-4ff1-bef7-11e4e8b14707-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:58 crc kubenswrapper[4886]: I0314 08:49:58.976880 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/386f7c41-cb62-4ff1-bef7-11e4e8b14707-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.302602 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" podUID="1b496ccf-4298-4759-aacf-e115101cb90d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.139:5353: i/o timeout" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.302706 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.591612 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6dfdc8544f-8k2lm" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.592532 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57f8bb66ff-scjrm" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.592710 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.593202 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-kz2x6" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.593774 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-sbjqr" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.631198 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.649005 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.663146 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Mar 14 08:49:59 crc kubenswrapper[4886]: E0314 08:49:59.663587 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4c90616-a536-4fab-911b-2fd02c52ef9d" containerName="watcher-api-log" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.663610 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4c90616-a536-4fab-911b-2fd02c52ef9d" containerName="watcher-api-log" Mar 14 08:49:59 crc kubenswrapper[4886]: E0314 08:49:59.663626 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="386f7c41-cb62-4ff1-bef7-11e4e8b14707" containerName="neutron-db-sync" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.663632 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="386f7c41-cb62-4ff1-bef7-11e4e8b14707" containerName="neutron-db-sync" Mar 14 08:49:59 crc kubenswrapper[4886]: E0314 08:49:59.663646 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b496ccf-4298-4759-aacf-e115101cb90d" containerName="init" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.663653 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b496ccf-4298-4759-aacf-e115101cb90d" containerName="init" Mar 14 08:49:59 crc kubenswrapper[4886]: E0314 08:49:59.663664 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b496ccf-4298-4759-aacf-e115101cb90d" containerName="dnsmasq-dns" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.663672 4886 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1b496ccf-4298-4759-aacf-e115101cb90d" containerName="dnsmasq-dns" Mar 14 08:49:59 crc kubenswrapper[4886]: E0314 08:49:59.663697 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4c90616-a536-4fab-911b-2fd02c52ef9d" containerName="watcher-api" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.663704 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4c90616-a536-4fab-911b-2fd02c52ef9d" containerName="watcher-api" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.663880 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4c90616-a536-4fab-911b-2fd02c52ef9d" containerName="watcher-api" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.663913 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="386f7c41-cb62-4ff1-bef7-11e4e8b14707" containerName="neutron-db-sync" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.663929 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4c90616-a536-4fab-911b-2fd02c52ef9d" containerName="watcher-api-log" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.663944 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b496ccf-4298-4759-aacf-e115101cb90d" containerName="dnsmasq-dns" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.664880 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.668593 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.691555 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6dfdc8544f-8k2lm"] Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.702404 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.715235 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6dfdc8544f-8k2lm"] Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.723360 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-kz2x6"] Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.730876 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-kz2x6"] Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.748053 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-57f8bb66ff-scjrm"] Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.753846 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-57f8bb66ff-scjrm"] Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.795313 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-logs\") pod \"watcher-api-0\" (UID: \"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3\") " pod="openstack/watcher-api-0" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.795378 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p29gb\" (UniqueName: \"kubernetes.io/projected/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-kube-api-access-p29gb\") pod 
\"watcher-api-0\" (UID: \"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3\") " pod="openstack/watcher-api-0" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.795445 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3\") " pod="openstack/watcher-api-0" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.795474 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3\") " pod="openstack/watcher-api-0" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.795497 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-config-data\") pod \"watcher-api-0\" (UID: \"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3\") " pod="openstack/watcher-api-0" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.900952 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-logs\") pod \"watcher-api-0\" (UID: \"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3\") " pod="openstack/watcher-api-0" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.901037 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p29gb\" (UniqueName: \"kubernetes.io/projected/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-kube-api-access-p29gb\") pod \"watcher-api-0\" (UID: \"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3\") " pod="openstack/watcher-api-0" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 
08:49:59.901202 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3\") " pod="openstack/watcher-api-0" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.901262 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3\") " pod="openstack/watcher-api-0" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.901309 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-config-data\") pod \"watcher-api-0\" (UID: \"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3\") " pod="openstack/watcher-api-0" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.901834 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-logs\") pod \"watcher-api-0\" (UID: \"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3\") " pod="openstack/watcher-api-0" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.908625 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3\") " pod="openstack/watcher-api-0" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.910006 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-combined-ca-bundle\") pod 
\"watcher-api-0\" (UID: \"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3\") " pod="openstack/watcher-api-0" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.910696 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-config-data\") pod \"watcher-api-0\" (UID: \"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3\") " pod="openstack/watcher-api-0" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.937882 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p29gb\" (UniqueName: \"kubernetes.io/projected/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-kube-api-access-p29gb\") pod \"watcher-api-0\" (UID: \"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3\") " pod="openstack/watcher-api-0" Mar 14 08:49:59 crc kubenswrapper[4886]: E0314 08:49:59.953501 4886 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 14 08:49:59 crc kubenswrapper[4886]: E0314 08:49:59.953652 4886 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d4cjt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-6frh2_openstack(14718224-eaad-4caf-b13b-a60a9c2a9460): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 14 08:49:59 crc kubenswrapper[4886]: E0314 08:49:59.954903 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-6frh2" podUID="14718224-eaad-4caf-b13b-a60a9c2a9460" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.986530 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.995382 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-lr4pt"] Mar 14 08:49:59 crc kubenswrapper[4886]: I0314 08:49:59.997012 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.002882 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-lr4pt\" (UID: \"ce19460f-e7c6-4212-8e85-ca624f694ac9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.002968 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-lr4pt\" (UID: \"ce19460f-e7c6-4212-8e85-ca624f694ac9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.002996 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-lr4pt\" (UID: \"ce19460f-e7c6-4212-8e85-ca624f694ac9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.003016 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbwhc\" (UniqueName: \"kubernetes.io/projected/ce19460f-e7c6-4212-8e85-ca624f694ac9-kube-api-access-pbwhc\") pod \"dnsmasq-dns-5ccc5c4795-lr4pt\" (UID: \"ce19460f-e7c6-4212-8e85-ca624f694ac9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.003046 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-config\") pod 
\"dnsmasq-dns-5ccc5c4795-lr4pt\" (UID: \"ce19460f-e7c6-4212-8e85-ca624f694ac9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.003094 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-lr4pt\" (UID: \"ce19460f-e7c6-4212-8e85-ca624f694ac9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.010787 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-lr4pt"] Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.107670 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-lr4pt\" (UID: \"ce19460f-e7c6-4212-8e85-ca624f694ac9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.107726 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-lr4pt\" (UID: \"ce19460f-e7c6-4212-8e85-ca624f694ac9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.107748 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbwhc\" (UniqueName: \"kubernetes.io/projected/ce19460f-e7c6-4212-8e85-ca624f694ac9-kube-api-access-pbwhc\") pod \"dnsmasq-dns-5ccc5c4795-lr4pt\" (UID: \"ce19460f-e7c6-4212-8e85-ca624f694ac9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.107800 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-config\") pod \"dnsmasq-dns-5ccc5c4795-lr4pt\" (UID: \"ce19460f-e7c6-4212-8e85-ca624f694ac9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.107906 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-lr4pt\" (UID: \"ce19460f-e7c6-4212-8e85-ca624f694ac9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.108025 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-lr4pt\" (UID: \"ce19460f-e7c6-4212-8e85-ca624f694ac9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.108973 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-lr4pt\" (UID: \"ce19460f-e7c6-4212-8e85-ca624f694ac9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.109555 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-lr4pt\" (UID: \"ce19460f-e7c6-4212-8e85-ca624f694ac9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.110051 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-lr4pt\" (UID: \"ce19460f-e7c6-4212-8e85-ca624f694ac9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.115185 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-config\") pod \"dnsmasq-dns-5ccc5c4795-lr4pt\" (UID: \"ce19460f-e7c6-4212-8e85-ca624f694ac9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.115978 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-lr4pt\" (UID: \"ce19460f-e7c6-4212-8e85-ca624f694ac9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.152387 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbwhc\" (UniqueName: \"kubernetes.io/projected/ce19460f-e7c6-4212-8e85-ca624f694ac9-kube-api-access-pbwhc\") pod \"dnsmasq-dns-5ccc5c4795-lr4pt\" (UID: \"ce19460f-e7c6-4212-8e85-ca624f694ac9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.162554 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557970-k52mf"] Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.163924 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557970-k52mf" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.167703 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.167908 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.169209 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.171504 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557970-k52mf"] Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.258067 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-64796d9fb-7nw8p"] Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.261089 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-64796d9fb-7nw8p" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.268472 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-64796d9fb-7nw8p"] Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.270396 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.270542 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-sx7r8" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.270665 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.279224 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.316555 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt7g5\" (UniqueName: \"kubernetes.io/projected/8bba36a0-936a-4da3-a23b-79068d0c437c-kube-api-access-pt7g5\") pod \"auto-csr-approver-29557970-k52mf\" (UID: \"8bba36a0-936a-4da3-a23b-79068d0c437c\") " pod="openshift-infra/auto-csr-approver-29557970-k52mf" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.374401 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.421711 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt7g5\" (UniqueName: \"kubernetes.io/projected/8bba36a0-936a-4da3-a23b-79068d0c437c-kube-api-access-pt7g5\") pod \"auto-csr-approver-29557970-k52mf\" (UID: \"8bba36a0-936a-4da3-a23b-79068d0c437c\") " pod="openshift-infra/auto-csr-approver-29557970-k52mf" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.422378 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-ovndb-tls-certs\") pod \"neutron-64796d9fb-7nw8p\" (UID: \"4a9ffec0-3aa9-46a6-87b9-aadc1021683c\") " pod="openstack/neutron-64796d9fb-7nw8p" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.422532 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-config\") pod \"neutron-64796d9fb-7nw8p\" (UID: \"4a9ffec0-3aa9-46a6-87b9-aadc1021683c\") " pod="openstack/neutron-64796d9fb-7nw8p" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.422672 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-combined-ca-bundle\") pod \"neutron-64796d9fb-7nw8p\" (UID: \"4a9ffec0-3aa9-46a6-87b9-aadc1021683c\") " pod="openstack/neutron-64796d9fb-7nw8p" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.423942 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx2bc\" (UniqueName: \"kubernetes.io/projected/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-kube-api-access-xx2bc\") pod \"neutron-64796d9fb-7nw8p\" (UID: 
\"4a9ffec0-3aa9-46a6-87b9-aadc1021683c\") " pod="openstack/neutron-64796d9fb-7nw8p" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.424027 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-httpd-config\") pod \"neutron-64796d9fb-7nw8p\" (UID: \"4a9ffec0-3aa9-46a6-87b9-aadc1021683c\") " pod="openstack/neutron-64796d9fb-7nw8p" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.450808 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt7g5\" (UniqueName: \"kubernetes.io/projected/8bba36a0-936a-4da3-a23b-79068d0c437c-kube-api-access-pt7g5\") pod \"auto-csr-approver-29557970-k52mf\" (UID: \"8bba36a0-936a-4da3-a23b-79068d0c437c\") " pod="openshift-infra/auto-csr-approver-29557970-k52mf" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.515414 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557970-k52mf" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.526019 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-ovndb-tls-certs\") pod \"neutron-64796d9fb-7nw8p\" (UID: \"4a9ffec0-3aa9-46a6-87b9-aadc1021683c\") " pod="openstack/neutron-64796d9fb-7nw8p" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.526079 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-config\") pod \"neutron-64796d9fb-7nw8p\" (UID: \"4a9ffec0-3aa9-46a6-87b9-aadc1021683c\") " pod="openstack/neutron-64796d9fb-7nw8p" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.526114 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-combined-ca-bundle\") pod \"neutron-64796d9fb-7nw8p\" (UID: \"4a9ffec0-3aa9-46a6-87b9-aadc1021683c\") " pod="openstack/neutron-64796d9fb-7nw8p" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.526207 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx2bc\" (UniqueName: \"kubernetes.io/projected/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-kube-api-access-xx2bc\") pod \"neutron-64796d9fb-7nw8p\" (UID: \"4a9ffec0-3aa9-46a6-87b9-aadc1021683c\") " pod="openstack/neutron-64796d9fb-7nw8p" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.526227 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-httpd-config\") pod \"neutron-64796d9fb-7nw8p\" (UID: \"4a9ffec0-3aa9-46a6-87b9-aadc1021683c\") " pod="openstack/neutron-64796d9fb-7nw8p" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.537355 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-config\") pod \"neutron-64796d9fb-7nw8p\" (UID: \"4a9ffec0-3aa9-46a6-87b9-aadc1021683c\") " pod="openstack/neutron-64796d9fb-7nw8p" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.538095 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-httpd-config\") pod \"neutron-64796d9fb-7nw8p\" (UID: \"4a9ffec0-3aa9-46a6-87b9-aadc1021683c\") " pod="openstack/neutron-64796d9fb-7nw8p" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.546902 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-ovndb-tls-certs\") pod \"neutron-64796d9fb-7nw8p\" (UID: 
\"4a9ffec0-3aa9-46a6-87b9-aadc1021683c\") " pod="openstack/neutron-64796d9fb-7nw8p" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.557721 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-combined-ca-bundle\") pod \"neutron-64796d9fb-7nw8p\" (UID: \"4a9ffec0-3aa9-46a6-87b9-aadc1021683c\") " pod="openstack/neutron-64796d9fb-7nw8p" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.576577 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx2bc\" (UniqueName: \"kubernetes.io/projected/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-kube-api-access-xx2bc\") pod \"neutron-64796d9fb-7nw8p\" (UID: \"4a9ffec0-3aa9-46a6-87b9-aadc1021683c\") " pod="openstack/neutron-64796d9fb-7nw8p" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.580371 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-66c6bc56b6-25jn4"] Mar 14 08:50:00 crc kubenswrapper[4886]: E0314 08:50:00.602949 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-6frh2" podUID="14718224-eaad-4caf-b13b-a60a9c2a9460" Mar 14 08:50:00 crc kubenswrapper[4886]: I0314 08:50:00.619207 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-64796d9fb-7nw8p" Mar 14 08:50:01 crc kubenswrapper[4886]: I0314 08:50:01.257616 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7769c88f5b-8gr9x"] Mar 14 08:50:01 crc kubenswrapper[4886]: I0314 08:50:01.474053 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b496ccf-4298-4759-aacf-e115101cb90d" path="/var/lib/kubelet/pods/1b496ccf-4298-4759-aacf-e115101cb90d/volumes" Mar 14 08:50:01 crc kubenswrapper[4886]: I0314 08:50:01.475160 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34551f6d-235f-4fee-939a-450195242f3e" path="/var/lib/kubelet/pods/34551f6d-235f-4fee-939a-450195242f3e/volumes" Mar 14 08:50:01 crc kubenswrapper[4886]: I0314 08:50:01.475603 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3b9e706-7b47-409b-91a9-8457dfa315f1" path="/var/lib/kubelet/pods/a3b9e706-7b47-409b-91a9-8457dfa315f1/volumes" Mar 14 08:50:01 crc kubenswrapper[4886]: I0314 08:50:01.476046 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4c90616-a536-4fab-911b-2fd02c52ef9d" path="/var/lib/kubelet/pods/e4c90616-a536-4fab-911b-2fd02c52ef9d/volumes" Mar 14 08:50:01 crc kubenswrapper[4886]: I0314 08:50:01.479281 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vwjh5"] Mar 14 08:50:01 crc kubenswrapper[4886]: I0314 08:50:01.519555 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 14 08:50:01 crc kubenswrapper[4886]: I0314 08:50:01.571033 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-lr4pt"] Mar 14 08:50:01 crc kubenswrapper[4886]: I0314 08:50:01.616011 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7496c4d65c-dg8pn" 
event={"ID":"6624fe29-e15e-4474-a2d9-37489c04e1b6","Type":"ContainerStarted","Data":"135ef659ef7adb5ba1080cffc593d8a86b8615ce3f878ff99f49cfd0a0acff14"} Mar 14 08:50:01 crc kubenswrapper[4886]: I0314 08:50:01.634308 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7769c88f5b-8gr9x" event={"ID":"46272ed5-a9f5-45eb-b9ba-58289ed822a7","Type":"ContainerStarted","Data":"1d62f83700d4d2c8be72d35c0cf2b0a341f2eb50465ba76149d84bfd84549a77"} Mar 14 08:50:01 crc kubenswrapper[4886]: I0314 08:50:01.654937 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"b41275dd-03d8-40b8-9f06-0dc67ecb12e6","Type":"ContainerStarted","Data":"e1778bedcc348d4d67175df99f9347a63ee4172f4ef0229c5b4f4bc1b3a5f4d6"} Mar 14 08:50:01 crc kubenswrapper[4886]: I0314 08:50:01.696477 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=14.628643149 podStartE2EDuration="34.696457569s" podCreationTimestamp="2026-03-14 08:49:27 +0000 UTC" firstStartedPulling="2026-03-14 08:49:29.952108181 +0000 UTC m=+1305.200559818" lastFinishedPulling="2026-03-14 08:49:50.019922601 +0000 UTC m=+1325.268374238" observedRunningTime="2026-03-14 08:50:01.676521875 +0000 UTC m=+1336.924973512" watchObservedRunningTime="2026-03-14 08:50:01.696457569 +0000 UTC m=+1336.944909196" Mar 14 08:50:01 crc kubenswrapper[4886]: I0314 08:50:01.699552 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"54e651d7-c7c1-46d6-9097-f769d9e64d4e","Type":"ContainerStarted","Data":"878a0fa57cd75beeab0b28bd00eb9b8ec0e1f82ad48e576a40cb00e9f6addb42"} Mar 14 08:50:01 crc kubenswrapper[4886]: I0314 08:50:01.720550 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e8a040e4-11ae-4ffb-94dd-53e5f9962d53" containerName="glance-log" 
containerID="cri-o://ea4cbf651a05652941e687ad176520ffc9e5ebb127476b8d7e47edfee59ef5d9" gracePeriod=30
Mar 14 08:50:01 crc kubenswrapper[4886]: I0314 08:50:01.721070 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e8a040e4-11ae-4ffb-94dd-53e5f9962d53" containerName="glance-httpd" containerID="cri-o://1c6151b40fec6c770685aad3a63cb9601dfeaa564665e6a9f5399e0335317781" gracePeriod=30
Mar 14 08:50:01 crc kubenswrapper[4886]: I0314 08:50:01.721191 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e8a040e4-11ae-4ffb-94dd-53e5f9962d53","Type":"ContainerStarted","Data":"1c6151b40fec6c770685aad3a63cb9601dfeaa564665e6a9f5399e0335317781"}
Mar 14 08:50:01 crc kubenswrapper[4886]: I0314 08:50:01.723874 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vwjh5" event={"ID":"27d82e0e-40f0-45e0-b92f-df553a24bc5b","Type":"ContainerStarted","Data":"45a14a85f0e4f5733e686d5f1a9a0fef4b4d91ae7c901514674aa6ed08872a70"}
Mar 14 08:50:01 crc kubenswrapper[4886]: I0314 08:50:01.725545 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ctl2l" event={"ID":"29f258e4-6012-4807-95a6-cce9ee5af3d8","Type":"ContainerStarted","Data":"f09c8a9767fae0dfb3082349002eaeac96b929963e09bcf10f2f0d8effd2be84"}
Mar 14 08:50:01 crc kubenswrapper[4886]: I0314 08:50:01.728096 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=14.35407688 podStartE2EDuration="34.728080617s" podCreationTimestamp="2026-03-14 08:49:27 +0000 UTC" firstStartedPulling="2026-03-14 08:49:29.647474257 +0000 UTC m=+1304.895925894" lastFinishedPulling="2026-03-14 08:49:50.021477984 +0000 UTC m=+1325.269929631" observedRunningTime="2026-03-14 08:50:01.717600396 +0000 UTC m=+1336.966052043" watchObservedRunningTime="2026-03-14 08:50:01.728080617 +0000 UTC m=+1336.976532254"
Mar 14 08:50:01 crc kubenswrapper[4886]: I0314 08:50:01.730362 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66c6bc56b6-25jn4" event={"ID":"3f8100ac-c606-4eb3-afd6-07be9de44f42","Type":"ContainerStarted","Data":"cdf73c46710d22075f2126e643228c7cb08bbeea14c43229d6d437e527c964a0"}
Mar 14 08:50:01 crc kubenswrapper[4886]: I0314 08:50:01.761354 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=33.761333271 podStartE2EDuration="33.761333271s" podCreationTimestamp="2026-03-14 08:49:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:50:01.74651951 +0000 UTC m=+1336.994971147" watchObservedRunningTime="2026-03-14 08:50:01.761333271 +0000 UTC m=+1337.009784908"
Mar 14 08:50:01 crc kubenswrapper[4886]: I0314 08:50:01.788290 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557970-k52mf"]
Mar 14 08:50:01 crc kubenswrapper[4886]: I0314 08:50:01.803497 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Mar 14 08:50:01 crc kubenswrapper[4886]: I0314 08:50:01.812634 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-ctl2l" podStartSLOduration=5.926479604 podStartE2EDuration="33.812614506s" podCreationTimestamp="2026-03-14 08:49:28 +0000 UTC" firstStartedPulling="2026-03-14 08:49:30.648974413 +0000 UTC m=+1305.897426050" lastFinishedPulling="2026-03-14 08:49:58.535109315 +0000 UTC m=+1333.783560952" observedRunningTime="2026-03-14 08:50:01.791405437 +0000 UTC m=+1337.039857074" watchObservedRunningTime="2026-03-14 08:50:01.812614506 +0000 UTC m=+1337.061066133"
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.037568 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-64796d9fb-7nw8p"]
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.429848 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.594200 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-public-tls-certs\") pod \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") "
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.594707 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-combined-ca-bundle\") pod \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") "
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.594782 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-config-data\") pod \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") "
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.594839 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-httpd-run\") pod \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") "
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.594947 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-scripts\") pod \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") "
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.594976 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") "
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.595028 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-logs\") pod \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") "
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.595065 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rwdp\" (UniqueName: \"kubernetes.io/projected/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-kube-api-access-8rwdp\") pod \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\" (UID: \"e8a040e4-11ae-4ffb-94dd-53e5f9962d53\") "
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.601133 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e8a040e4-11ae-4ffb-94dd-53e5f9962d53" (UID: "e8a040e4-11ae-4ffb-94dd-53e5f9962d53"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.602910 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-kube-api-access-8rwdp" (OuterVolumeSpecName: "kube-api-access-8rwdp") pod "e8a040e4-11ae-4ffb-94dd-53e5f9962d53" (UID: "e8a040e4-11ae-4ffb-94dd-53e5f9962d53"). InnerVolumeSpecName "kube-api-access-8rwdp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.603091 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-logs" (OuterVolumeSpecName: "logs") pod "e8a040e4-11ae-4ffb-94dd-53e5f9962d53" (UID: "e8a040e4-11ae-4ffb-94dd-53e5f9962d53"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.616369 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-scripts" (OuterVolumeSpecName: "scripts") pod "e8a040e4-11ae-4ffb-94dd-53e5f9962d53" (UID: "e8a040e4-11ae-4ffb-94dd-53e5f9962d53"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.620379 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "e8a040e4-11ae-4ffb-94dd-53e5f9962d53" (UID: "e8a040e4-11ae-4ffb-94dd-53e5f9962d53"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.670271 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8a040e4-11ae-4ffb-94dd-53e5f9962d53" (UID: "e8a040e4-11ae-4ffb-94dd-53e5f9962d53"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.701459 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.701506 4886 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.701518 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-logs\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.701526 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rwdp\" (UniqueName: \"kubernetes.io/projected/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-kube-api-access-8rwdp\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.701537 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.701547 4886 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.709718 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-config-data" (OuterVolumeSpecName: "config-data") pod "e8a040e4-11ae-4ffb-94dd-53e5f9962d53" (UID: "e8a040e4-11ae-4ffb-94dd-53e5f9962d53"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.710896 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e8a040e4-11ae-4ffb-94dd-53e5f9962d53" (UID: "e8a040e4-11ae-4ffb-94dd-53e5f9962d53"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.722822 4886 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.743088 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vwjh5" event={"ID":"27d82e0e-40f0-45e0-b92f-df553a24bc5b","Type":"ContainerStarted","Data":"5bbd87714301a05e3600057a773994a2e440bb56d6be08fed15027ac487ffe88"}
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.749402 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557970-k52mf" event={"ID":"8bba36a0-936a-4da3-a23b-79068d0c437c","Type":"ContainerStarted","Data":"499e37063712832539c4e069eeb8a574790cddcf422dc2c12a79f8882d1eaba4"}
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.767403 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9eb9137-a021-4ea6-a4a4-871cf81af732","Type":"ContainerStarted","Data":"c8926f738347aa1d97166c7961400e4ac904b525b9b82069b40c04bfe819735d"}
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.768358 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-vwjh5" podStartSLOduration=19.76834372 podStartE2EDuration="19.76834372s" podCreationTimestamp="2026-03-14 08:49:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:50:02.761618193 +0000 UTC m=+1338.010069820" watchObservedRunningTime="2026-03-14 08:50:02.76834372 +0000 UTC m=+1338.016795357"
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.782647 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66c6bc56b6-25jn4" event={"ID":"3f8100ac-c606-4eb3-afd6-07be9de44f42","Type":"ContainerStarted","Data":"d5c20c7ccfa0dac83302b656a5fa7355d00d8cae8b6c3a4e27263c26c15716cc"}
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.810055 4886 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.810089 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8a040e4-11ae-4ffb-94dd-53e5f9962d53-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.810098 4886 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.813197 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3","Type":"ContainerStarted","Data":"ef351e23ca052eae7bcf7ffb583b52fe22b4f34d2d46be687a34a68e47050295"}
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.813240 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3","Type":"ContainerStarted","Data":"05fa7c629ce0013359949c6fdc867377ad78ffa4750529f3fedc951db8803b5e"}
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.839735 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4ede23dd-82b8-42ef-bdcd-d4be5637e457","Type":"ContainerStarted","Data":"c7daa5ec8881475125e070d27c16827991310384862a54e501aab33e4edb2588"}
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.839888 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4ede23dd-82b8-42ef-bdcd-d4be5637e457" containerName="glance-log" containerID="cri-o://5d4f33cc733c4c780428838b0c1aa93986d723691a5fb2a33eb68f42aa298bee" gracePeriod=30
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.840351 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4ede23dd-82b8-42ef-bdcd-d4be5637e457" containerName="glance-httpd" containerID="cri-o://c7daa5ec8881475125e070d27c16827991310384862a54e501aab33e4edb2588" gracePeriod=30
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.892076 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7c5789dc8f-4vv5f"]
Mar 14 08:50:02 crc kubenswrapper[4886]: E0314 08:50:02.892783 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8a040e4-11ae-4ffb-94dd-53e5f9962d53" containerName="glance-log"
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.892800 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8a040e4-11ae-4ffb-94dd-53e5f9962d53" containerName="glance-log"
Mar 14 08:50:02 crc kubenswrapper[4886]: E0314 08:50:02.892831 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8a040e4-11ae-4ffb-94dd-53e5f9962d53" containerName="glance-httpd"
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.892838 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8a040e4-11ae-4ffb-94dd-53e5f9962d53" containerName="glance-httpd"
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.893003 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8a040e4-11ae-4ffb-94dd-53e5f9962d53" containerName="glance-httpd"
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.893025 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8a040e4-11ae-4ffb-94dd-53e5f9962d53" containerName="glance-log"
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.894022 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8wtzc" event={"ID":"e521aeb3-adb2-4042-ad11-33d749d5506b","Type":"ContainerStarted","Data":"10d580258be851ff0edd037a92b42e6ce275f0593a0f37fac83cb417b50e058f"}
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.894113 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c5789dc8f-4vv5f"
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.902438 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7c5789dc8f-4vv5f"]
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.905688 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=34.905674856 podStartE2EDuration="34.905674856s" podCreationTimestamp="2026-03-14 08:49:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:50:02.868544834 +0000 UTC m=+1338.116996461" watchObservedRunningTime="2026-03-14 08:50:02.905674856 +0000 UTC m=+1338.154126493"
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.909263 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.909565 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.911193 4886 generic.go:334] "Generic (PLEG): container finished" podID="ce19460f-e7c6-4212-8e85-ca624f694ac9" containerID="9e5189e9f9707ab0527e23bf2ad563e7c5aecd0dbe461eccf1a043d62adccc5a" exitCode=0
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.911280 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" event={"ID":"ce19460f-e7c6-4212-8e85-ca624f694ac9","Type":"ContainerDied","Data":"9e5189e9f9707ab0527e23bf2ad563e7c5aecd0dbe461eccf1a043d62adccc5a"}
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.911325 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" event={"ID":"ce19460f-e7c6-4212-8e85-ca624f694ac9","Type":"ContainerStarted","Data":"fa9541d5ec746697a7747a0ea618061ff36a22abdda6d9bd74c47edcad692aff"}
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.918268 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-ovndb-tls-certs\") pod \"neutron-7c5789dc8f-4vv5f\" (UID: \"9f9329a3-9a35-49a2-86ce-435b98d280f3\") " pod="openstack/neutron-7c5789dc8f-4vv5f"
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.918320 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-internal-tls-certs\") pod \"neutron-7c5789dc8f-4vv5f\" (UID: \"9f9329a3-9a35-49a2-86ce-435b98d280f3\") " pod="openstack/neutron-7c5789dc8f-4vv5f"
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.918461 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-config\") pod \"neutron-7c5789dc8f-4vv5f\" (UID: \"9f9329a3-9a35-49a2-86ce-435b98d280f3\") " pod="openstack/neutron-7c5789dc8f-4vv5f"
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.918677 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-httpd-config\") pod \"neutron-7c5789dc8f-4vv5f\" (UID: \"9f9329a3-9a35-49a2-86ce-435b98d280f3\") " pod="openstack/neutron-7c5789dc8f-4vv5f"
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.918710 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-combined-ca-bundle\") pod \"neutron-7c5789dc8f-4vv5f\" (UID: \"9f9329a3-9a35-49a2-86ce-435b98d280f3\") " pod="openstack/neutron-7c5789dc8f-4vv5f"
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.918744 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-public-tls-certs\") pod \"neutron-7c5789dc8f-4vv5f\" (UID: \"9f9329a3-9a35-49a2-86ce-435b98d280f3\") " pod="openstack/neutron-7c5789dc8f-4vv5f"
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.918771 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xlzx\" (UniqueName: \"kubernetes.io/projected/9f9329a3-9a35-49a2-86ce-435b98d280f3-kube-api-access-2xlzx\") pod \"neutron-7c5789dc8f-4vv5f\" (UID: \"9f9329a3-9a35-49a2-86ce-435b98d280f3\") " pod="openstack/neutron-7c5789dc8f-4vv5f"
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.931626 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7496c4d65c-dg8pn" event={"ID":"6624fe29-e15e-4474-a2d9-37489c04e1b6","Type":"ContainerStarted","Data":"5ac68317c315960ec899ae3f37574d3591926ff9d8914ecc920ad7a5710d5e63"}
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.931793 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7496c4d65c-dg8pn" podUID="6624fe29-e15e-4474-a2d9-37489c04e1b6" containerName="horizon-log" containerID="cri-o://135ef659ef7adb5ba1080cffc593d8a86b8615ce3f878ff99f49cfd0a0acff14" gracePeriod=30
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.931882 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7496c4d65c-dg8pn" podUID="6624fe29-e15e-4474-a2d9-37489c04e1b6" containerName="horizon" containerID="cri-o://5ac68317c315960ec899ae3f37574d3591926ff9d8914ecc920ad7a5710d5e63" gracePeriod=30
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.947309 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7769c88f5b-8gr9x" event={"ID":"46272ed5-a9f5-45eb-b9ba-58289ed822a7","Type":"ContainerStarted","Data":"8360ce1a803348b9b0110a40d852d7a23d0c7dc5fe8d41b5bf28451024d985c4"}
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.971098 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-8wtzc" podStartSLOduration=6.936144777 podStartE2EDuration="34.971079743s" podCreationTimestamp="2026-03-14 08:49:28 +0000 UTC" firstStartedPulling="2026-03-14 08:49:30.475134783 +0000 UTC m=+1305.723586420" lastFinishedPulling="2026-03-14 08:49:58.510069749 +0000 UTC m=+1333.758521386" observedRunningTime="2026-03-14 08:50:02.925703322 +0000 UTC m=+1338.174154959" watchObservedRunningTime="2026-03-14 08:50:02.971079743 +0000 UTC m=+1338.219531380"
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.980979 4886 generic.go:334] "Generic (PLEG): container finished" podID="e8a040e4-11ae-4ffb-94dd-53e5f9962d53" containerID="1c6151b40fec6c770685aad3a63cb9601dfeaa564665e6a9f5399e0335317781" exitCode=143
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.981026 4886 generic.go:334] "Generic (PLEG): container finished" podID="e8a040e4-11ae-4ffb-94dd-53e5f9962d53" containerID="ea4cbf651a05652941e687ad176520ffc9e5ebb127476b8d7e47edfee59ef5d9" exitCode=143
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.981103 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e8a040e4-11ae-4ffb-94dd-53e5f9962d53","Type":"ContainerDied","Data":"1c6151b40fec6c770685aad3a63cb9601dfeaa564665e6a9f5399e0335317781"}
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.981156 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e8a040e4-11ae-4ffb-94dd-53e5f9962d53","Type":"ContainerDied","Data":"ea4cbf651a05652941e687ad176520ffc9e5ebb127476b8d7e47edfee59ef5d9"}
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.981169 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e8a040e4-11ae-4ffb-94dd-53e5f9962d53","Type":"ContainerDied","Data":"fa25d003d5f7a360ae3344d569bb0943d575c486ed5fbea5939a1f6e63b4c4fa"}
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.981190 4886 scope.go:117] "RemoveContainer" containerID="1c6151b40fec6c770685aad3a63cb9601dfeaa564665e6a9f5399e0335317781"
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.981372 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 14 08:50:02 crc kubenswrapper[4886]: I0314 08:50:02.993594 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7496c4d65c-dg8pn" podStartSLOduration=7.312830054 podStartE2EDuration="31.993576448s" podCreationTimestamp="2026-03-14 08:49:31 +0000 UTC" firstStartedPulling="2026-03-14 08:49:36.037479865 +0000 UTC m=+1311.285931502" lastFinishedPulling="2026-03-14 08:50:00.718226259 +0000 UTC m=+1335.966677896" observedRunningTime="2026-03-14 08:50:02.980668139 +0000 UTC m=+1338.229119776" watchObservedRunningTime="2026-03-14 08:50:02.993576448 +0000 UTC m=+1338.242028085"
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.003443 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64796d9fb-7nw8p" event={"ID":"4a9ffec0-3aa9-46a6-87b9-aadc1021683c","Type":"ContainerStarted","Data":"73d43bea51b4f69093e0094852fed6e07a712fa8984fafd978f07dad7e19c2f5"}
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.003506 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64796d9fb-7nw8p" event={"ID":"4a9ffec0-3aa9-46a6-87b9-aadc1021683c","Type":"ContainerStarted","Data":"4b32ab325479c6c85640a603a65bf2a4fd21e761a66b0f89c9e174e3f823d63d"}
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.021231 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-ovndb-tls-certs\") pod \"neutron-7c5789dc8f-4vv5f\" (UID: \"9f9329a3-9a35-49a2-86ce-435b98d280f3\") " pod="openstack/neutron-7c5789dc8f-4vv5f"
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.021284 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-internal-tls-certs\") pod \"neutron-7c5789dc8f-4vv5f\" (UID: \"9f9329a3-9a35-49a2-86ce-435b98d280f3\") " pod="openstack/neutron-7c5789dc8f-4vv5f"
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.021383 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-config\") pod \"neutron-7c5789dc8f-4vv5f\" (UID: \"9f9329a3-9a35-49a2-86ce-435b98d280f3\") " pod="openstack/neutron-7c5789dc8f-4vv5f"
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.021550 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-httpd-config\") pod \"neutron-7c5789dc8f-4vv5f\" (UID: \"9f9329a3-9a35-49a2-86ce-435b98d280f3\") " pod="openstack/neutron-7c5789dc8f-4vv5f"
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.021574 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-combined-ca-bundle\") pod \"neutron-7c5789dc8f-4vv5f\" (UID: \"9f9329a3-9a35-49a2-86ce-435b98d280f3\") " pod="openstack/neutron-7c5789dc8f-4vv5f"
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.021603 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-public-tls-certs\") pod \"neutron-7c5789dc8f-4vv5f\" (UID: \"9f9329a3-9a35-49a2-86ce-435b98d280f3\") " pod="openstack/neutron-7c5789dc8f-4vv5f"
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.021624 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xlzx\" (UniqueName: \"kubernetes.io/projected/9f9329a3-9a35-49a2-86ce-435b98d280f3-kube-api-access-2xlzx\") pod \"neutron-7c5789dc8f-4vv5f\" (UID: \"9f9329a3-9a35-49a2-86ce-435b98d280f3\") " pod="openstack/neutron-7c5789dc8f-4vv5f"
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.044729 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-internal-tls-certs\") pod \"neutron-7c5789dc8f-4vv5f\" (UID: \"9f9329a3-9a35-49a2-86ce-435b98d280f3\") " pod="openstack/neutron-7c5789dc8f-4vv5f"
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.046304 4886 scope.go:117] "RemoveContainer" containerID="ea4cbf651a05652941e687ad176520ffc9e5ebb127476b8d7e47edfee59ef5d9"
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.059538 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-httpd-config\") pod \"neutron-7c5789dc8f-4vv5f\" (UID: \"9f9329a3-9a35-49a2-86ce-435b98d280f3\") " pod="openstack/neutron-7c5789dc8f-4vv5f"
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.066801 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.067250 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xlzx\" (UniqueName: \"kubernetes.io/projected/9f9329a3-9a35-49a2-86ce-435b98d280f3-kube-api-access-2xlzx\") pod \"neutron-7c5789dc8f-4vv5f\" (UID: \"9f9329a3-9a35-49a2-86ce-435b98d280f3\") " pod="openstack/neutron-7c5789dc8f-4vv5f"
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.078014 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-ovndb-tls-certs\") pod \"neutron-7c5789dc8f-4vv5f\" (UID: \"9f9329a3-9a35-49a2-86ce-435b98d280f3\") " pod="openstack/neutron-7c5789dc8f-4vv5f"
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.078393 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-config\") pod \"neutron-7c5789dc8f-4vv5f\" (UID: \"9f9329a3-9a35-49a2-86ce-435b98d280f3\") " pod="openstack/neutron-7c5789dc8f-4vv5f"
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.080244 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.082840 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-public-tls-certs\") pod \"neutron-7c5789dc8f-4vv5f\" (UID: \"9f9329a3-9a35-49a2-86ce-435b98d280f3\") " pod="openstack/neutron-7c5789dc8f-4vv5f"
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.086294 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-combined-ca-bundle\") pod \"neutron-7c5789dc8f-4vv5f\" (UID: \"9f9329a3-9a35-49a2-86ce-435b98d280f3\") " pod="openstack/neutron-7c5789dc8f-4vv5f"
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.099410 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.101064 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.109045 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.109483 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.164205 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.224483 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " pod="openstack/glance-default-external-api-0"
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.224563 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nrv4\" (UniqueName: \"kubernetes.io/projected/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-kube-api-access-6nrv4\") pod \"glance-default-external-api-0\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " pod="openstack/glance-default-external-api-0"
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.224594 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " pod="openstack/glance-default-external-api-0"
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.224640 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-scripts\") pod \"glance-default-external-api-0\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " pod="openstack/glance-default-external-api-0"
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.224675 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " pod="openstack/glance-default-external-api-0"
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.224706 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " pod="openstack/glance-default-external-api-0"
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.224741 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-logs\") pod \"glance-default-external-api-0\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " pod="openstack/glance-default-external-api-0"
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.224772 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-config-data\") pod \"glance-default-external-api-0\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " pod="openstack/glance-default-external-api-0"
Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.243248 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0"
Mar 14 08:50:03
crc kubenswrapper[4886]: I0314 08:50:03.261519 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c5789dc8f-4vv5f" Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.325919 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-logs\") pod \"glance-default-external-api-0\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " pod="openstack/glance-default-external-api-0" Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.325985 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-config-data\") pod \"glance-default-external-api-0\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " pod="openstack/glance-default-external-api-0" Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.326024 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " pod="openstack/glance-default-external-api-0" Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.326076 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nrv4\" (UniqueName: \"kubernetes.io/projected/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-kube-api-access-6nrv4\") pod \"glance-default-external-api-0\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " pod="openstack/glance-default-external-api-0" Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.326098 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " pod="openstack/glance-default-external-api-0" Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.326152 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-scripts\") pod \"glance-default-external-api-0\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " pod="openstack/glance-default-external-api-0" Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.326179 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " pod="openstack/glance-default-external-api-0" Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.326202 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " pod="openstack/glance-default-external-api-0" Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.326616 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.328232 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-logs\") pod \"glance-default-external-api-0\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " 
pod="openstack/glance-default-external-api-0" Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.334640 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " pod="openstack/glance-default-external-api-0" Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.336220 4886 scope.go:117] "RemoveContainer" containerID="1c6151b40fec6c770685aad3a63cb9601dfeaa564665e6a9f5399e0335317781" Mar 14 08:50:03 crc kubenswrapper[4886]: E0314 08:50:03.336988 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c6151b40fec6c770685aad3a63cb9601dfeaa564665e6a9f5399e0335317781\": container with ID starting with 1c6151b40fec6c770685aad3a63cb9601dfeaa564665e6a9f5399e0335317781 not found: ID does not exist" containerID="1c6151b40fec6c770685aad3a63cb9601dfeaa564665e6a9f5399e0335317781" Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.337045 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c6151b40fec6c770685aad3a63cb9601dfeaa564665e6a9f5399e0335317781"} err="failed to get container status \"1c6151b40fec6c770685aad3a63cb9601dfeaa564665e6a9f5399e0335317781\": rpc error: code = NotFound desc = could not find container \"1c6151b40fec6c770685aad3a63cb9601dfeaa564665e6a9f5399e0335317781\": container with ID starting with 1c6151b40fec6c770685aad3a63cb9601dfeaa564665e6a9f5399e0335317781 not found: ID does not exist" Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.337072 4886 scope.go:117] "RemoveContainer" containerID="ea4cbf651a05652941e687ad176520ffc9e5ebb127476b8d7e47edfee59ef5d9" Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.337437 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-config-data\") pod \"glance-default-external-api-0\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " pod="openstack/glance-default-external-api-0" Mar 14 08:50:03 crc kubenswrapper[4886]: E0314 08:50:03.338387 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea4cbf651a05652941e687ad176520ffc9e5ebb127476b8d7e47edfee59ef5d9\": container with ID starting with ea4cbf651a05652941e687ad176520ffc9e5ebb127476b8d7e47edfee59ef5d9 not found: ID does not exist" containerID="ea4cbf651a05652941e687ad176520ffc9e5ebb127476b8d7e47edfee59ef5d9" Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.338414 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea4cbf651a05652941e687ad176520ffc9e5ebb127476b8d7e47edfee59ef5d9"} err="failed to get container status \"ea4cbf651a05652941e687ad176520ffc9e5ebb127476b8d7e47edfee59ef5d9\": rpc error: code = NotFound desc = could not find container \"ea4cbf651a05652941e687ad176520ffc9e5ebb127476b8d7e47edfee59ef5d9\": container with ID starting with ea4cbf651a05652941e687ad176520ffc9e5ebb127476b8d7e47edfee59ef5d9 not found: ID does not exist" Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.338429 4886 scope.go:117] "RemoveContainer" containerID="1c6151b40fec6c770685aad3a63cb9601dfeaa564665e6a9f5399e0335317781" Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.341764 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " pod="openstack/glance-default-external-api-0" Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.346298 4886 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1c6151b40fec6c770685aad3a63cb9601dfeaa564665e6a9f5399e0335317781"} err="failed to get container status \"1c6151b40fec6c770685aad3a63cb9601dfeaa564665e6a9f5399e0335317781\": rpc error: code = NotFound desc = could not find container \"1c6151b40fec6c770685aad3a63cb9601dfeaa564665e6a9f5399e0335317781\": container with ID starting with 1c6151b40fec6c770685aad3a63cb9601dfeaa564665e6a9f5399e0335317781 not found: ID does not exist" Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.346341 4886 scope.go:117] "RemoveContainer" containerID="ea4cbf651a05652941e687ad176520ffc9e5ebb127476b8d7e47edfee59ef5d9" Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.347556 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " pod="openstack/glance-default-external-api-0" Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.352086 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-scripts\") pod \"glance-default-external-api-0\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " pod="openstack/glance-default-external-api-0" Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.352921 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea4cbf651a05652941e687ad176520ffc9e5ebb127476b8d7e47edfee59ef5d9"} err="failed to get container status \"ea4cbf651a05652941e687ad176520ffc9e5ebb127476b8d7e47edfee59ef5d9\": rpc error: code = NotFound desc = could not find container \"ea4cbf651a05652941e687ad176520ffc9e5ebb127476b8d7e47edfee59ef5d9\": container with ID starting with ea4cbf651a05652941e687ad176520ffc9e5ebb127476b8d7e47edfee59ef5d9 not found: ID does not exist" Mar 14 
08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.359882 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nrv4\" (UniqueName: \"kubernetes.io/projected/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-kube-api-access-6nrv4\") pod \"glance-default-external-api-0\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " pod="openstack/glance-default-external-api-0" Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.394409 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " pod="openstack/glance-default-external-api-0" Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.478447 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8a040e4-11ae-4ffb-94dd-53e5f9962d53" path="/var/lib/kubelet/pods/e8a040e4-11ae-4ffb-94dd-53e5f9962d53/volumes" Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.498695 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 08:50:03 crc kubenswrapper[4886]: I0314 08:50:03.589133 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="e4c90616-a536-4fab-911b-2fd02c52ef9d" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.158:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 08:50:04 crc kubenswrapper[4886]: I0314 08:50:04.039370 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3","Type":"ContainerStarted","Data":"33dd9fa372fec5d26f92e9ebc9abae7dab0da9bcc9d7ab17f90552563afe0963"} Mar 14 08:50:04 crc kubenswrapper[4886]: I0314 08:50:04.042050 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Mar 14 08:50:04 crc kubenswrapper[4886]: I0314 08:50:04.094952 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66c6bc56b6-25jn4" event={"ID":"3f8100ac-c606-4eb3-afd6-07be9de44f42","Type":"ContainerStarted","Data":"844e2a0189883ab734eb2b66c54a57447201e522d5837d8fbe5fc822ee14df57"} Mar 14 08:50:04 crc kubenswrapper[4886]: I0314 08:50:04.106470 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=5.106443258 podStartE2EDuration="5.106443258s" podCreationTimestamp="2026-03-14 08:49:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:50:04.065630504 +0000 UTC m=+1339.314082141" watchObservedRunningTime="2026-03-14 08:50:04.106443258 +0000 UTC m=+1339.354894885" Mar 14 08:50:04 crc kubenswrapper[4886]: I0314 08:50:04.123665 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"4ede23dd-82b8-42ef-bdcd-d4be5637e457","Type":"ContainerDied","Data":"5d4f33cc733c4c780428838b0c1aa93986d723691a5fb2a33eb68f42aa298bee"} Mar 14 08:50:04 crc kubenswrapper[4886]: I0314 08:50:04.123109 4886 generic.go:334] "Generic (PLEG): container finished" podID="4ede23dd-82b8-42ef-bdcd-d4be5637e457" containerID="5d4f33cc733c4c780428838b0c1aa93986d723691a5fb2a33eb68f42aa298bee" exitCode=143 Mar 14 08:50:04 crc kubenswrapper[4886]: I0314 08:50:04.172111 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-66c6bc56b6-25jn4" podStartSLOduration=27.171807574 podStartE2EDuration="27.171807574s" podCreationTimestamp="2026-03-14 08:49:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:50:04.115932691 +0000 UTC m=+1339.364384358" watchObservedRunningTime="2026-03-14 08:50:04.171807574 +0000 UTC m=+1339.420259211" Mar 14 08:50:04 crc kubenswrapper[4886]: I0314 08:50:04.220230 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7769c88f5b-8gr9x" event={"ID":"46272ed5-a9f5-45eb-b9ba-58289ed822a7","Type":"ContainerStarted","Data":"f9d7eed3ece0210aaeebdd306334f5a367d9436633835218c1103db47f2f4178"} Mar 14 08:50:04 crc kubenswrapper[4886]: I0314 08:50:04.257343 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7769c88f5b-8gr9x" podStartSLOduration=27.257317929 podStartE2EDuration="27.257317929s" podCreationTimestamp="2026-03-14 08:49:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:50:04.247775914 +0000 UTC m=+1339.496227551" watchObservedRunningTime="2026-03-14 08:50:04.257317929 +0000 UTC m=+1339.505769566" Mar 14 08:50:04 crc kubenswrapper[4886]: I0314 08:50:04.258060 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-64796d9fb-7nw8p" event={"ID":"4a9ffec0-3aa9-46a6-87b9-aadc1021683c","Type":"ContainerStarted","Data":"3bf2cf0d70f5e796ba25e67afed2c4623142c6adabfe77613df16f974770bf8a"} Mar 14 08:50:04 crc kubenswrapper[4886]: I0314 08:50:04.259322 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-64796d9fb-7nw8p" Mar 14 08:50:04 crc kubenswrapper[4886]: I0314 08:50:04.307039 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-64796d9fb-7nw8p" podStartSLOduration=4.307018049 podStartE2EDuration="4.307018049s" podCreationTimestamp="2026-03-14 08:50:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:50:04.285627985 +0000 UTC m=+1339.534079632" watchObservedRunningTime="2026-03-14 08:50:04.307018049 +0000 UTC m=+1339.555469686" Mar 14 08:50:04 crc kubenswrapper[4886]: E0314 08:50:04.676969 4886 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ede23dd_82b8_42ef_bdcd_d4be5637e457.slice/crio-conmon-c7daa5ec8881475125e070d27c16827991310384862a54e501aab33e4edb2588.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ede23dd_82b8_42ef_bdcd_d4be5637e457.slice/crio-c7daa5ec8881475125e070d27c16827991310384862a54e501aab33e4edb2588.scope\": RecentStats: unable to find data in memory cache]" Mar 14 08:50:04 crc kubenswrapper[4886]: I0314 08:50:04.729794 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 08:50:04 crc kubenswrapper[4886]: I0314 08:50:04.987409 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Mar 14 08:50:05 crc kubenswrapper[4886]: I0314 08:50:05.267158 4886 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e6e3f18-a42d-4d01-8df0-6dfc736974fc","Type":"ContainerStarted","Data":"8724e40188eace52b94c6fae5458724d2da8f36a9f87806d291011b9defc7745"} Mar 14 08:50:05 crc kubenswrapper[4886]: I0314 08:50:05.269310 4886 generic.go:334] "Generic (PLEG): container finished" podID="4ede23dd-82b8-42ef-bdcd-d4be5637e457" containerID="c7daa5ec8881475125e070d27c16827991310384862a54e501aab33e4edb2588" exitCode=0 Mar 14 08:50:05 crc kubenswrapper[4886]: I0314 08:50:05.270441 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4ede23dd-82b8-42ef-bdcd-d4be5637e457","Type":"ContainerDied","Data":"c7daa5ec8881475125e070d27c16827991310384862a54e501aab33e4edb2588"} Mar 14 08:50:05 crc kubenswrapper[4886]: I0314 08:50:05.459071 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7c5789dc8f-4vv5f"] Mar 14 08:50:06 crc kubenswrapper[4886]: I0314 08:50:06.278827 4886 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 08:50:06 crc kubenswrapper[4886]: I0314 08:50:06.616753 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Mar 14 08:50:07 crc kubenswrapper[4886]: W0314 08:50:07.747662 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f9329a3_9a35_49a2_86ce_435b98d280f3.slice/crio-bdde15edc1885ed8336cb76b4a687705dffefc5b3ad001c2ed65a520ba273633 WatchSource:0}: Error finding container bdde15edc1885ed8336cb76b4a687705dffefc5b3ad001c2ed65a520ba273633: Status 404 returned error can't find the container with id bdde15edc1885ed8336cb76b4a687705dffefc5b3ad001c2ed65a520ba273633 Mar 14 08:50:08 crc kubenswrapper[4886]: I0314 08:50:08.145569 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/horizon-66c6bc56b6-25jn4" Mar 14 08:50:08 crc kubenswrapper[4886]: I0314 08:50:08.145903 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-66c6bc56b6-25jn4" Mar 14 08:50:08 crc kubenswrapper[4886]: I0314 08:50:08.183038 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Mar 14 08:50:08 crc kubenswrapper[4886]: I0314 08:50:08.216826 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Mar 14 08:50:08 crc kubenswrapper[4886]: I0314 08:50:08.239375 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Mar 14 08:50:08 crc kubenswrapper[4886]: I0314 08:50:08.266862 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7769c88f5b-8gr9x" Mar 14 08:50:08 crc kubenswrapper[4886]: I0314 08:50:08.268105 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7769c88f5b-8gr9x" Mar 14 08:50:08 crc kubenswrapper[4886]: I0314 08:50:08.290207 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Mar 14 08:50:08 crc kubenswrapper[4886]: I0314 08:50:08.300471 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" event={"ID":"ce19460f-e7c6-4212-8e85-ca624f694ac9","Type":"ContainerStarted","Data":"6c0f1ceac2ede1fec8ce759945df6b0df7eb129151dd8a095858b5c03ed6bf13"} Mar 14 08:50:08 crc kubenswrapper[4886]: I0314 08:50:08.301636 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c5789dc8f-4vv5f" event={"ID":"9f9329a3-9a35-49a2-86ce-435b98d280f3","Type":"ContainerStarted","Data":"bdde15edc1885ed8336cb76b4a687705dffefc5b3ad001c2ed65a520ba273633"} Mar 14 08:50:08 crc kubenswrapper[4886]: I0314 08:50:08.302095 4886 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Mar 14 08:50:08 crc kubenswrapper[4886]: I0314 08:50:08.338988 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Mar 14 08:50:08 crc kubenswrapper[4886]: I0314 08:50:08.340380 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Mar 14 08:50:08 crc kubenswrapper[4886]: I0314 08:50:08.398495 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Mar 14 08:50:08 crc kubenswrapper[4886]: I0314 08:50:08.421703 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 14 08:50:09 crc kubenswrapper[4886]: I0314 08:50:09.315388 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e6e3f18-a42d-4d01-8df0-6dfc736974fc","Type":"ContainerStarted","Data":"96f8cdebff573a8c3e09bc3e2053ceb6cdb6393a88427bd750d6b1506ae2441f"} Mar 14 08:50:09 crc kubenswrapper[4886]: I0314 08:50:09.319894 4886 generic.go:334] "Generic (PLEG): container finished" podID="8bba36a0-936a-4da3-a23b-79068d0c437c" containerID="7549d59a1101a59e8fc3fd0f7cc18a420bea35c7279ccb049806789c71220744" exitCode=0 Mar 14 08:50:09 crc kubenswrapper[4886]: I0314 08:50:09.320370 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557970-k52mf" event={"ID":"8bba36a0-936a-4da3-a23b-79068d0c437c","Type":"ContainerDied","Data":"7549d59a1101a59e8fc3fd0f7cc18a420bea35c7279ccb049806789c71220744"} Mar 14 08:50:09 crc kubenswrapper[4886]: I0314 08:50:09.324857 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c5789dc8f-4vv5f" event={"ID":"9f9329a3-9a35-49a2-86ce-435b98d280f3","Type":"ContainerStarted","Data":"ddfe2022d1041feed4b3664b5ffa4eac881559b738c1046c39be568571c63d8f"} Mar 14 08:50:09 crc kubenswrapper[4886]: I0314 
08:50:09.325332 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" Mar 14 08:50:09 crc kubenswrapper[4886]: I0314 08:50:09.353275 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" podStartSLOduration=10.353253233 podStartE2EDuration="10.353253233s" podCreationTimestamp="2026-03-14 08:49:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:50:09.34918407 +0000 UTC m=+1344.597635707" watchObservedRunningTime="2026-03-14 08:50:09.353253233 +0000 UTC m=+1344.601704870" Mar 14 08:50:09 crc kubenswrapper[4886]: I0314 08:50:09.988017 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Mar 14 08:50:09 crc kubenswrapper[4886]: I0314 08:50:09.996915 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Mar 14 08:50:10 crc kubenswrapper[4886]: I0314 08:50:10.338980 4886 generic.go:334] "Generic (PLEG): container finished" podID="27d82e0e-40f0-45e0-b92f-df553a24bc5b" containerID="5bbd87714301a05e3600057a773994a2e440bb56d6be08fed15027ac487ffe88" exitCode=0 Mar 14 08:50:10 crc kubenswrapper[4886]: I0314 08:50:10.339077 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vwjh5" event={"ID":"27d82e0e-40f0-45e0-b92f-df553a24bc5b","Type":"ContainerDied","Data":"5bbd87714301a05e3600057a773994a2e440bb56d6be08fed15027ac487ffe88"} Mar 14 08:50:10 crc kubenswrapper[4886]: I0314 08:50:10.339709 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="b41275dd-03d8-40b8-9f06-0dc67ecb12e6" containerName="watcher-decision-engine" containerID="cri-o://e1778bedcc348d4d67175df99f9347a63ee4172f4ef0229c5b4f4bc1b3a5f4d6" gracePeriod=30 Mar 14 08:50:10 crc 
kubenswrapper[4886]: I0314 08:50:10.339850 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-applier-0" podUID="54e651d7-c7c1-46d6-9097-f769d9e64d4e" containerName="watcher-applier" containerID="cri-o://878a0fa57cd75beeab0b28bd00eb9b8ec0e1f82ad48e576a40cb00e9f6addb42" gracePeriod=30 Mar 14 08:50:10 crc kubenswrapper[4886]: I0314 08:50:10.355895 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Mar 14 08:50:11 crc kubenswrapper[4886]: I0314 08:50:11.351060 4886 generic.go:334] "Generic (PLEG): container finished" podID="29f258e4-6012-4807-95a6-cce9ee5af3d8" containerID="f09c8a9767fae0dfb3082349002eaeac96b929963e09bcf10f2f0d8effd2be84" exitCode=0 Mar 14 08:50:11 crc kubenswrapper[4886]: I0314 08:50:11.351171 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ctl2l" event={"ID":"29f258e4-6012-4807-95a6-cce9ee5af3d8","Type":"ContainerDied","Data":"f09c8a9767fae0dfb3082349002eaeac96b929963e09bcf10f2f0d8effd2be84"} Mar 14 08:50:11 crc kubenswrapper[4886]: I0314 08:50:11.359762 4886 generic.go:334] "Generic (PLEG): container finished" podID="54e651d7-c7c1-46d6-9097-f769d9e64d4e" containerID="878a0fa57cd75beeab0b28bd00eb9b8ec0e1f82ad48e576a40cb00e9f6addb42" exitCode=0 Mar 14 08:50:11 crc kubenswrapper[4886]: I0314 08:50:11.360616 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"54e651d7-c7c1-46d6-9097-f769d9e64d4e","Type":"ContainerDied","Data":"878a0fa57cd75beeab0b28bd00eb9b8ec0e1f82ad48e576a40cb00e9f6addb42"} Mar 14 08:50:12 crc kubenswrapper[4886]: I0314 08:50:12.119939 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7496c4d65c-dg8pn" Mar 14 08:50:12 crc kubenswrapper[4886]: I0314 08:50:12.378616 4886 generic.go:334] "Generic (PLEG): container finished" podID="e521aeb3-adb2-4042-ad11-33d749d5506b" 
containerID="10d580258be851ff0edd037a92b42e6ce275f0593a0f37fac83cb417b50e058f" exitCode=0 Mar 14 08:50:12 crc kubenswrapper[4886]: I0314 08:50:12.379206 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8wtzc" event={"ID":"e521aeb3-adb2-4042-ad11-33d749d5506b","Type":"ContainerDied","Data":"10d580258be851ff0edd037a92b42e6ce275f0593a0f37fac83cb417b50e058f"} Mar 14 08:50:13 crc kubenswrapper[4886]: E0314 08:50:13.240797 4886 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 878a0fa57cd75beeab0b28bd00eb9b8ec0e1f82ad48e576a40cb00e9f6addb42 is running failed: container process not found" containerID="878a0fa57cd75beeab0b28bd00eb9b8ec0e1f82ad48e576a40cb00e9f6addb42" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Mar 14 08:50:13 crc kubenswrapper[4886]: E0314 08:50:13.241207 4886 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 878a0fa57cd75beeab0b28bd00eb9b8ec0e1f82ad48e576a40cb00e9f6addb42 is running failed: container process not found" containerID="878a0fa57cd75beeab0b28bd00eb9b8ec0e1f82ad48e576a40cb00e9f6addb42" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Mar 14 08:50:13 crc kubenswrapper[4886]: E0314 08:50:13.241485 4886 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 878a0fa57cd75beeab0b28bd00eb9b8ec0e1f82ad48e576a40cb00e9f6addb42 is running failed: container process not found" containerID="878a0fa57cd75beeab0b28bd00eb9b8ec0e1f82ad48e576a40cb00e9f6addb42" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Mar 14 08:50:13 crc kubenswrapper[4886]: E0314 08:50:13.241522 4886 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
878a0fa57cd75beeab0b28bd00eb9b8ec0e1f82ad48e576a40cb00e9f6addb42 is running failed: container process not found" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="54e651d7-c7c1-46d6-9097-f769d9e64d4e" containerName="watcher-applier" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.446381 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vwjh5" event={"ID":"27d82e0e-40f0-45e0-b92f-df553a24bc5b","Type":"ContainerDied","Data":"45a14a85f0e4f5733e686d5f1a9a0fef4b4d91ae7c901514674aa6ed08872a70"} Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.446693 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45a14a85f0e4f5733e686d5f1a9a0fef4b4d91ae7c901514674aa6ed08872a70" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.452264 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ctl2l" event={"ID":"29f258e4-6012-4807-95a6-cce9ee5af3d8","Type":"ContainerDied","Data":"3b56229fbbd846fd6cf20640a98455cc0efe1826ecc0aee478b1ea5b165809d3"} Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.452313 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b56229fbbd846fd6cf20640a98455cc0efe1826ecc0aee478b1ea5b165809d3" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.458786 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557970-k52mf" event={"ID":"8bba36a0-936a-4da3-a23b-79068d0c437c","Type":"ContainerDied","Data":"499e37063712832539c4e069eeb8a574790cddcf422dc2c12a79f8882d1eaba4"} Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.458834 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="499e37063712832539c4e069eeb8a574790cddcf422dc2c12a79f8882d1eaba4" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.471418 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"4ede23dd-82b8-42ef-bdcd-d4be5637e457","Type":"ContainerDied","Data":"bfca6d207ff3da65f25d2a68af19eb38af712746dd8b0fe2cc883e0759410c6c"} Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.471464 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfca6d207ff3da65f25d2a68af19eb38af712746dd8b0fe2cc883e0759410c6c" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.551878 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vwjh5" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.598654 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557970-k52mf" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.624958 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.639309 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-ctl2l" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.685719 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-scripts\") pod \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\" (UID: \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\") " Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.685833 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-fernet-keys\") pod \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\" (UID: \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\") " Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.685916 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-config-data\") pod \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\" (UID: \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\") " Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.685999 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-credential-keys\") pod \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\" (UID: \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\") " Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.686106 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-combined-ca-bundle\") pod \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\" (UID: \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\") " Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.686158 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwp58\" (UniqueName: 
\"kubernetes.io/projected/27d82e0e-40f0-45e0-b92f-df553a24bc5b-kube-api-access-rwp58\") pod \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\" (UID: \"27d82e0e-40f0-45e0-b92f-df553a24bc5b\") " Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.693756 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-scripts" (OuterVolumeSpecName: "scripts") pod "27d82e0e-40f0-45e0-b92f-df553a24bc5b" (UID: "27d82e0e-40f0-45e0-b92f-df553a24bc5b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.699331 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "27d82e0e-40f0-45e0-b92f-df553a24bc5b" (UID: "27d82e0e-40f0-45e0-b92f-df553a24bc5b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.703234 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "27d82e0e-40f0-45e0-b92f-df553a24bc5b" (UID: "27d82e0e-40f0-45e0-b92f-df553a24bc5b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.720403 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27d82e0e-40f0-45e0-b92f-df553a24bc5b-kube-api-access-rwp58" (OuterVolumeSpecName: "kube-api-access-rwp58") pod "27d82e0e-40f0-45e0-b92f-df553a24bc5b" (UID: "27d82e0e-40f0-45e0-b92f-df553a24bc5b"). InnerVolumeSpecName "kube-api-access-rwp58". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.730100 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.742404 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27d82e0e-40f0-45e0-b92f-df553a24bc5b" (UID: "27d82e0e-40f0-45e0-b92f-df553a24bc5b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.750230 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-config-data" (OuterVolumeSpecName: "config-data") pod "27d82e0e-40f0-45e0-b92f-df553a24bc5b" (UID: "27d82e0e-40f0-45e0-b92f-df553a24bc5b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.788467 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ede23dd-82b8-42ef-bdcd-d4be5637e457-logs\") pod \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.788527 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29f258e4-6012-4807-95a6-cce9ee5af3d8-logs\") pod \"29f258e4-6012-4807-95a6-cce9ee5af3d8\" (UID: \"29f258e4-6012-4807-95a6-cce9ee5af3d8\") " Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.788704 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e651d7-c7c1-46d6-9097-f769d9e64d4e-combined-ca-bundle\") pod \"54e651d7-c7c1-46d6-9097-f769d9e64d4e\" (UID: \"54e651d7-c7c1-46d6-9097-f769d9e64d4e\") " Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.788763 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29f258e4-6012-4807-95a6-cce9ee5af3d8-scripts\") pod \"29f258e4-6012-4807-95a6-cce9ee5af3d8\" (UID: \"29f258e4-6012-4807-95a6-cce9ee5af3d8\") " Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.788790 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54e651d7-c7c1-46d6-9097-f769d9e64d4e-config-data\") pod \"54e651d7-c7c1-46d6-9097-f769d9e64d4e\" (UID: \"54e651d7-c7c1-46d6-9097-f769d9e64d4e\") " Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.788820 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29f258e4-6012-4807-95a6-cce9ee5af3d8-logs" (OuterVolumeSpecName: "logs") 
pod "29f258e4-6012-4807-95a6-cce9ee5af3d8" (UID: "29f258e4-6012-4807-95a6-cce9ee5af3d8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.788852 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7kzj\" (UniqueName: \"kubernetes.io/projected/29f258e4-6012-4807-95a6-cce9ee5af3d8-kube-api-access-t7kzj\") pod \"29f258e4-6012-4807-95a6-cce9ee5af3d8\" (UID: \"29f258e4-6012-4807-95a6-cce9ee5af3d8\") " Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.788882 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ede23dd-82b8-42ef-bdcd-d4be5637e457-logs" (OuterVolumeSpecName: "logs") pod "4ede23dd-82b8-42ef-bdcd-d4be5637e457" (UID: "4ede23dd-82b8-42ef-bdcd-d4be5637e457"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.788940 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ede23dd-82b8-42ef-bdcd-d4be5637e457-combined-ca-bundle\") pod \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.788966 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k9wg\" (UniqueName: \"kubernetes.io/projected/54e651d7-c7c1-46d6-9097-f769d9e64d4e-kube-api-access-5k9wg\") pod \"54e651d7-c7c1-46d6-9097-f769d9e64d4e\" (UID: \"54e651d7-c7c1-46d6-9097-f769d9e64d4e\") " Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.788998 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54e651d7-c7c1-46d6-9097-f769d9e64d4e-logs\") pod \"54e651d7-c7c1-46d6-9097-f769d9e64d4e\" (UID: 
\"54e651d7-c7c1-46d6-9097-f769d9e64d4e\") " Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.789021 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvhf8\" (UniqueName: \"kubernetes.io/projected/4ede23dd-82b8-42ef-bdcd-d4be5637e457-kube-api-access-hvhf8\") pod \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.789039 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4ede23dd-82b8-42ef-bdcd-d4be5637e457-httpd-run\") pod \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.789071 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29f258e4-6012-4807-95a6-cce9ee5af3d8-config-data\") pod \"29f258e4-6012-4807-95a6-cce9ee5af3d8\" (UID: \"29f258e4-6012-4807-95a6-cce9ee5af3d8\") " Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.789088 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29f258e4-6012-4807-95a6-cce9ee5af3d8-combined-ca-bundle\") pod \"29f258e4-6012-4807-95a6-cce9ee5af3d8\" (UID: \"29f258e4-6012-4807-95a6-cce9ee5af3d8\") " Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.789111 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ede23dd-82b8-42ef-bdcd-d4be5637e457-internal-tls-certs\") pod \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.792642 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/29f258e4-6012-4807-95a6-cce9ee5af3d8-kube-api-access-t7kzj" (OuterVolumeSpecName: "kube-api-access-t7kzj") pod "29f258e4-6012-4807-95a6-cce9ee5af3d8" (UID: "29f258e4-6012-4807-95a6-cce9ee5af3d8"). InnerVolumeSpecName "kube-api-access-t7kzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.793354 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29f258e4-6012-4807-95a6-cce9ee5af3d8-scripts" (OuterVolumeSpecName: "scripts") pod "29f258e4-6012-4807-95a6-cce9ee5af3d8" (UID: "29f258e4-6012-4807-95a6-cce9ee5af3d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.797826 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ede23dd-82b8-42ef-bdcd-d4be5637e457-kube-api-access-hvhf8" (OuterVolumeSpecName: "kube-api-access-hvhf8") pod "4ede23dd-82b8-42ef-bdcd-d4be5637e457" (UID: "4ede23dd-82b8-42ef-bdcd-d4be5637e457"). InnerVolumeSpecName "kube-api-access-hvhf8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.802923 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ede23dd-82b8-42ef-bdcd-d4be5637e457-config-data\") pod \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.802969 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ede23dd-82b8-42ef-bdcd-d4be5637e457-scripts\") pod \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.802990 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\" (UID: \"4ede23dd-82b8-42ef-bdcd-d4be5637e457\") " Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.803028 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt7g5\" (UniqueName: \"kubernetes.io/projected/8bba36a0-936a-4da3-a23b-79068d0c437c-kube-api-access-pt7g5\") pod \"8bba36a0-936a-4da3-a23b-79068d0c437c\" (UID: \"8bba36a0-936a-4da3-a23b-79068d0c437c\") " Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.803703 4886 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.803719 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7kzj\" (UniqueName: \"kubernetes.io/projected/29f258e4-6012-4807-95a6-cce9ee5af3d8-kube-api-access-t7kzj\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:13 crc 
kubenswrapper[4886]: I0314 08:50:13.803729 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.803738 4886 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.803748 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvhf8\" (UniqueName: \"kubernetes.io/projected/4ede23dd-82b8-42ef-bdcd-d4be5637e457-kube-api-access-hvhf8\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.803757 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.803765 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwp58\" (UniqueName: \"kubernetes.io/projected/27d82e0e-40f0-45e0-b92f-df553a24bc5b-kube-api-access-rwp58\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.803774 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ede23dd-82b8-42ef-bdcd-d4be5637e457-logs\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.803782 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29f258e4-6012-4807-95a6-cce9ee5af3d8-logs\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.803790 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/27d82e0e-40f0-45e0-b92f-df553a24bc5b-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.803798 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29f258e4-6012-4807-95a6-cce9ee5af3d8-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.805214 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ede23dd-82b8-42ef-bdcd-d4be5637e457-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4ede23dd-82b8-42ef-bdcd-d4be5637e457" (UID: "4ede23dd-82b8-42ef-bdcd-d4be5637e457"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.805965 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54e651d7-c7c1-46d6-9097-f769d9e64d4e-logs" (OuterVolumeSpecName: "logs") pod "54e651d7-c7c1-46d6-9097-f769d9e64d4e" (UID: "54e651d7-c7c1-46d6-9097-f769d9e64d4e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.816699 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54e651d7-c7c1-46d6-9097-f769d9e64d4e-kube-api-access-5k9wg" (OuterVolumeSpecName: "kube-api-access-5k9wg") pod "54e651d7-c7c1-46d6-9097-f769d9e64d4e" (UID: "54e651d7-c7c1-46d6-9097-f769d9e64d4e"). InnerVolumeSpecName "kube-api-access-5k9wg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.825183 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "4ede23dd-82b8-42ef-bdcd-d4be5637e457" (UID: "4ede23dd-82b8-42ef-bdcd-d4be5637e457"). 
InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.825219 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ede23dd-82b8-42ef-bdcd-d4be5637e457-scripts" (OuterVolumeSpecName: "scripts") pod "4ede23dd-82b8-42ef-bdcd-d4be5637e457" (UID: "4ede23dd-82b8-42ef-bdcd-d4be5637e457"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.833397 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bba36a0-936a-4da3-a23b-79068d0c437c-kube-api-access-pt7g5" (OuterVolumeSpecName: "kube-api-access-pt7g5") pod "8bba36a0-936a-4da3-a23b-79068d0c437c" (UID: "8bba36a0-936a-4da3-a23b-79068d0c437c"). InnerVolumeSpecName "kube-api-access-pt7g5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.863473 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29f258e4-6012-4807-95a6-cce9ee5af3d8-config-data" (OuterVolumeSpecName: "config-data") pod "29f258e4-6012-4807-95a6-cce9ee5af3d8" (UID: "29f258e4-6012-4807-95a6-cce9ee5af3d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.873031 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54e651d7-c7c1-46d6-9097-f769d9e64d4e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54e651d7-c7c1-46d6-9097-f769d9e64d4e" (UID: "54e651d7-c7c1-46d6-9097-f769d9e64d4e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.873585 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ede23dd-82b8-42ef-bdcd-d4be5637e457-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ede23dd-82b8-42ef-bdcd-d4be5637e457" (UID: "4ede23dd-82b8-42ef-bdcd-d4be5637e457"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.894228 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29f258e4-6012-4807-95a6-cce9ee5af3d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29f258e4-6012-4807-95a6-cce9ee5af3d8" (UID: "29f258e4-6012-4807-95a6-cce9ee5af3d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.896797 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-8wtzc" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.915464 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e651d7-c7c1-46d6-9097-f769d9e64d4e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.915493 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ede23dd-82b8-42ef-bdcd-d4be5637e457-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.915502 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k9wg\" (UniqueName: \"kubernetes.io/projected/54e651d7-c7c1-46d6-9097-f769d9e64d4e-kube-api-access-5k9wg\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.915513 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54e651d7-c7c1-46d6-9097-f769d9e64d4e-logs\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.915522 4886 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4ede23dd-82b8-42ef-bdcd-d4be5637e457-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.915530 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29f258e4-6012-4807-95a6-cce9ee5af3d8-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.915538 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29f258e4-6012-4807-95a6-cce9ee5af3d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.915545 4886 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ede23dd-82b8-42ef-bdcd-d4be5637e457-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.915566 4886 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.915575 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt7g5\" (UniqueName: \"kubernetes.io/projected/8bba36a0-936a-4da3-a23b-79068d0c437c-kube-api-access-pt7g5\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.922148 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54e651d7-c7c1-46d6-9097-f769d9e64d4e-config-data" (OuterVolumeSpecName: "config-data") pod "54e651d7-c7c1-46d6-9097-f769d9e64d4e" (UID: "54e651d7-c7c1-46d6-9097-f769d9e64d4e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.926304 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ede23dd-82b8-42ef-bdcd-d4be5637e457-config-data" (OuterVolumeSpecName: "config-data") pod "4ede23dd-82b8-42ef-bdcd-d4be5637e457" (UID: "4ede23dd-82b8-42ef-bdcd-d4be5637e457"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.926343 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ede23dd-82b8-42ef-bdcd-d4be5637e457-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4ede23dd-82b8-42ef-bdcd-d4be5637e457" (UID: "4ede23dd-82b8-42ef-bdcd-d4be5637e457"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:13 crc kubenswrapper[4886]: I0314 08:50:13.959462 4886 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.016148 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e521aeb3-adb2-4042-ad11-33d749d5506b-combined-ca-bundle\") pod \"e521aeb3-adb2-4042-ad11-33d749d5506b\" (UID: \"e521aeb3-adb2-4042-ad11-33d749d5506b\") " Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.016251 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e521aeb3-adb2-4042-ad11-33d749d5506b-db-sync-config-data\") pod \"e521aeb3-adb2-4042-ad11-33d749d5506b\" (UID: \"e521aeb3-adb2-4042-ad11-33d749d5506b\") " Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.016284 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqh2p\" (UniqueName: \"kubernetes.io/projected/e521aeb3-adb2-4042-ad11-33d749d5506b-kube-api-access-bqh2p\") pod \"e521aeb3-adb2-4042-ad11-33d749d5506b\" (UID: \"e521aeb3-adb2-4042-ad11-33d749d5506b\") " Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.017594 4886 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ede23dd-82b8-42ef-bdcd-d4be5637e457-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.017609 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ede23dd-82b8-42ef-bdcd-d4be5637e457-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.017648 4886 reconciler_common.go:293] 
"Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.017667 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54e651d7-c7c1-46d6-9097-f769d9e64d4e-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.020048 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e521aeb3-adb2-4042-ad11-33d749d5506b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e521aeb3-adb2-4042-ad11-33d749d5506b" (UID: "e521aeb3-adb2-4042-ad11-33d749d5506b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.020692 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e521aeb3-adb2-4042-ad11-33d749d5506b-kube-api-access-bqh2p" (OuterVolumeSpecName: "kube-api-access-bqh2p") pod "e521aeb3-adb2-4042-ad11-33d749d5506b" (UID: "e521aeb3-adb2-4042-ad11-33d749d5506b"). InnerVolumeSpecName "kube-api-access-bqh2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.043240 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e521aeb3-adb2-4042-ad11-33d749d5506b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e521aeb3-adb2-4042-ad11-33d749d5506b" (UID: "e521aeb3-adb2-4042-ad11-33d749d5506b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.120102 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e521aeb3-adb2-4042-ad11-33d749d5506b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.120435 4886 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e521aeb3-adb2-4042-ad11-33d749d5506b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.120445 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqh2p\" (UniqueName: \"kubernetes.io/projected/e521aeb3-adb2-4042-ad11-33d749d5506b-kube-api-access-bqh2p\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.181688 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.182097 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="46fb99a1-d0f5-4538-80d4-9d17ffe6bed3" containerName="watcher-api-log" containerID="cri-o://ef351e23ca052eae7bcf7ffb583b52fe22b4f34d2d46be687a34a68e47050295" gracePeriod=30 Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.182203 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="46fb99a1-d0f5-4538-80d4-9d17ffe6bed3" containerName="watcher-api" containerID="cri-o://33dd9fa372fec5d26f92e9ebc9abae7dab0da9bcc9d7ab17f90552563afe0963" gracePeriod=30 Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.482679 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8wtzc" 
event={"ID":"e521aeb3-adb2-4042-ad11-33d749d5506b","Type":"ContainerDied","Data":"14532d4740a3a1638ccea934054fdead6a2fb3350d314eda5705f6f49cc838f9"} Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.482717 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14532d4740a3a1638ccea934054fdead6a2fb3350d314eda5705f6f49cc838f9" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.482735 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8wtzc" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.487900 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e6e3f18-a42d-4d01-8df0-6dfc736974fc","Type":"ContainerStarted","Data":"47606a45613cf7114a76c400c1843cb813019eb670c001e863dc21927c75e3fe"} Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.489647 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9eb9137-a021-4ea6-a4a4-871cf81af732","Type":"ContainerStarted","Data":"367f53ab04c0e461f4791a68acf7e0404c669ac72803a129efad6ed8cc70f869"} Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.491671 4886 generic.go:334] "Generic (PLEG): container finished" podID="46fb99a1-d0f5-4538-80d4-9d17ffe6bed3" containerID="ef351e23ca052eae7bcf7ffb583b52fe22b4f34d2d46be687a34a68e47050295" exitCode=143 Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.491746 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3","Type":"ContainerDied","Data":"ef351e23ca052eae7bcf7ffb583b52fe22b4f34d2d46be687a34a68e47050295"} Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.495706 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c5789dc8f-4vv5f" 
event={"ID":"9f9329a3-9a35-49a2-86ce-435b98d280f3","Type":"ContainerStarted","Data":"34737c15ff4a6a3340677a2fd809063ddbd01fc9f3a607e718c5b2480379f86d"} Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.496482 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7c5789dc8f-4vv5f" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.497931 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ctl2l" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.499207 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vwjh5" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.501460 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557970-k52mf" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.501509 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"54e651d7-c7c1-46d6-9097-f769d9e64d4e","Type":"ContainerDied","Data":"feac57f6227b7148b34b7e0c10ad442d30e4d6ae539c07b02ae2013658df4031"} Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.501560 4886 scope.go:117] "RemoveContainer" containerID="878a0fa57cd75beeab0b28bd00eb9b8ec0e1f82ad48e576a40cb00e9f6addb42" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.501634 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.501559 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.533675 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=11.533659365 podStartE2EDuration="11.533659365s" podCreationTimestamp="2026-03-14 08:50:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:50:14.514613286 +0000 UTC m=+1349.763064923" watchObservedRunningTime="2026-03-14 08:50:14.533659365 +0000 UTC m=+1349.782111002" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.559029 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7c5789dc8f-4vv5f" podStartSLOduration=12.559011899 podStartE2EDuration="12.559011899s" podCreationTimestamp="2026-03-14 08:50:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:50:14.556039867 +0000 UTC m=+1349.804491504" watchObservedRunningTime="2026-03-14 08:50:14.559011899 +0000 UTC m=+1349.807463536" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.601453 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.646198 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.666848 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.693219 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-applier-0"] Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.716982 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Mar 14 
08:50:14 crc kubenswrapper[4886]: E0314 08:50:14.717621 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ede23dd-82b8-42ef-bdcd-d4be5637e457" containerName="glance-log" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.717692 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ede23dd-82b8-42ef-bdcd-d4be5637e457" containerName="glance-log" Mar 14 08:50:14 crc kubenswrapper[4886]: E0314 08:50:14.717744 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e521aeb3-adb2-4042-ad11-33d749d5506b" containerName="barbican-db-sync" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.717792 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e521aeb3-adb2-4042-ad11-33d749d5506b" containerName="barbican-db-sync" Mar 14 08:50:14 crc kubenswrapper[4886]: E0314 08:50:14.717849 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54e651d7-c7c1-46d6-9097-f769d9e64d4e" containerName="watcher-applier" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.717898 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="54e651d7-c7c1-46d6-9097-f769d9e64d4e" containerName="watcher-applier" Mar 14 08:50:14 crc kubenswrapper[4886]: E0314 08:50:14.717961 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29f258e4-6012-4807-95a6-cce9ee5af3d8" containerName="placement-db-sync" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.718018 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="29f258e4-6012-4807-95a6-cce9ee5af3d8" containerName="placement-db-sync" Mar 14 08:50:14 crc kubenswrapper[4886]: E0314 08:50:14.718086 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bba36a0-936a-4da3-a23b-79068d0c437c" containerName="oc" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.718156 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bba36a0-936a-4da3-a23b-79068d0c437c" containerName="oc" Mar 14 08:50:14 crc kubenswrapper[4886]: E0314 
08:50:14.718212 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ede23dd-82b8-42ef-bdcd-d4be5637e457" containerName="glance-httpd" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.718260 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ede23dd-82b8-42ef-bdcd-d4be5637e457" containerName="glance-httpd" Mar 14 08:50:14 crc kubenswrapper[4886]: E0314 08:50:14.718326 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27d82e0e-40f0-45e0-b92f-df553a24bc5b" containerName="keystone-bootstrap" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.718384 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="27d82e0e-40f0-45e0-b92f-df553a24bc5b" containerName="keystone-bootstrap" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.718596 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bba36a0-936a-4da3-a23b-79068d0c437c" containerName="oc" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.718655 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="54e651d7-c7c1-46d6-9097-f769d9e64d4e" containerName="watcher-applier" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.718718 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ede23dd-82b8-42ef-bdcd-d4be5637e457" containerName="glance-httpd" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.718768 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="e521aeb3-adb2-4042-ad11-33d749d5506b" containerName="barbican-db-sync" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.718847 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="27d82e0e-40f0-45e0-b92f-df553a24bc5b" containerName="keystone-bootstrap" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.718905 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="29f258e4-6012-4807-95a6-cce9ee5af3d8" containerName="placement-db-sync" Mar 14 08:50:14 crc kubenswrapper[4886]: 
I0314 08:50:14.718972 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ede23dd-82b8-42ef-bdcd-d4be5637e457" containerName="glance-log" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.719675 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.724249 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.751642 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.753392 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.782766 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.791338 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.810315 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.850328 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b369685-07a0-4802-a5ed-d6288ed9b1c3-config-data\") pod \"watcher-applier-0\" (UID: \"2b369685-07a0-4802-a5ed-d6288ed9b1c3\") " pod="openstack/watcher-applier-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.850383 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c271fab-7815-4aab-86c5-3e3919077e2e-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.850416 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c271fab-7815-4aab-86c5-3e3919077e2e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.850453 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.850509 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b369685-07a0-4802-a5ed-d6288ed9b1c3-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"2b369685-07a0-4802-a5ed-d6288ed9b1c3\") " pod="openstack/watcher-applier-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.850868 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl9lk\" (UniqueName: \"kubernetes.io/projected/2b369685-07a0-4802-a5ed-d6288ed9b1c3-kube-api-access-dl9lk\") pod \"watcher-applier-0\" (UID: \"2b369685-07a0-4802-a5ed-d6288ed9b1c3\") " pod="openstack/watcher-applier-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.861117 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49dn8\" (UniqueName: \"kubernetes.io/projected/8c271fab-7815-4aab-86c5-3e3919077e2e-kube-api-access-49dn8\") 
pod \"glance-default-internal-api-0\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.861353 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b369685-07a0-4802-a5ed-d6288ed9b1c3-logs\") pod \"watcher-applier-0\" (UID: \"2b369685-07a0-4802-a5ed-d6288ed9b1c3\") " pod="openstack/watcher-applier-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.861417 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c271fab-7815-4aab-86c5-3e3919077e2e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.861512 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c271fab-7815-4aab-86c5-3e3919077e2e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.863305 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c271fab-7815-4aab-86c5-3e3919077e2e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.863520 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c271fab-7815-4aab-86c5-3e3919077e2e-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.906769 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.953096 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-d4575995d-lfmv5"] Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.955544 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-d4575995d-lfmv5" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.968593 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c271fab-7815-4aab-86c5-3e3919077e2e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.968642 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c271fab-7815-4aab-86c5-3e3919077e2e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.968711 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c271fab-7815-4aab-86c5-3e3919077e2e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.968740 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2b369685-07a0-4802-a5ed-d6288ed9b1c3-config-data\") pod \"watcher-applier-0\" (UID: \"2b369685-07a0-4802-a5ed-d6288ed9b1c3\") " pod="openstack/watcher-applier-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.968760 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c271fab-7815-4aab-86c5-3e3919077e2e-logs\") pod \"glance-default-internal-api-0\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.968780 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c271fab-7815-4aab-86c5-3e3919077e2e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.968801 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.968828 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b369685-07a0-4802-a5ed-d6288ed9b1c3-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"2b369685-07a0-4802-a5ed-d6288ed9b1c3\") " pod="openstack/watcher-applier-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.968858 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl9lk\" (UniqueName: \"kubernetes.io/projected/2b369685-07a0-4802-a5ed-d6288ed9b1c3-kube-api-access-dl9lk\") pod \"watcher-applier-0\" (UID: 
\"2b369685-07a0-4802-a5ed-d6288ed9b1c3\") " pod="openstack/watcher-applier-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.968876 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49dn8\" (UniqueName: \"kubernetes.io/projected/8c271fab-7815-4aab-86c5-3e3919077e2e-kube-api-access-49dn8\") pod \"glance-default-internal-api-0\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.968917 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b369685-07a0-4802-a5ed-d6288ed9b1c3-logs\") pod \"watcher-applier-0\" (UID: \"2b369685-07a0-4802-a5ed-d6288ed9b1c3\") " pod="openstack/watcher-applier-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.968933 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c271fab-7815-4aab-86c5-3e3919077e2e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.969948 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-lqdcq" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.970217 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.970756 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.974634 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c271fab-7815-4aab-86c5-3e3919077e2e-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.977328 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b369685-07a0-4802-a5ed-d6288ed9b1c3-logs\") pod \"watcher-applier-0\" (UID: \"2b369685-07a0-4802-a5ed-d6288ed9b1c3\") " pod="openstack/watcher-applier-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.978334 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c271fab-7815-4aab-86c5-3e3919077e2e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.978601 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c271fab-7815-4aab-86c5-3e3919077e2e-logs\") pod \"glance-default-internal-api-0\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.979684 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c271fab-7815-4aab-86c5-3e3919077e2e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.980289 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c271fab-7815-4aab-86c5-3e3919077e2e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:50:14 crc 
kubenswrapper[4886]: I0314 08:50:14.981949 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c271fab-7815-4aab-86c5-3e3919077e2e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.984707 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b369685-07a0-4802-a5ed-d6288ed9b1c3-config-data\") pod \"watcher-applier-0\" (UID: \"2b369685-07a0-4802-a5ed-d6288ed9b1c3\") " pod="openstack/watcher-applier-0" Mar 14 08:50:14 crc kubenswrapper[4886]: I0314 08:50:14.997272 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6d5d4f7d47-v8h75"] Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.002040 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.008758 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6d5d4f7d47-v8h75" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.012290 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b369685-07a0-4802-a5ed-d6288ed9b1c3-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"2b369685-07a0-4802-a5ed-d6288ed9b1c3\") " pod="openstack/watcher-applier-0" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.023056 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.023470 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.023812 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-2prkb" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.024021 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.024277 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.024354 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.035393 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5c6c468c99-bvnb4"] Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.036958 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5c6c468c99-bvnb4" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.045531 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.057235 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49dn8\" (UniqueName: \"kubernetes.io/projected/8c271fab-7815-4aab-86c5-3e3919077e2e-kube-api-access-49dn8\") pod \"glance-default-internal-api-0\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.062283 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557964-q999d"] Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.068039 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl9lk\" (UniqueName: \"kubernetes.io/projected/2b369685-07a0-4802-a5ed-d6288ed9b1c3-kube-api-access-dl9lk\") pod \"watcher-applier-0\" (UID: \"2b369685-07a0-4802-a5ed-d6288ed9b1c3\") " pod="openstack/watcher-applier-0" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.077512 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-config-data\") pod \"barbican-worker-5c6c468c99-bvnb4\" (UID: \"1e54e03a-ce8b-4f7d-a664-cc11daa6c786\") " pod="openstack/barbican-worker-5c6c468c99-bvnb4" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.077572 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcskm\" (UniqueName: \"kubernetes.io/projected/18c3fb7c-c71e-4c67-96a2-6e9455e67182-kube-api-access-wcskm\") pod \"keystone-6d5d4f7d47-v8h75\" (UID: \"18c3fb7c-c71e-4c67-96a2-6e9455e67182\") " 
pod="openstack/keystone-6d5d4f7d47-v8h75" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.077614 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-config-data-custom\") pod \"barbican-worker-5c6c468c99-bvnb4\" (UID: \"1e54e03a-ce8b-4f7d-a664-cc11daa6c786\") " pod="openstack/barbican-worker-5c6c468c99-bvnb4" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.077638 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-logs\") pod \"barbican-worker-5c6c468c99-bvnb4\" (UID: \"1e54e03a-ce8b-4f7d-a664-cc11daa6c786\") " pod="openstack/barbican-worker-5c6c468c99-bvnb4" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.077707 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18c3fb7c-c71e-4c67-96a2-6e9455e67182-credential-keys\") pod \"keystone-6d5d4f7d47-v8h75\" (UID: \"18c3fb7c-c71e-4c67-96a2-6e9455e67182\") " pod="openstack/keystone-6d5d4f7d47-v8h75" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.077725 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c3fb7c-c71e-4c67-96a2-6e9455e67182-combined-ca-bundle\") pod \"keystone-6d5d4f7d47-v8h75\" (UID: \"18c3fb7c-c71e-4c67-96a2-6e9455e67182\") " pod="openstack/keystone-6d5d4f7d47-v8h75" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.077805 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c3fb7c-c71e-4c67-96a2-6e9455e67182-config-data\") pod \"keystone-6d5d4f7d47-v8h75\" (UID: 
\"18c3fb7c-c71e-4c67-96a2-6e9455e67182\") " pod="openstack/keystone-6d5d4f7d47-v8h75" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.077838 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18c3fb7c-c71e-4c67-96a2-6e9455e67182-public-tls-certs\") pod \"keystone-6d5d4f7d47-v8h75\" (UID: \"18c3fb7c-c71e-4c67-96a2-6e9455e67182\") " pod="openstack/keystone-6d5d4f7d47-v8h75" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.077874 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18c3fb7c-c71e-4c67-96a2-6e9455e67182-fernet-keys\") pod \"keystone-6d5d4f7d47-v8h75\" (UID: \"18c3fb7c-c71e-4c67-96a2-6e9455e67182\") " pod="openstack/keystone-6d5d4f7d47-v8h75" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.077893 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls5pv\" (UniqueName: \"kubernetes.io/projected/7a298745-dd74-4ed3-b21b-648f2adb47dc-kube-api-access-ls5pv\") pod \"barbican-keystone-listener-d4575995d-lfmv5\" (UID: \"7a298745-dd74-4ed3-b21b-648f2adb47dc\") " pod="openstack/barbican-keystone-listener-d4575995d-lfmv5" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.077937 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-combined-ca-bundle\") pod \"barbican-worker-5c6c468c99-bvnb4\" (UID: \"1e54e03a-ce8b-4f7d-a664-cc11daa6c786\") " pod="openstack/barbican-worker-5c6c468c99-bvnb4" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.077967 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7a298745-dd74-4ed3-b21b-648f2adb47dc-config-data\") pod \"barbican-keystone-listener-d4575995d-lfmv5\" (UID: \"7a298745-dd74-4ed3-b21b-648f2adb47dc\") " pod="openstack/barbican-keystone-listener-d4575995d-lfmv5" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.077993 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a298745-dd74-4ed3-b21b-648f2adb47dc-config-data-custom\") pod \"barbican-keystone-listener-d4575995d-lfmv5\" (UID: \"7a298745-dd74-4ed3-b21b-648f2adb47dc\") " pod="openstack/barbican-keystone-listener-d4575995d-lfmv5" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.078007 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a298745-dd74-4ed3-b21b-648f2adb47dc-logs\") pod \"barbican-keystone-listener-d4575995d-lfmv5\" (UID: \"7a298745-dd74-4ed3-b21b-648f2adb47dc\") " pod="openstack/barbican-keystone-listener-d4575995d-lfmv5" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.078032 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18c3fb7c-c71e-4c67-96a2-6e9455e67182-internal-tls-certs\") pod \"keystone-6d5d4f7d47-v8h75\" (UID: \"18c3fb7c-c71e-4c67-96a2-6e9455e67182\") " pod="openstack/keystone-6d5d4f7d47-v8h75" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.078060 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phdg4\" (UniqueName: \"kubernetes.io/projected/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-kube-api-access-phdg4\") pod \"barbican-worker-5c6c468c99-bvnb4\" (UID: \"1e54e03a-ce8b-4f7d-a664-cc11daa6c786\") " pod="openstack/barbican-worker-5c6c468c99-bvnb4" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.078090 4886 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c3fb7c-c71e-4c67-96a2-6e9455e67182-scripts\") pod \"keystone-6d5d4f7d47-v8h75\" (UID: \"18c3fb7c-c71e-4c67-96a2-6e9455e67182\") " pod="openstack/keystone-6d5d4f7d47-v8h75" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.078165 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a298745-dd74-4ed3-b21b-648f2adb47dc-combined-ca-bundle\") pod \"barbican-keystone-listener-d4575995d-lfmv5\" (UID: \"7a298745-dd74-4ed3-b21b-648f2adb47dc\") " pod="openstack/barbican-keystone-listener-d4575995d-lfmv5" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.101334 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.101926 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.109334 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557964-q999d"] Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.155583 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.182791 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a298745-dd74-4ed3-b21b-648f2adb47dc-combined-ca-bundle\") pod \"barbican-keystone-listener-d4575995d-lfmv5\" (UID: \"7a298745-dd74-4ed3-b21b-648f2adb47dc\") " pod="openstack/barbican-keystone-listener-d4575995d-lfmv5" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.182915 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-config-data\") pod \"barbican-worker-5c6c468c99-bvnb4\" (UID: \"1e54e03a-ce8b-4f7d-a664-cc11daa6c786\") " pod="openstack/barbican-worker-5c6c468c99-bvnb4" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.182967 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcskm\" (UniqueName: \"kubernetes.io/projected/18c3fb7c-c71e-4c67-96a2-6e9455e67182-kube-api-access-wcskm\") pod \"keystone-6d5d4f7d47-v8h75\" (UID: \"18c3fb7c-c71e-4c67-96a2-6e9455e67182\") " pod="openstack/keystone-6d5d4f7d47-v8h75" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.182999 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-config-data-custom\") pod \"barbican-worker-5c6c468c99-bvnb4\" (UID: \"1e54e03a-ce8b-4f7d-a664-cc11daa6c786\") " pod="openstack/barbican-worker-5c6c468c99-bvnb4" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.183045 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-logs\") pod \"barbican-worker-5c6c468c99-bvnb4\" (UID: 
\"1e54e03a-ce8b-4f7d-a664-cc11daa6c786\") " pod="openstack/barbican-worker-5c6c468c99-bvnb4" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.183097 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18c3fb7c-c71e-4c67-96a2-6e9455e67182-credential-keys\") pod \"keystone-6d5d4f7d47-v8h75\" (UID: \"18c3fb7c-c71e-4c67-96a2-6e9455e67182\") " pod="openstack/keystone-6d5d4f7d47-v8h75" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.183143 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c3fb7c-c71e-4c67-96a2-6e9455e67182-combined-ca-bundle\") pod \"keystone-6d5d4f7d47-v8h75\" (UID: \"18c3fb7c-c71e-4c67-96a2-6e9455e67182\") " pod="openstack/keystone-6d5d4f7d47-v8h75" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.183224 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c3fb7c-c71e-4c67-96a2-6e9455e67182-config-data\") pod \"keystone-6d5d4f7d47-v8h75\" (UID: \"18c3fb7c-c71e-4c67-96a2-6e9455e67182\") " pod="openstack/keystone-6d5d4f7d47-v8h75" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.183255 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18c3fb7c-c71e-4c67-96a2-6e9455e67182-public-tls-certs\") pod \"keystone-6d5d4f7d47-v8h75\" (UID: \"18c3fb7c-c71e-4c67-96a2-6e9455e67182\") " pod="openstack/keystone-6d5d4f7d47-v8h75" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.183605 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-logs\") pod \"barbican-worker-5c6c468c99-bvnb4\" (UID: \"1e54e03a-ce8b-4f7d-a664-cc11daa6c786\") " pod="openstack/barbican-worker-5c6c468c99-bvnb4" Mar 14 
08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.187074 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a298745-dd74-4ed3-b21b-648f2adb47dc-combined-ca-bundle\") pod \"barbican-keystone-listener-d4575995d-lfmv5\" (UID: \"7a298745-dd74-4ed3-b21b-648f2adb47dc\") " pod="openstack/barbican-keystone-listener-d4575995d-lfmv5" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.187353 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c3fb7c-c71e-4c67-96a2-6e9455e67182-combined-ca-bundle\") pod \"keystone-6d5d4f7d47-v8h75\" (UID: \"18c3fb7c-c71e-4c67-96a2-6e9455e67182\") " pod="openstack/keystone-6d5d4f7d47-v8h75" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.197025 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18c3fb7c-c71e-4c67-96a2-6e9455e67182-credential-keys\") pod \"keystone-6d5d4f7d47-v8h75\" (UID: \"18c3fb7c-c71e-4c67-96a2-6e9455e67182\") " pod="openstack/keystone-6d5d4f7d47-v8h75" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.197957 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18c3fb7c-c71e-4c67-96a2-6e9455e67182-fernet-keys\") pod \"keystone-6d5d4f7d47-v8h75\" (UID: \"18c3fb7c-c71e-4c67-96a2-6e9455e67182\") " pod="openstack/keystone-6d5d4f7d47-v8h75" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.197995 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls5pv\" (UniqueName: \"kubernetes.io/projected/7a298745-dd74-4ed3-b21b-648f2adb47dc-kube-api-access-ls5pv\") pod \"barbican-keystone-listener-d4575995d-lfmv5\" (UID: \"7a298745-dd74-4ed3-b21b-648f2adb47dc\") " pod="openstack/barbican-keystone-listener-d4575995d-lfmv5" Mar 14 08:50:15 crc 
kubenswrapper[4886]: I0314 08:50:15.198057 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-combined-ca-bundle\") pod \"barbican-worker-5c6c468c99-bvnb4\" (UID: \"1e54e03a-ce8b-4f7d-a664-cc11daa6c786\") " pod="openstack/barbican-worker-5c6c468c99-bvnb4" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.198105 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a298745-dd74-4ed3-b21b-648f2adb47dc-config-data\") pod \"barbican-keystone-listener-d4575995d-lfmv5\" (UID: \"7a298745-dd74-4ed3-b21b-648f2adb47dc\") " pod="openstack/barbican-keystone-listener-d4575995d-lfmv5" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.198158 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a298745-dd74-4ed3-b21b-648f2adb47dc-config-data-custom\") pod \"barbican-keystone-listener-d4575995d-lfmv5\" (UID: \"7a298745-dd74-4ed3-b21b-648f2adb47dc\") " pod="openstack/barbican-keystone-listener-d4575995d-lfmv5" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.198174 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a298745-dd74-4ed3-b21b-648f2adb47dc-logs\") pod \"barbican-keystone-listener-d4575995d-lfmv5\" (UID: \"7a298745-dd74-4ed3-b21b-648f2adb47dc\") " pod="openstack/barbican-keystone-listener-d4575995d-lfmv5" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.198216 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18c3fb7c-c71e-4c67-96a2-6e9455e67182-internal-tls-certs\") pod \"keystone-6d5d4f7d47-v8h75\" (UID: \"18c3fb7c-c71e-4c67-96a2-6e9455e67182\") " pod="openstack/keystone-6d5d4f7d47-v8h75" 
Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.198238 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phdg4\" (UniqueName: \"kubernetes.io/projected/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-kube-api-access-phdg4\") pod \"barbican-worker-5c6c468c99-bvnb4\" (UID: \"1e54e03a-ce8b-4f7d-a664-cc11daa6c786\") " pod="openstack/barbican-worker-5c6c468c99-bvnb4" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.198281 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c3fb7c-c71e-4c67-96a2-6e9455e67182-scripts\") pod \"keystone-6d5d4f7d47-v8h75\" (UID: \"18c3fb7c-c71e-4c67-96a2-6e9455e67182\") " pod="openstack/keystone-6d5d4f7d47-v8h75" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.199623 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-config-data-custom\") pod \"barbican-worker-5c6c468c99-bvnb4\" (UID: \"1e54e03a-ce8b-4f7d-a664-cc11daa6c786\") " pod="openstack/barbican-worker-5c6c468c99-bvnb4" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.201664 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-config-data\") pod \"barbican-worker-5c6c468c99-bvnb4\" (UID: \"1e54e03a-ce8b-4f7d-a664-cc11daa6c786\") " pod="openstack/barbican-worker-5c6c468c99-bvnb4" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.201971 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a298745-dd74-4ed3-b21b-648f2adb47dc-logs\") pod \"barbican-keystone-listener-d4575995d-lfmv5\" (UID: \"7a298745-dd74-4ed3-b21b-648f2adb47dc\") " pod="openstack/barbican-keystone-listener-d4575995d-lfmv5" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 
08:50:15.202806 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c3fb7c-c71e-4c67-96a2-6e9455e67182-config-data\") pod \"keystone-6d5d4f7d47-v8h75\" (UID: \"18c3fb7c-c71e-4c67-96a2-6e9455e67182\") " pod="openstack/keystone-6d5d4f7d47-v8h75" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.206310 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a298745-dd74-4ed3-b21b-648f2adb47dc-config-data-custom\") pod \"barbican-keystone-listener-d4575995d-lfmv5\" (UID: \"7a298745-dd74-4ed3-b21b-648f2adb47dc\") " pod="openstack/barbican-keystone-listener-d4575995d-lfmv5" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.207737 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18c3fb7c-c71e-4c67-96a2-6e9455e67182-internal-tls-certs\") pod \"keystone-6d5d4f7d47-v8h75\" (UID: \"18c3fb7c-c71e-4c67-96a2-6e9455e67182\") " pod="openstack/keystone-6d5d4f7d47-v8h75" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.215623 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18c3fb7c-c71e-4c67-96a2-6e9455e67182-public-tls-certs\") pod \"keystone-6d5d4f7d47-v8h75\" (UID: \"18c3fb7c-c71e-4c67-96a2-6e9455e67182\") " pod="openstack/keystone-6d5d4f7d47-v8h75" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.226027 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcskm\" (UniqueName: \"kubernetes.io/projected/18c3fb7c-c71e-4c67-96a2-6e9455e67182-kube-api-access-wcskm\") pod \"keystone-6d5d4f7d47-v8h75\" (UID: \"18c3fb7c-c71e-4c67-96a2-6e9455e67182\") " pod="openstack/keystone-6d5d4f7d47-v8h75" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.227431 4886 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-combined-ca-bundle\") pod \"barbican-worker-5c6c468c99-bvnb4\" (UID: \"1e54e03a-ce8b-4f7d-a664-cc11daa6c786\") " pod="openstack/barbican-worker-5c6c468c99-bvnb4" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.232189 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a298745-dd74-4ed3-b21b-648f2adb47dc-config-data\") pod \"barbican-keystone-listener-d4575995d-lfmv5\" (UID: \"7a298745-dd74-4ed3-b21b-648f2adb47dc\") " pod="openstack/barbican-keystone-listener-d4575995d-lfmv5" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.232595 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c3fb7c-c71e-4c67-96a2-6e9455e67182-scripts\") pod \"keystone-6d5d4f7d47-v8h75\" (UID: \"18c3fb7c-c71e-4c67-96a2-6e9455e67182\") " pod="openstack/keystone-6d5d4f7d47-v8h75" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.241661 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phdg4\" (UniqueName: \"kubernetes.io/projected/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-kube-api-access-phdg4\") pod \"barbican-worker-5c6c468c99-bvnb4\" (UID: \"1e54e03a-ce8b-4f7d-a664-cc11daa6c786\") " pod="openstack/barbican-worker-5c6c468c99-bvnb4" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.234128 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls5pv\" (UniqueName: \"kubernetes.io/projected/7a298745-dd74-4ed3-b21b-648f2adb47dc-kube-api-access-ls5pv\") pod \"barbican-keystone-listener-d4575995d-lfmv5\" (UID: \"7a298745-dd74-4ed3-b21b-648f2adb47dc\") " pod="openstack/barbican-keystone-listener-d4575995d-lfmv5" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.250889 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-6d5d4f7d47-v8h75"] Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.261080 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18c3fb7c-c71e-4c67-96a2-6e9455e67182-fernet-keys\") pod \"keystone-6d5d4f7d47-v8h75\" (UID: \"18c3fb7c-c71e-4c67-96a2-6e9455e67182\") " pod="openstack/keystone-6d5d4f7d47-v8h75" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.316002 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-lr4pt"] Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.335417 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" podUID="ce19460f-e7c6-4212-8e85-ca624f694ac9" containerName="dnsmasq-dns" containerID="cri-o://6c0f1ceac2ede1fec8ce759945df6b0df7eb129151dd8a095858b5c03ed6bf13" gracePeriod=10 Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.338242 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.393528 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5c6c468c99-bvnb4"] Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.431632 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-d4575995d-lfmv5" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.452560 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6d5d4f7d47-v8h75" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.482508 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5c6c468c99-bvnb4" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.565596 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ede23dd-82b8-42ef-bdcd-d4be5637e457" path="/var/lib/kubelet/pods/4ede23dd-82b8-42ef-bdcd-d4be5637e457/volumes" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.566638 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54e651d7-c7c1-46d6-9097-f769d9e64d4e" path="/var/lib/kubelet/pods/54e651d7-c7c1-46d6-9097-f769d9e64d4e/volumes" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.567101 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a392902-797e-4aff-a525-34c7aaaf36d2" path="/var/lib/kubelet/pods/7a392902-797e-4aff-a525-34c7aaaf36d2/volumes" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.587896 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-d4575995d-lfmv5"] Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.587929 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-4bfks"] Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.589262 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-64b7bfc49d-sz57z"] Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.622866 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-4bfks" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.668751 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-64b7bfc49d-sz57z"] Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.668796 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-4bfks"] Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.668916 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-64b7bfc49d-sz57z" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.668809 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-869f94f9b-5hmcl"] Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.676646 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.681961 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-869f94f9b-5hmcl"] Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.682086 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-869f94f9b-5hmcl" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.685282 4886 generic.go:334] "Generic (PLEG): container finished" podID="ce19460f-e7c6-4212-8e85-ca624f694ac9" containerID="6c0f1ceac2ede1fec8ce759945df6b0df7eb129151dd8a095858b5c03ed6bf13" exitCode=0 Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.685517 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" event={"ID":"ce19460f-e7c6-4212-8e85-ca624f694ac9","Type":"ContainerDied","Data":"6c0f1ceac2ede1fec8ce759945df6b0df7eb129151dd8a095858b5c03ed6bf13"} Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.687403 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-xpzz5" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.687652 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.688282 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.693715 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 14 
08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.693984 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.743560 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-4bfks\" (UID: \"777c1eca-e276-4de3-9523-ee72e0891b05\") " pod="openstack/dnsmasq-dns-688c87cc99-4bfks" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.743767 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-dns-svc\") pod \"dnsmasq-dns-688c87cc99-4bfks\" (UID: \"777c1eca-e276-4de3-9523-ee72e0891b05\") " pod="openstack/dnsmasq-dns-688c87cc99-4bfks" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.743787 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-4bfks\" (UID: \"777c1eca-e276-4de3-9523-ee72e0891b05\") " pod="openstack/dnsmasq-dns-688c87cc99-4bfks" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.743935 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfnct\" (UniqueName: \"kubernetes.io/projected/281ddf53-b107-4afd-b616-101cd308433b-kube-api-access-wfnct\") pod \"barbican-api-64b7bfc49d-sz57z\" (UID: \"281ddf53-b107-4afd-b616-101cd308433b\") " pod="openstack/barbican-api-64b7bfc49d-sz57z" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.744708 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-config\") pod \"dnsmasq-dns-688c87cc99-4bfks\" (UID: \"777c1eca-e276-4de3-9523-ee72e0891b05\") " pod="openstack/dnsmasq-dns-688c87cc99-4bfks" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.747251 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/281ddf53-b107-4afd-b616-101cd308433b-logs\") pod \"barbican-api-64b7bfc49d-sz57z\" (UID: \"281ddf53-b107-4afd-b616-101cd308433b\") " pod="openstack/barbican-api-64b7bfc49d-sz57z" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.747312 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x84v\" (UniqueName: \"kubernetes.io/projected/777c1eca-e276-4de3-9523-ee72e0891b05-kube-api-access-2x84v\") pod \"dnsmasq-dns-688c87cc99-4bfks\" (UID: \"777c1eca-e276-4de3-9523-ee72e0891b05\") " pod="openstack/dnsmasq-dns-688c87cc99-4bfks" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.747342 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-4bfks\" (UID: \"777c1eca-e276-4de3-9523-ee72e0891b05\") " pod="openstack/dnsmasq-dns-688c87cc99-4bfks" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.747371 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/281ddf53-b107-4afd-b616-101cd308433b-config-data\") pod \"barbican-api-64b7bfc49d-sz57z\" (UID: \"281ddf53-b107-4afd-b616-101cd308433b\") " pod="openstack/barbican-api-64b7bfc49d-sz57z" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.747400 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/281ddf53-b107-4afd-b616-101cd308433b-combined-ca-bundle\") pod \"barbican-api-64b7bfc49d-sz57z\" (UID: \"281ddf53-b107-4afd-b616-101cd308433b\") " pod="openstack/barbican-api-64b7bfc49d-sz57z" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.747497 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/281ddf53-b107-4afd-b616-101cd308433b-config-data-custom\") pod \"barbican-api-64b7bfc49d-sz57z\" (UID: \"281ddf53-b107-4afd-b616-101cd308433b\") " pod="openstack/barbican-api-64b7bfc49d-sz57z" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.763092 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-756657585d-2x84b"] Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.810158 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-8557ccd47-58ztp"] Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.810439 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-756657585d-2x84b" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.825665 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-8557ccd47-58ztp" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.831414 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-8557ccd47-58ztp"] Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.845163 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-756657585d-2x84b"] Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.853912 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-dns-svc\") pod \"dnsmasq-dns-688c87cc99-4bfks\" (UID: \"777c1eca-e276-4de3-9523-ee72e0891b05\") " pod="openstack/dnsmasq-dns-688c87cc99-4bfks" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.853955 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-4bfks\" (UID: \"777c1eca-e276-4de3-9523-ee72e0891b05\") " pod="openstack/dnsmasq-dns-688c87cc99-4bfks" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.854010 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-logs\") pod \"placement-869f94f9b-5hmcl\" (UID: \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\") " pod="openstack/placement-869f94f9b-5hmcl" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.854101 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-scripts\") pod \"placement-869f94f9b-5hmcl\" (UID: \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\") " pod="openstack/placement-869f94f9b-5hmcl" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 
08:50:15.854231 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfnct\" (UniqueName: \"kubernetes.io/projected/281ddf53-b107-4afd-b616-101cd308433b-kube-api-access-wfnct\") pod \"barbican-api-64b7bfc49d-sz57z\" (UID: \"281ddf53-b107-4afd-b616-101cd308433b\") " pod="openstack/barbican-api-64b7bfc49d-sz57z" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.854259 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-combined-ca-bundle\") pod \"placement-869f94f9b-5hmcl\" (UID: \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\") " pod="openstack/placement-869f94f9b-5hmcl" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.854480 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-config\") pod \"dnsmasq-dns-688c87cc99-4bfks\" (UID: \"777c1eca-e276-4de3-9523-ee72e0891b05\") " pod="openstack/dnsmasq-dns-688c87cc99-4bfks" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.854518 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-config-data\") pod \"placement-869f94f9b-5hmcl\" (UID: \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\") " pod="openstack/placement-869f94f9b-5hmcl" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.854552 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/281ddf53-b107-4afd-b616-101cd308433b-logs\") pod \"barbican-api-64b7bfc49d-sz57z\" (UID: \"281ddf53-b107-4afd-b616-101cd308433b\") " pod="openstack/barbican-api-64b7bfc49d-sz57z" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.854573 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pf46\" (UniqueName: \"kubernetes.io/projected/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-kube-api-access-4pf46\") pod \"placement-869f94f9b-5hmcl\" (UID: \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\") " pod="openstack/placement-869f94f9b-5hmcl" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.854607 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x84v\" (UniqueName: \"kubernetes.io/projected/777c1eca-e276-4de3-9523-ee72e0891b05-kube-api-access-2x84v\") pod \"dnsmasq-dns-688c87cc99-4bfks\" (UID: \"777c1eca-e276-4de3-9523-ee72e0891b05\") " pod="openstack/dnsmasq-dns-688c87cc99-4bfks" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.854627 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-4bfks\" (UID: \"777c1eca-e276-4de3-9523-ee72e0891b05\") " pod="openstack/dnsmasq-dns-688c87cc99-4bfks" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.854651 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/281ddf53-b107-4afd-b616-101cd308433b-config-data\") pod \"barbican-api-64b7bfc49d-sz57z\" (UID: \"281ddf53-b107-4afd-b616-101cd308433b\") " pod="openstack/barbican-api-64b7bfc49d-sz57z" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.854677 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/281ddf53-b107-4afd-b616-101cd308433b-combined-ca-bundle\") pod \"barbican-api-64b7bfc49d-sz57z\" (UID: \"281ddf53-b107-4afd-b616-101cd308433b\") " pod="openstack/barbican-api-64b7bfc49d-sz57z" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.855057 4886 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-dns-svc\") pod \"dnsmasq-dns-688c87cc99-4bfks\" (UID: \"777c1eca-e276-4de3-9523-ee72e0891b05\") " pod="openstack/dnsmasq-dns-688c87cc99-4bfks" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.859228 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6fd5f87754-lf26d"] Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.860928 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6fd5f87754-lf26d" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.876381 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-4bfks\" (UID: \"777c1eca-e276-4de3-9523-ee72e0891b05\") " pod="openstack/dnsmasq-dns-688c87cc99-4bfks" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.876856 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-4bfks\" (UID: \"777c1eca-e276-4de3-9523-ee72e0891b05\") " pod="openstack/dnsmasq-dns-688c87cc99-4bfks" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.877949 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/281ddf53-b107-4afd-b616-101cd308433b-logs\") pod \"barbican-api-64b7bfc49d-sz57z\" (UID: \"281ddf53-b107-4afd-b616-101cd308433b\") " pod="openstack/barbican-api-64b7bfc49d-sz57z" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.878391 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-config\") pod 
\"dnsmasq-dns-688c87cc99-4bfks\" (UID: \"777c1eca-e276-4de3-9523-ee72e0891b05\") " pod="openstack/dnsmasq-dns-688c87cc99-4bfks" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.878556 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-internal-tls-certs\") pod \"placement-869f94f9b-5hmcl\" (UID: \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\") " pod="openstack/placement-869f94f9b-5hmcl" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.878602 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-public-tls-certs\") pod \"placement-869f94f9b-5hmcl\" (UID: \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\") " pod="openstack/placement-869f94f9b-5hmcl" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.878640 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/281ddf53-b107-4afd-b616-101cd308433b-config-data-custom\") pod \"barbican-api-64b7bfc49d-sz57z\" (UID: \"281ddf53-b107-4afd-b616-101cd308433b\") " pod="openstack/barbican-api-64b7bfc49d-sz57z" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.878674 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-4bfks\" (UID: \"777c1eca-e276-4de3-9523-ee72e0891b05\") " pod="openstack/dnsmasq-dns-688c87cc99-4bfks" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.879448 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-ovsdbserver-sb\") pod 
\"dnsmasq-dns-688c87cc99-4bfks\" (UID: \"777c1eca-e276-4de3-9523-ee72e0891b05\") " pod="openstack/dnsmasq-dns-688c87cc99-4bfks" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.891003 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/281ddf53-b107-4afd-b616-101cd308433b-combined-ca-bundle\") pod \"barbican-api-64b7bfc49d-sz57z\" (UID: \"281ddf53-b107-4afd-b616-101cd308433b\") " pod="openstack/barbican-api-64b7bfc49d-sz57z" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.909535 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6fd5f87754-lf26d"] Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.923238 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x84v\" (UniqueName: \"kubernetes.io/projected/777c1eca-e276-4de3-9523-ee72e0891b05-kube-api-access-2x84v\") pod \"dnsmasq-dns-688c87cc99-4bfks\" (UID: \"777c1eca-e276-4de3-9523-ee72e0891b05\") " pod="openstack/dnsmasq-dns-688c87cc99-4bfks" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.924230 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/281ddf53-b107-4afd-b616-101cd308433b-config-data\") pod \"barbican-api-64b7bfc49d-sz57z\" (UID: \"281ddf53-b107-4afd-b616-101cd308433b\") " pod="openstack/barbican-api-64b7bfc49d-sz57z" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.945674 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-57dfd898bd-kzdvs"] Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.947672 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-57dfd898bd-kzdvs" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.951975 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/281ddf53-b107-4afd-b616-101cd308433b-config-data-custom\") pod \"barbican-api-64b7bfc49d-sz57z\" (UID: \"281ddf53-b107-4afd-b616-101cd308433b\") " pod="openstack/barbican-api-64b7bfc49d-sz57z" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.987622 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfnct\" (UniqueName: \"kubernetes.io/projected/281ddf53-b107-4afd-b616-101cd308433b-kube-api-access-wfnct\") pod \"barbican-api-64b7bfc49d-sz57z\" (UID: \"281ddf53-b107-4afd-b616-101cd308433b\") " pod="openstack/barbican-api-64b7bfc49d-sz57z" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.991515 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46e268aa-326d-42d7-936d-3e4d120dfeb6-config-data-custom\") pod \"barbican-keystone-listener-756657585d-2x84b\" (UID: \"46e268aa-326d-42d7-936d-3e4d120dfeb6\") " pod="openstack/barbican-keystone-listener-756657585d-2x84b" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.991559 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42ce024b-4e1d-4f45-9faa-5f637e5a8466-logs\") pod \"barbican-worker-8557ccd47-58ztp\" (UID: \"42ce024b-4e1d-4f45-9faa-5f637e5a8466\") " pod="openstack/barbican-worker-8557ccd47-58ztp" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.991580 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msdb8\" (UniqueName: \"kubernetes.io/projected/42ce024b-4e1d-4f45-9faa-5f637e5a8466-kube-api-access-msdb8\") pod 
\"barbican-worker-8557ccd47-58ztp\" (UID: \"42ce024b-4e1d-4f45-9faa-5f637e5a8466\") " pod="openstack/barbican-worker-8557ccd47-58ztp" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.991633 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46e268aa-326d-42d7-936d-3e4d120dfeb6-logs\") pod \"barbican-keystone-listener-756657585d-2x84b\" (UID: \"46e268aa-326d-42d7-936d-3e4d120dfeb6\") " pod="openstack/barbican-keystone-listener-756657585d-2x84b" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.991651 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42ce024b-4e1d-4f45-9faa-5f637e5a8466-config-data\") pod \"barbican-worker-8557ccd47-58ztp\" (UID: \"42ce024b-4e1d-4f45-9faa-5f637e5a8466\") " pod="openstack/barbican-worker-8557ccd47-58ztp" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.991682 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxtkk\" (UniqueName: \"kubernetes.io/projected/00edb071-fbb5-4e88-8370-a2c76ad13a6c-kube-api-access-mxtkk\") pod \"barbican-api-6fd5f87754-lf26d\" (UID: \"00edb071-fbb5-4e88-8370-a2c76ad13a6c\") " pod="openstack/barbican-api-6fd5f87754-lf26d" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.991722 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-combined-ca-bundle\") pod \"placement-869f94f9b-5hmcl\" (UID: \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\") " pod="openstack/placement-869f94f9b-5hmcl" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.991817 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-config-data\") pod \"placement-869f94f9b-5hmcl\" (UID: \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\") " pod="openstack/placement-869f94f9b-5hmcl" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.991833 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00edb071-fbb5-4e88-8370-a2c76ad13a6c-config-data-custom\") pod \"barbican-api-6fd5f87754-lf26d\" (UID: \"00edb071-fbb5-4e88-8370-a2c76ad13a6c\") " pod="openstack/barbican-api-6fd5f87754-lf26d" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.991859 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00edb071-fbb5-4e88-8370-a2c76ad13a6c-combined-ca-bundle\") pod \"barbican-api-6fd5f87754-lf26d\" (UID: \"00edb071-fbb5-4e88-8370-a2c76ad13a6c\") " pod="openstack/barbican-api-6fd5f87754-lf26d" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.991884 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pf46\" (UniqueName: \"kubernetes.io/projected/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-kube-api-access-4pf46\") pod \"placement-869f94f9b-5hmcl\" (UID: \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\") " pod="openstack/placement-869f94f9b-5hmcl" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.991928 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42ce024b-4e1d-4f45-9faa-5f637e5a8466-combined-ca-bundle\") pod \"barbican-worker-8557ccd47-58ztp\" (UID: \"42ce024b-4e1d-4f45-9faa-5f637e5a8466\") " pod="openstack/barbican-worker-8557ccd47-58ztp" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.992005 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-internal-tls-certs\") pod \"placement-869f94f9b-5hmcl\" (UID: \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\") " pod="openstack/placement-869f94f9b-5hmcl" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.992026 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-public-tls-certs\") pod \"placement-869f94f9b-5hmcl\" (UID: \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\") " pod="openstack/placement-869f94f9b-5hmcl" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.992106 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42ce024b-4e1d-4f45-9faa-5f637e5a8466-config-data-custom\") pod \"barbican-worker-8557ccd47-58ztp\" (UID: \"42ce024b-4e1d-4f45-9faa-5f637e5a8466\") " pod="openstack/barbican-worker-8557ccd47-58ztp" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.992162 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e268aa-326d-42d7-936d-3e4d120dfeb6-combined-ca-bundle\") pod \"barbican-keystone-listener-756657585d-2x84b\" (UID: \"46e268aa-326d-42d7-936d-3e4d120dfeb6\") " pod="openstack/barbican-keystone-listener-756657585d-2x84b" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.992262 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-logs\") pod \"placement-869f94f9b-5hmcl\" (UID: \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\") " pod="openstack/placement-869f94f9b-5hmcl" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.992317 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00edb071-fbb5-4e88-8370-a2c76ad13a6c-config-data\") pod \"barbican-api-6fd5f87754-lf26d\" (UID: \"00edb071-fbb5-4e88-8370-a2c76ad13a6c\") " pod="openstack/barbican-api-6fd5f87754-lf26d" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.992342 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46e268aa-326d-42d7-936d-3e4d120dfeb6-config-data\") pod \"barbican-keystone-listener-756657585d-2x84b\" (UID: \"46e268aa-326d-42d7-936d-3e4d120dfeb6\") " pod="openstack/barbican-keystone-listener-756657585d-2x84b" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.992362 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00edb071-fbb5-4e88-8370-a2c76ad13a6c-logs\") pod \"barbican-api-6fd5f87754-lf26d\" (UID: \"00edb071-fbb5-4e88-8370-a2c76ad13a6c\") " pod="openstack/barbican-api-6fd5f87754-lf26d" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.992387 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-scripts\") pod \"placement-869f94f9b-5hmcl\" (UID: \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\") " pod="openstack/placement-869f94f9b-5hmcl" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.992475 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dnl8\" (UniqueName: \"kubernetes.io/projected/46e268aa-326d-42d7-936d-3e4d120dfeb6-kube-api-access-8dnl8\") pod \"barbican-keystone-listener-756657585d-2x84b\" (UID: \"46e268aa-326d-42d7-936d-3e4d120dfeb6\") " pod="openstack/barbican-keystone-listener-756657585d-2x84b" Mar 14 08:50:15 crc kubenswrapper[4886]: I0314 08:50:15.994442 4886 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-logs\") pod \"placement-869f94f9b-5hmcl\" (UID: \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\") " pod="openstack/placement-869f94f9b-5hmcl" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.011855 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-64b7bfc49d-sz57z" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.012049 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-4bfks" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.023614 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-combined-ca-bundle\") pod \"placement-869f94f9b-5hmcl\" (UID: \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\") " pod="openstack/placement-869f94f9b-5hmcl" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.025771 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-public-tls-certs\") pod \"placement-869f94f9b-5hmcl\" (UID: \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\") " pod="openstack/placement-869f94f9b-5hmcl" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.025851 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-57dfd898bd-kzdvs"] Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.039779 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-internal-tls-certs\") pod \"placement-869f94f9b-5hmcl\" (UID: \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\") " pod="openstack/placement-869f94f9b-5hmcl" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.044989 4886 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pf46\" (UniqueName: \"kubernetes.io/projected/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-kube-api-access-4pf46\") pod \"placement-869f94f9b-5hmcl\" (UID: \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\") " pod="openstack/placement-869f94f9b-5hmcl" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.091358 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-config-data\") pod \"placement-869f94f9b-5hmcl\" (UID: \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\") " pod="openstack/placement-869f94f9b-5hmcl" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.091477 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-scripts\") pod \"placement-869f94f9b-5hmcl\" (UID: \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\") " pod="openstack/placement-869f94f9b-5hmcl" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.094293 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7681da8-2f5d-4ac8-ae5e-6549a3d3f764-scripts\") pod \"placement-57dfd898bd-kzdvs\" (UID: \"c7681da8-2f5d-4ac8-ae5e-6549a3d3f764\") " pod="openstack/placement-57dfd898bd-kzdvs" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.094347 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dnl8\" (UniqueName: \"kubernetes.io/projected/46e268aa-326d-42d7-936d-3e4d120dfeb6-kube-api-access-8dnl8\") pod \"barbican-keystone-listener-756657585d-2x84b\" (UID: \"46e268aa-326d-42d7-936d-3e4d120dfeb6\") " pod="openstack/barbican-keystone-listener-756657585d-2x84b" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.094369 4886 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46e268aa-326d-42d7-936d-3e4d120dfeb6-config-data-custom\") pod \"barbican-keystone-listener-756657585d-2x84b\" (UID: \"46e268aa-326d-42d7-936d-3e4d120dfeb6\") " pod="openstack/barbican-keystone-listener-756657585d-2x84b" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.094387 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42ce024b-4e1d-4f45-9faa-5f637e5a8466-logs\") pod \"barbican-worker-8557ccd47-58ztp\" (UID: \"42ce024b-4e1d-4f45-9faa-5f637e5a8466\") " pod="openstack/barbican-worker-8557ccd47-58ztp" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.094402 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msdb8\" (UniqueName: \"kubernetes.io/projected/42ce024b-4e1d-4f45-9faa-5f637e5a8466-kube-api-access-msdb8\") pod \"barbican-worker-8557ccd47-58ztp\" (UID: \"42ce024b-4e1d-4f45-9faa-5f637e5a8466\") " pod="openstack/barbican-worker-8557ccd47-58ztp" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.094469 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46e268aa-326d-42d7-936d-3e4d120dfeb6-logs\") pod \"barbican-keystone-listener-756657585d-2x84b\" (UID: \"46e268aa-326d-42d7-936d-3e4d120dfeb6\") " pod="openstack/barbican-keystone-listener-756657585d-2x84b" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.094491 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42ce024b-4e1d-4f45-9faa-5f637e5a8466-config-data\") pod \"barbican-worker-8557ccd47-58ztp\" (UID: \"42ce024b-4e1d-4f45-9faa-5f637e5a8466\") " pod="openstack/barbican-worker-8557ccd47-58ztp" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.094519 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-mxtkk\" (UniqueName: \"kubernetes.io/projected/00edb071-fbb5-4e88-8370-a2c76ad13a6c-kube-api-access-mxtkk\") pod \"barbican-api-6fd5f87754-lf26d\" (UID: \"00edb071-fbb5-4e88-8370-a2c76ad13a6c\") " pod="openstack/barbican-api-6fd5f87754-lf26d" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.094544 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7681da8-2f5d-4ac8-ae5e-6549a3d3f764-public-tls-certs\") pod \"placement-57dfd898bd-kzdvs\" (UID: \"c7681da8-2f5d-4ac8-ae5e-6549a3d3f764\") " pod="openstack/placement-57dfd898bd-kzdvs" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.094590 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7681da8-2f5d-4ac8-ae5e-6549a3d3f764-logs\") pod \"placement-57dfd898bd-kzdvs\" (UID: \"c7681da8-2f5d-4ac8-ae5e-6549a3d3f764\") " pod="openstack/placement-57dfd898bd-kzdvs" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.094618 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00edb071-fbb5-4e88-8370-a2c76ad13a6c-config-data-custom\") pod \"barbican-api-6fd5f87754-lf26d\" (UID: \"00edb071-fbb5-4e88-8370-a2c76ad13a6c\") " pod="openstack/barbican-api-6fd5f87754-lf26d" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.094637 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdd8p\" (UniqueName: \"kubernetes.io/projected/c7681da8-2f5d-4ac8-ae5e-6549a3d3f764-kube-api-access-bdd8p\") pod \"placement-57dfd898bd-kzdvs\" (UID: \"c7681da8-2f5d-4ac8-ae5e-6549a3d3f764\") " pod="openstack/placement-57dfd898bd-kzdvs" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.094657 4886 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00edb071-fbb5-4e88-8370-a2c76ad13a6c-combined-ca-bundle\") pod \"barbican-api-6fd5f87754-lf26d\" (UID: \"00edb071-fbb5-4e88-8370-a2c76ad13a6c\") " pod="openstack/barbican-api-6fd5f87754-lf26d" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.094696 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42ce024b-4e1d-4f45-9faa-5f637e5a8466-combined-ca-bundle\") pod \"barbican-worker-8557ccd47-58ztp\" (UID: \"42ce024b-4e1d-4f45-9faa-5f637e5a8466\") " pod="openstack/barbican-worker-8557ccd47-58ztp" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.094753 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42ce024b-4e1d-4f45-9faa-5f637e5a8466-config-data-custom\") pod \"barbican-worker-8557ccd47-58ztp\" (UID: \"42ce024b-4e1d-4f45-9faa-5f637e5a8466\") " pod="openstack/barbican-worker-8557ccd47-58ztp" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.094773 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e268aa-326d-42d7-936d-3e4d120dfeb6-combined-ca-bundle\") pod \"barbican-keystone-listener-756657585d-2x84b\" (UID: \"46e268aa-326d-42d7-936d-3e4d120dfeb6\") " pod="openstack/barbican-keystone-listener-756657585d-2x84b" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.094912 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7681da8-2f5d-4ac8-ae5e-6549a3d3f764-combined-ca-bundle\") pod \"placement-57dfd898bd-kzdvs\" (UID: \"c7681da8-2f5d-4ac8-ae5e-6549a3d3f764\") " pod="openstack/placement-57dfd898bd-kzdvs" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.094935 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7681da8-2f5d-4ac8-ae5e-6549a3d3f764-config-data\") pod \"placement-57dfd898bd-kzdvs\" (UID: \"c7681da8-2f5d-4ac8-ae5e-6549a3d3f764\") " pod="openstack/placement-57dfd898bd-kzdvs" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.094976 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00edb071-fbb5-4e88-8370-a2c76ad13a6c-config-data\") pod \"barbican-api-6fd5f87754-lf26d\" (UID: \"00edb071-fbb5-4e88-8370-a2c76ad13a6c\") " pod="openstack/barbican-api-6fd5f87754-lf26d" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.094997 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46e268aa-326d-42d7-936d-3e4d120dfeb6-config-data\") pod \"barbican-keystone-listener-756657585d-2x84b\" (UID: \"46e268aa-326d-42d7-936d-3e4d120dfeb6\") " pod="openstack/barbican-keystone-listener-756657585d-2x84b" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.095013 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00edb071-fbb5-4e88-8370-a2c76ad13a6c-logs\") pod \"barbican-api-6fd5f87754-lf26d\" (UID: \"00edb071-fbb5-4e88-8370-a2c76ad13a6c\") " pod="openstack/barbican-api-6fd5f87754-lf26d" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.095044 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7681da8-2f5d-4ac8-ae5e-6549a3d3f764-internal-tls-certs\") pod \"placement-57dfd898bd-kzdvs\" (UID: \"c7681da8-2f5d-4ac8-ae5e-6549a3d3f764\") " pod="openstack/placement-57dfd898bd-kzdvs" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.100399 4886 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46e268aa-326d-42d7-936d-3e4d120dfeb6-logs\") pod \"barbican-keystone-listener-756657585d-2x84b\" (UID: \"46e268aa-326d-42d7-936d-3e4d120dfeb6\") " pod="openstack/barbican-keystone-listener-756657585d-2x84b" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.100536 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42ce024b-4e1d-4f45-9faa-5f637e5a8466-logs\") pod \"barbican-worker-8557ccd47-58ztp\" (UID: \"42ce024b-4e1d-4f45-9faa-5f637e5a8466\") " pod="openstack/barbican-worker-8557ccd47-58ztp" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.104642 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42ce024b-4e1d-4f45-9faa-5f637e5a8466-combined-ca-bundle\") pod \"barbican-worker-8557ccd47-58ztp\" (UID: \"42ce024b-4e1d-4f45-9faa-5f637e5a8466\") " pod="openstack/barbican-worker-8557ccd47-58ztp" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.106831 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46e268aa-326d-42d7-936d-3e4d120dfeb6-config-data-custom\") pod \"barbican-keystone-listener-756657585d-2x84b\" (UID: \"46e268aa-326d-42d7-936d-3e4d120dfeb6\") " pod="openstack/barbican-keystone-listener-756657585d-2x84b" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.110055 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00edb071-fbb5-4e88-8370-a2c76ad13a6c-logs\") pod \"barbican-api-6fd5f87754-lf26d\" (UID: \"00edb071-fbb5-4e88-8370-a2c76ad13a6c\") " pod="openstack/barbican-api-6fd5f87754-lf26d" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.113107 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/42ce024b-4e1d-4f45-9faa-5f637e5a8466-config-data-custom\") pod \"barbican-worker-8557ccd47-58ztp\" (UID: \"42ce024b-4e1d-4f45-9faa-5f637e5a8466\") " pod="openstack/barbican-worker-8557ccd47-58ztp" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.118725 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00edb071-fbb5-4e88-8370-a2c76ad13a6c-combined-ca-bundle\") pod \"barbican-api-6fd5f87754-lf26d\" (UID: \"00edb071-fbb5-4e88-8370-a2c76ad13a6c\") " pod="openstack/barbican-api-6fd5f87754-lf26d" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.119198 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46e268aa-326d-42d7-936d-3e4d120dfeb6-config-data\") pod \"barbican-keystone-listener-756657585d-2x84b\" (UID: \"46e268aa-326d-42d7-936d-3e4d120dfeb6\") " pod="openstack/barbican-keystone-listener-756657585d-2x84b" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.120192 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00edb071-fbb5-4e88-8370-a2c76ad13a6c-config-data\") pod \"barbican-api-6fd5f87754-lf26d\" (UID: \"00edb071-fbb5-4e88-8370-a2c76ad13a6c\") " pod="openstack/barbican-api-6fd5f87754-lf26d" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.129002 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxtkk\" (UniqueName: \"kubernetes.io/projected/00edb071-fbb5-4e88-8370-a2c76ad13a6c-kube-api-access-mxtkk\") pod \"barbican-api-6fd5f87754-lf26d\" (UID: \"00edb071-fbb5-4e88-8370-a2c76ad13a6c\") " pod="openstack/barbican-api-6fd5f87754-lf26d" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.133311 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dnl8\" (UniqueName: 
\"kubernetes.io/projected/46e268aa-326d-42d7-936d-3e4d120dfeb6-kube-api-access-8dnl8\") pod \"barbican-keystone-listener-756657585d-2x84b\" (UID: \"46e268aa-326d-42d7-936d-3e4d120dfeb6\") " pod="openstack/barbican-keystone-listener-756657585d-2x84b" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.138101 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00edb071-fbb5-4e88-8370-a2c76ad13a6c-config-data-custom\") pod \"barbican-api-6fd5f87754-lf26d\" (UID: \"00edb071-fbb5-4e88-8370-a2c76ad13a6c\") " pod="openstack/barbican-api-6fd5f87754-lf26d" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.138155 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msdb8\" (UniqueName: \"kubernetes.io/projected/42ce024b-4e1d-4f45-9faa-5f637e5a8466-kube-api-access-msdb8\") pod \"barbican-worker-8557ccd47-58ztp\" (UID: \"42ce024b-4e1d-4f45-9faa-5f637e5a8466\") " pod="openstack/barbican-worker-8557ccd47-58ztp" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.138761 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e268aa-326d-42d7-936d-3e4d120dfeb6-combined-ca-bundle\") pod \"barbican-keystone-listener-756657585d-2x84b\" (UID: \"46e268aa-326d-42d7-936d-3e4d120dfeb6\") " pod="openstack/barbican-keystone-listener-756657585d-2x84b" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.138854 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42ce024b-4e1d-4f45-9faa-5f637e5a8466-config-data\") pod \"barbican-worker-8557ccd47-58ztp\" (UID: \"42ce024b-4e1d-4f45-9faa-5f637e5a8466\") " pod="openstack/barbican-worker-8557ccd47-58ztp" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.198517 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/c7681da8-2f5d-4ac8-ae5e-6549a3d3f764-public-tls-certs\") pod \"placement-57dfd898bd-kzdvs\" (UID: \"c7681da8-2f5d-4ac8-ae5e-6549a3d3f764\") " pod="openstack/placement-57dfd898bd-kzdvs" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.198566 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7681da8-2f5d-4ac8-ae5e-6549a3d3f764-logs\") pod \"placement-57dfd898bd-kzdvs\" (UID: \"c7681da8-2f5d-4ac8-ae5e-6549a3d3f764\") " pod="openstack/placement-57dfd898bd-kzdvs" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.198600 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdd8p\" (UniqueName: \"kubernetes.io/projected/c7681da8-2f5d-4ac8-ae5e-6549a3d3f764-kube-api-access-bdd8p\") pod \"placement-57dfd898bd-kzdvs\" (UID: \"c7681da8-2f5d-4ac8-ae5e-6549a3d3f764\") " pod="openstack/placement-57dfd898bd-kzdvs" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.198668 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7681da8-2f5d-4ac8-ae5e-6549a3d3f764-combined-ca-bundle\") pod \"placement-57dfd898bd-kzdvs\" (UID: \"c7681da8-2f5d-4ac8-ae5e-6549a3d3f764\") " pod="openstack/placement-57dfd898bd-kzdvs" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.198690 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7681da8-2f5d-4ac8-ae5e-6549a3d3f764-config-data\") pod \"placement-57dfd898bd-kzdvs\" (UID: \"c7681da8-2f5d-4ac8-ae5e-6549a3d3f764\") " pod="openstack/placement-57dfd898bd-kzdvs" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.198742 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c7681da8-2f5d-4ac8-ae5e-6549a3d3f764-internal-tls-certs\") pod \"placement-57dfd898bd-kzdvs\" (UID: \"c7681da8-2f5d-4ac8-ae5e-6549a3d3f764\") " pod="openstack/placement-57dfd898bd-kzdvs" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.198775 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7681da8-2f5d-4ac8-ae5e-6549a3d3f764-scripts\") pod \"placement-57dfd898bd-kzdvs\" (UID: \"c7681da8-2f5d-4ac8-ae5e-6549a3d3f764\") " pod="openstack/placement-57dfd898bd-kzdvs" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.202532 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7681da8-2f5d-4ac8-ae5e-6549a3d3f764-scripts\") pod \"placement-57dfd898bd-kzdvs\" (UID: \"c7681da8-2f5d-4ac8-ae5e-6549a3d3f764\") " pod="openstack/placement-57dfd898bd-kzdvs" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.205887 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7681da8-2f5d-4ac8-ae5e-6549a3d3f764-public-tls-certs\") pod \"placement-57dfd898bd-kzdvs\" (UID: \"c7681da8-2f5d-4ac8-ae5e-6549a3d3f764\") " pod="openstack/placement-57dfd898bd-kzdvs" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.206152 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7681da8-2f5d-4ac8-ae5e-6549a3d3f764-logs\") pod \"placement-57dfd898bd-kzdvs\" (UID: \"c7681da8-2f5d-4ac8-ae5e-6549a3d3f764\") " pod="openstack/placement-57dfd898bd-kzdvs" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.221198 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7681da8-2f5d-4ac8-ae5e-6549a3d3f764-config-data\") pod \"placement-57dfd898bd-kzdvs\" (UID: \"c7681da8-2f5d-4ac8-ae5e-6549a3d3f764\") " 
pod="openstack/placement-57dfd898bd-kzdvs" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.221433 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-8557ccd47-58ztp" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.221941 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7681da8-2f5d-4ac8-ae5e-6549a3d3f764-combined-ca-bundle\") pod \"placement-57dfd898bd-kzdvs\" (UID: \"c7681da8-2f5d-4ac8-ae5e-6549a3d3f764\") " pod="openstack/placement-57dfd898bd-kzdvs" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.224702 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7681da8-2f5d-4ac8-ae5e-6549a3d3f764-internal-tls-certs\") pod \"placement-57dfd898bd-kzdvs\" (UID: \"c7681da8-2f5d-4ac8-ae5e-6549a3d3f764\") " pod="openstack/placement-57dfd898bd-kzdvs" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.249515 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdd8p\" (UniqueName: \"kubernetes.io/projected/c7681da8-2f5d-4ac8-ae5e-6549a3d3f764-kube-api-access-bdd8p\") pod \"placement-57dfd898bd-kzdvs\" (UID: \"c7681da8-2f5d-4ac8-ae5e-6549a3d3f764\") " pod="openstack/placement-57dfd898bd-kzdvs" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.256037 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-756657585d-2x84b" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.295597 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6fd5f87754-lf26d" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.303478 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.314901 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-57dfd898bd-kzdvs" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.324440 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-869f94f9b-5hmcl" Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.475698 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.704400 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"2b369685-07a0-4802-a5ed-d6288ed9b1c3","Type":"ContainerStarted","Data":"5f9f2283d4ff2b46c83a94a3853cacba35c101a9da25d91c0e43dd248f985f79"} Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.767166 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5c6c468c99-bvnb4"] Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.781480 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-d4575995d-lfmv5"] Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.800837 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6d5d4f7d47-v8h75"] Mar 14 08:50:16 crc kubenswrapper[4886]: W0314 08:50:16.894586 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e54e03a_ce8b_4f7d_a664_cc11daa6c786.slice/crio-6144141fc2fab83234ee0c4180769c7a492d27ca9997aa532e9154cf117e6433 WatchSource:0}: Error finding container 6144141fc2fab83234ee0c4180769c7a492d27ca9997aa532e9154cf117e6433: Status 404 returned 
error can't find the container with id 6144141fc2fab83234ee0c4180769c7a492d27ca9997aa532e9154cf117e6433 Mar 14 08:50:16 crc kubenswrapper[4886]: W0314 08:50:16.897840 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a298745_dd74_4ed3_b21b_648f2adb47dc.slice/crio-02ee9603f36134be041b74b32a2e353de7de8e3225d14a3c06ad1659dc31d5ba WatchSource:0}: Error finding container 02ee9603f36134be041b74b32a2e353de7de8e3225d14a3c06ad1659dc31d5ba: Status 404 returned error can't find the container with id 02ee9603f36134be041b74b32a2e353de7de8e3225d14a3c06ad1659dc31d5ba Mar 14 08:50:16 crc kubenswrapper[4886]: W0314 08:50:16.902830 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18c3fb7c_c71e_4c67_96a2_6e9455e67182.slice/crio-6ee09897130f6eca0bf288e149a4824eb997c30c357df073937b3c7ab3337737 WatchSource:0}: Error finding container 6ee09897130f6eca0bf288e149a4824eb997c30c357df073937b3c7ab3337737: Status 404 returned error can't find the container with id 6ee09897130f6eca0bf288e149a4824eb997c30c357df073937b3c7ab3337737 Mar 14 08:50:16 crc kubenswrapper[4886]: I0314 08:50:16.970881 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.140260 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-dns-svc\") pod \"ce19460f-e7c6-4212-8e85-ca624f694ac9\" (UID: \"ce19460f-e7c6-4212-8e85-ca624f694ac9\") " Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.140312 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-dns-swift-storage-0\") pod \"ce19460f-e7c6-4212-8e85-ca624f694ac9\" (UID: \"ce19460f-e7c6-4212-8e85-ca624f694ac9\") " Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.140333 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbwhc\" (UniqueName: \"kubernetes.io/projected/ce19460f-e7c6-4212-8e85-ca624f694ac9-kube-api-access-pbwhc\") pod \"ce19460f-e7c6-4212-8e85-ca624f694ac9\" (UID: \"ce19460f-e7c6-4212-8e85-ca624f694ac9\") " Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.140383 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-config\") pod \"ce19460f-e7c6-4212-8e85-ca624f694ac9\" (UID: \"ce19460f-e7c6-4212-8e85-ca624f694ac9\") " Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.140417 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-ovsdbserver-sb\") pod \"ce19460f-e7c6-4212-8e85-ca624f694ac9\" (UID: \"ce19460f-e7c6-4212-8e85-ca624f694ac9\") " Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.140493 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-ovsdbserver-nb\") pod \"ce19460f-e7c6-4212-8e85-ca624f694ac9\" (UID: \"ce19460f-e7c6-4212-8e85-ca624f694ac9\") " Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.175742 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce19460f-e7c6-4212-8e85-ca624f694ac9-kube-api-access-pbwhc" (OuterVolumeSpecName: "kube-api-access-pbwhc") pod "ce19460f-e7c6-4212-8e85-ca624f694ac9" (UID: "ce19460f-e7c6-4212-8e85-ca624f694ac9"). InnerVolumeSpecName "kube-api-access-pbwhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.246782 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbwhc\" (UniqueName: \"kubernetes.io/projected/ce19460f-e7c6-4212-8e85-ca624f694ac9-kube-api-access-pbwhc\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.257296 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ce19460f-e7c6-4212-8e85-ca624f694ac9" (UID: "ce19460f-e7c6-4212-8e85-ca624f694ac9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.279312 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ce19460f-e7c6-4212-8e85-ca624f694ac9" (UID: "ce19460f-e7c6-4212-8e85-ca624f694ac9"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.282910 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-config" (OuterVolumeSpecName: "config") pod "ce19460f-e7c6-4212-8e85-ca624f694ac9" (UID: "ce19460f-e7c6-4212-8e85-ca624f694ac9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.288329 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ce19460f-e7c6-4212-8e85-ca624f694ac9" (UID: "ce19460f-e7c6-4212-8e85-ca624f694ac9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.328333 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ce19460f-e7c6-4212-8e85-ca624f694ac9" (UID: "ce19460f-e7c6-4212-8e85-ca624f694ac9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.383447 4886 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.383482 4886 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.383498 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.383509 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.383524 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce19460f-e7c6-4212-8e85-ca624f694ac9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.633446 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-64b7bfc49d-sz57z"] Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.642287 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="46fb99a1-d0f5-4538-80d4-9d17ffe6bed3" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.171:9322/\": read tcp 10.217.0.2:58924->10.217.0.171:9322: read: connection reset by peer" Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.642580 4886 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="46fb99a1-d0f5-4538-80d4-9d17ffe6bed3" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.171:9322/\": read tcp 10.217.0.2:58940->10.217.0.171:9322: read: connection reset by peer" Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.778525 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6frh2" event={"ID":"14718224-eaad-4caf-b13b-a60a9c2a9460","Type":"ContainerStarted","Data":"8e54e288669ae4cdf21748f2471236e2b5b8a8da57d9ac2835880115c4de79e7"} Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.784887 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6d5d4f7d47-v8h75" event={"ID":"18c3fb7c-c71e-4c67-96a2-6e9455e67182","Type":"ContainerStarted","Data":"6ee09897130f6eca0bf288e149a4824eb997c30c357df073937b3c7ab3337737"} Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.809880 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8c271fab-7815-4aab-86c5-3e3919077e2e","Type":"ContainerStarted","Data":"68b4492fc3239b01393e297469c729e37087d2813b6527f73b955bab7d082404"} Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.814690 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-6frh2" podStartSLOduration=6.349622067 podStartE2EDuration="50.814669983s" podCreationTimestamp="2026-03-14 08:49:27 +0000 UTC" firstStartedPulling="2026-03-14 08:49:30.477974142 +0000 UTC m=+1305.726425769" lastFinishedPulling="2026-03-14 08:50:14.943022048 +0000 UTC m=+1350.191473685" observedRunningTime="2026-03-14 08:50:17.795627864 +0000 UTC m=+1353.044079501" watchObservedRunningTime="2026-03-14 08:50:17.814669983 +0000 UTC m=+1353.063121620" Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.831358 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-keystone-listener-d4575995d-lfmv5" event={"ID":"7a298745-dd74-4ed3-b21b-648f2adb47dc","Type":"ContainerStarted","Data":"02ee9603f36134be041b74b32a2e353de7de8e3225d14a3c06ad1659dc31d5ba"} Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.875476 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" event={"ID":"ce19460f-e7c6-4212-8e85-ca624f694ac9","Type":"ContainerDied","Data":"fa9541d5ec746697a7747a0ea618061ff36a22abdda6d9bd74c47edcad692aff"} Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.875531 4886 scope.go:117] "RemoveContainer" containerID="6c0f1ceac2ede1fec8ce759945df6b0df7eb129151dd8a095858b5c03ed6bf13" Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.875674 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-lr4pt" Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.922254 4886 generic.go:334] "Generic (PLEG): container finished" podID="46fb99a1-d0f5-4538-80d4-9d17ffe6bed3" containerID="33dd9fa372fec5d26f92e9ebc9abae7dab0da9bcc9d7ab17f90552563afe0963" exitCode=0 Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.922349 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3","Type":"ContainerDied","Data":"33dd9fa372fec5d26f92e9ebc9abae7dab0da9bcc9d7ab17f90552563afe0963"} Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.924909 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5c6c468c99-bvnb4" event={"ID":"1e54e03a-ce8b-4f7d-a664-cc11daa6c786","Type":"ContainerStarted","Data":"6144141fc2fab83234ee0c4180769c7a492d27ca9997aa532e9154cf117e6433"} Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.928901 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-4bfks"] Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.946284 4886 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-lr4pt"] Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.960197 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-lr4pt"] Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.969601 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-869f94f9b-5hmcl"] Mar 14 08:50:17 crc kubenswrapper[4886]: W0314 08:50:17.981249 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod777c1eca_e276_4de3_9523_ee72e0891b05.slice/crio-9bced53d6587b1ff9321d634cde46142add0bf3c3b4269e123eeb54c4971c11a WatchSource:0}: Error finding container 9bced53d6587b1ff9321d634cde46142add0bf3c3b4269e123eeb54c4971c11a: Status 404 returned error can't find the container with id 9bced53d6587b1ff9321d634cde46142add0bf3c3b4269e123eeb54c4971c11a Mar 14 08:50:17 crc kubenswrapper[4886]: I0314 08:50:17.995880 4886 scope.go:117] "RemoveContainer" containerID="9e5189e9f9707ab0527e23bf2ad563e7c5aecd0dbe461eccf1a043d62adccc5a" Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 08:50:18.148018 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-66c6bc56b6-25jn4" podUID="3f8100ac-c606-4eb3-afd6-07be9de44f42" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.168:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.168:8443: connect: connection refused" Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 08:50:18.280299 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7769c88f5b-8gr9x" podUID="46272ed5-a9f5-45eb-b9ba-58289ed822a7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.169:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.169:8443: connect: connection refused" Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 08:50:18.426082 4886 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-756657585d-2x84b"] Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 08:50:18.685242 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-57dfd898bd-kzdvs"] Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 08:50:18.702388 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-8557ccd47-58ztp"] Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 08:50:18.775213 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6fd5f87754-lf26d"] Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 08:50:18.828382 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-64b7bfc49d-sz57z"] Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 08:50:18.878398 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-dc48c5bd6-xmnxc"] Mar 14 08:50:18 crc kubenswrapper[4886]: E0314 08:50:18.878956 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce19460f-e7c6-4212-8e85-ca624f694ac9" containerName="init" Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 08:50:18.878971 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce19460f-e7c6-4212-8e85-ca624f694ac9" containerName="init" Mar 14 08:50:18 crc kubenswrapper[4886]: E0314 08:50:18.879001 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce19460f-e7c6-4212-8e85-ca624f694ac9" containerName="dnsmasq-dns" Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 08:50:18.879008 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce19460f-e7c6-4212-8e85-ca624f694ac9" containerName="dnsmasq-dns" Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 08:50:18.879304 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce19460f-e7c6-4212-8e85-ca624f694ac9" containerName="dnsmasq-dns" Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 08:50:18.880867 4886 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/barbican-api-dc48c5bd6-xmnxc" Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 08:50:18.883386 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 08:50:18.883536 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 08:50:18.891980 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-dc48c5bd6-xmnxc"] Mar 14 08:50:18 crc kubenswrapper[4886]: W0314 08:50:18.943578 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00edb071_fbb5_4e88_8370_a2c76ad13a6c.slice/crio-f8b727d0a646738d088eb194933142dbeb13d0d1b0d4952cb79c4814e9c4193e WatchSource:0}: Error finding container f8b727d0a646738d088eb194933142dbeb13d0d1b0d4952cb79c4814e9c4193e: Status 404 returned error can't find the container with id f8b727d0a646738d088eb194933142dbeb13d0d1b0d4952cb79c4814e9c4193e Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 08:50:18.957514 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 08:50:18.958621 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8557ccd47-58ztp" event={"ID":"42ce024b-4e1d-4f45-9faa-5f637e5a8466","Type":"ContainerStarted","Data":"df3bf0124f3ec3594f786129ce132d2e2e3c48c00456486d41aaf2af91210caa"} Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 08:50:18.961287 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-756657585d-2x84b" event={"ID":"46e268aa-326d-42d7-936d-3e4d120dfeb6","Type":"ContainerStarted","Data":"30cd199474b6639af9ab1b3e71368a37d9a9dc1a37b2fb3311f2f677a830ab8a"} Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 08:50:18.965351 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/309604ae-1d2f-4f0a-9fa3-1960efc340b6-internal-tls-certs\") pod \"barbican-api-dc48c5bd6-xmnxc\" (UID: \"309604ae-1d2f-4f0a-9fa3-1960efc340b6\") " pod="openstack/barbican-api-dc48c5bd6-xmnxc" Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 08:50:18.965418 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/309604ae-1d2f-4f0a-9fa3-1960efc340b6-config-data-custom\") pod \"barbican-api-dc48c5bd6-xmnxc\" (UID: \"309604ae-1d2f-4f0a-9fa3-1960efc340b6\") " pod="openstack/barbican-api-dc48c5bd6-xmnxc" Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 08:50:18.965455 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/309604ae-1d2f-4f0a-9fa3-1960efc340b6-public-tls-certs\") pod \"barbican-api-dc48c5bd6-xmnxc\" (UID: \"309604ae-1d2f-4f0a-9fa3-1960efc340b6\") " pod="openstack/barbican-api-dc48c5bd6-xmnxc" Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 
08:50:18.965473 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwtg5\" (UniqueName: \"kubernetes.io/projected/309604ae-1d2f-4f0a-9fa3-1960efc340b6-kube-api-access-lwtg5\") pod \"barbican-api-dc48c5bd6-xmnxc\" (UID: \"309604ae-1d2f-4f0a-9fa3-1960efc340b6\") " pod="openstack/barbican-api-dc48c5bd6-xmnxc" Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 08:50:18.965570 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/309604ae-1d2f-4f0a-9fa3-1960efc340b6-config-data\") pod \"barbican-api-dc48c5bd6-xmnxc\" (UID: \"309604ae-1d2f-4f0a-9fa3-1960efc340b6\") " pod="openstack/barbican-api-dc48c5bd6-xmnxc" Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 08:50:18.965596 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/309604ae-1d2f-4f0a-9fa3-1960efc340b6-logs\") pod \"barbican-api-dc48c5bd6-xmnxc\" (UID: \"309604ae-1d2f-4f0a-9fa3-1960efc340b6\") " pod="openstack/barbican-api-dc48c5bd6-xmnxc" Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 08:50:18.965639 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/309604ae-1d2f-4f0a-9fa3-1960efc340b6-combined-ca-bundle\") pod \"barbican-api-dc48c5bd6-xmnxc\" (UID: \"309604ae-1d2f-4f0a-9fa3-1960efc340b6\") " pod="openstack/barbican-api-dc48c5bd6-xmnxc" Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 08:50:18.969353 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6d5d4f7d47-v8h75" event={"ID":"18c3fb7c-c71e-4c67-96a2-6e9455e67182","Type":"ContainerStarted","Data":"d953b32fd883b585a550c3473886582d60447ae966d0b85778f2ffa51feb4eee"} Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 08:50:18.970155 4886 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/keystone-6d5d4f7d47-v8h75" Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 08:50:18.975836 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64b7bfc49d-sz57z" event={"ID":"281ddf53-b107-4afd-b616-101cd308433b","Type":"ContainerStarted","Data":"2b84ba9e4b2f9ea2f1d5a951f9191bd98a5058bc92bdf171d98b0de0a8b3c821"} Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 08:50:18.975876 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64b7bfc49d-sz57z" event={"ID":"281ddf53-b107-4afd-b616-101cd308433b","Type":"ContainerStarted","Data":"e1059e46606c1061cbf2fbae1ddcab6f6c0cd4762909cc9fcd5795a5c24d570f"} Mar 14 08:50:18 crc kubenswrapper[4886]: I0314 08:50:18.982770 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"2b369685-07a0-4802-a5ed-d6288ed9b1c3","Type":"ContainerStarted","Data":"022d3c5a2b72f3be4738ff3b10f16f7720070055bf79272ad9b6ef57d819e7cb"} Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.004888 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-869f94f9b-5hmcl" event={"ID":"78d6b750-e5ce-4784-a7ee-3930cb52b4c1","Type":"ContainerStarted","Data":"8f4af92f07e0b697cc23024f635bdf83aa24799e68f75e26a2e5351c5f6bd554"} Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.004937 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-869f94f9b-5hmcl" event={"ID":"78d6b750-e5ce-4784-a7ee-3930cb52b4c1","Type":"ContainerStarted","Data":"0a6f9ad3b5f99b8c68d28bad86eead224469b4e9a36507cb66ea9decb77ecefb"} Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.011798 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6d5d4f7d47-v8h75" podStartSLOduration=5.011778473 podStartE2EDuration="5.011778473s" podCreationTimestamp="2026-03-14 08:50:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:50:19.007978717 +0000 UTC m=+1354.256430354" watchObservedRunningTime="2026-03-14 08:50:19.011778473 +0000 UTC m=+1354.260230110" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.012412 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-4bfks" event={"ID":"777c1eca-e276-4de3-9523-ee72e0891b05","Type":"ContainerStarted","Data":"9bced53d6587b1ff9321d634cde46142add0bf3c3b4269e123eeb54c4971c11a"} Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.035552 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-57dfd898bd-kzdvs" event={"ID":"c7681da8-2f5d-4ac8-ae5e-6549a3d3f764","Type":"ContainerStarted","Data":"5a59e564eedd81e53ee75b963511a61b2972723b4a55b5966ace8739edfa19e4"} Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.038724 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3","Type":"ContainerDied","Data":"05fa7c629ce0013359949c6fdc867377ad78ffa4750529f3fedc951db8803b5e"} Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.038776 4886 scope.go:117] "RemoveContainer" containerID="33dd9fa372fec5d26f92e9ebc9abae7dab0da9bcc9d7ab17f90552563afe0963" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.038904 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.043120 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=5.043094213 podStartE2EDuration="5.043094213s" podCreationTimestamp="2026-03-14 08:50:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:50:19.032651623 +0000 UTC m=+1354.281103260" watchObservedRunningTime="2026-03-14 08:50:19.043094213 +0000 UTC m=+1354.291545850" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.079065 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p29gb\" (UniqueName: \"kubernetes.io/projected/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-kube-api-access-p29gb\") pod \"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3\" (UID: \"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3\") " Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.079139 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-combined-ca-bundle\") pod \"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3\" (UID: \"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3\") " Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.079263 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-config-data\") pod \"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3\" (UID: \"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3\") " Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.079317 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-custom-prometheus-ca\") pod \"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3\" 
(UID: \"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3\") " Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.079350 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-logs\") pod \"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3\" (UID: \"46fb99a1-d0f5-4538-80d4-9d17ffe6bed3\") " Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.081060 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/309604ae-1d2f-4f0a-9fa3-1960efc340b6-config-data-custom\") pod \"barbican-api-dc48c5bd6-xmnxc\" (UID: \"309604ae-1d2f-4f0a-9fa3-1960efc340b6\") " pod="openstack/barbican-api-dc48c5bd6-xmnxc" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.081199 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/309604ae-1d2f-4f0a-9fa3-1960efc340b6-public-tls-certs\") pod \"barbican-api-dc48c5bd6-xmnxc\" (UID: \"309604ae-1d2f-4f0a-9fa3-1960efc340b6\") " pod="openstack/barbican-api-dc48c5bd6-xmnxc" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.081223 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwtg5\" (UniqueName: \"kubernetes.io/projected/309604ae-1d2f-4f0a-9fa3-1960efc340b6-kube-api-access-lwtg5\") pod \"barbican-api-dc48c5bd6-xmnxc\" (UID: \"309604ae-1d2f-4f0a-9fa3-1960efc340b6\") " pod="openstack/barbican-api-dc48c5bd6-xmnxc" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.086926 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/309604ae-1d2f-4f0a-9fa3-1960efc340b6-config-data\") pod \"barbican-api-dc48c5bd6-xmnxc\" (UID: \"309604ae-1d2f-4f0a-9fa3-1960efc340b6\") " pod="openstack/barbican-api-dc48c5bd6-xmnxc" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 
08:50:19.086985 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/309604ae-1d2f-4f0a-9fa3-1960efc340b6-logs\") pod \"barbican-api-dc48c5bd6-xmnxc\" (UID: \"309604ae-1d2f-4f0a-9fa3-1960efc340b6\") " pod="openstack/barbican-api-dc48c5bd6-xmnxc" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.093348 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/309604ae-1d2f-4f0a-9fa3-1960efc340b6-combined-ca-bundle\") pod \"barbican-api-dc48c5bd6-xmnxc\" (UID: \"309604ae-1d2f-4f0a-9fa3-1960efc340b6\") " pod="openstack/barbican-api-dc48c5bd6-xmnxc" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.094555 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/309604ae-1d2f-4f0a-9fa3-1960efc340b6-internal-tls-certs\") pod \"barbican-api-dc48c5bd6-xmnxc\" (UID: \"309604ae-1d2f-4f0a-9fa3-1960efc340b6\") " pod="openstack/barbican-api-dc48c5bd6-xmnxc" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.095620 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-logs" (OuterVolumeSpecName: "logs") pod "46fb99a1-d0f5-4538-80d4-9d17ffe6bed3" (UID: "46fb99a1-d0f5-4538-80d4-9d17ffe6bed3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.112868 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/309604ae-1d2f-4f0a-9fa3-1960efc340b6-logs\") pod \"barbican-api-dc48c5bd6-xmnxc\" (UID: \"309604ae-1d2f-4f0a-9fa3-1960efc340b6\") " pod="openstack/barbican-api-dc48c5bd6-xmnxc" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.113713 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/309604ae-1d2f-4f0a-9fa3-1960efc340b6-config-data-custom\") pod \"barbican-api-dc48c5bd6-xmnxc\" (UID: \"309604ae-1d2f-4f0a-9fa3-1960efc340b6\") " pod="openstack/barbican-api-dc48c5bd6-xmnxc" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.116642 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/309604ae-1d2f-4f0a-9fa3-1960efc340b6-public-tls-certs\") pod \"barbican-api-dc48c5bd6-xmnxc\" (UID: \"309604ae-1d2f-4f0a-9fa3-1960efc340b6\") " pod="openstack/barbican-api-dc48c5bd6-xmnxc" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.131579 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/309604ae-1d2f-4f0a-9fa3-1960efc340b6-internal-tls-certs\") pod \"barbican-api-dc48c5bd6-xmnxc\" (UID: \"309604ae-1d2f-4f0a-9fa3-1960efc340b6\") " pod="openstack/barbican-api-dc48c5bd6-xmnxc" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.143422 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/309604ae-1d2f-4f0a-9fa3-1960efc340b6-config-data\") pod \"barbican-api-dc48c5bd6-xmnxc\" (UID: \"309604ae-1d2f-4f0a-9fa3-1960efc340b6\") " pod="openstack/barbican-api-dc48c5bd6-xmnxc" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 
08:50:19.149592 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/309604ae-1d2f-4f0a-9fa3-1960efc340b6-combined-ca-bundle\") pod \"barbican-api-dc48c5bd6-xmnxc\" (UID: \"309604ae-1d2f-4f0a-9fa3-1960efc340b6\") " pod="openstack/barbican-api-dc48c5bd6-xmnxc" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.156945 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwtg5\" (UniqueName: \"kubernetes.io/projected/309604ae-1d2f-4f0a-9fa3-1960efc340b6-kube-api-access-lwtg5\") pod \"barbican-api-dc48c5bd6-xmnxc\" (UID: \"309604ae-1d2f-4f0a-9fa3-1960efc340b6\") " pod="openstack/barbican-api-dc48c5bd6-xmnxc" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.190465 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-kube-api-access-p29gb" (OuterVolumeSpecName: "kube-api-access-p29gb") pod "46fb99a1-d0f5-4538-80d4-9d17ffe6bed3" (UID: "46fb99a1-d0f5-4538-80d4-9d17ffe6bed3"). InnerVolumeSpecName "kube-api-access-p29gb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.193642 4886 scope.go:117] "RemoveContainer" containerID="ef351e23ca052eae7bcf7ffb583b52fe22b4f34d2d46be687a34a68e47050295" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.198880 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p29gb\" (UniqueName: \"kubernetes.io/projected/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-kube-api-access-p29gb\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.198904 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-logs\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.267735 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-dc48c5bd6-xmnxc" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.452372 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce19460f-e7c6-4212-8e85-ca624f694ac9" path="/var/lib/kubelet/pods/ce19460f-e7c6-4212-8e85-ca624f694ac9/volumes" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.793721 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46fb99a1-d0f5-4538-80d4-9d17ffe6bed3" (UID: "46fb99a1-d0f5-4538-80d4-9d17ffe6bed3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.802903 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "46fb99a1-d0f5-4538-80d4-9d17ffe6bed3" (UID: "46fb99a1-d0f5-4538-80d4-9d17ffe6bed3"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.830077 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-config-data" (OuterVolumeSpecName: "config-data") pod "46fb99a1-d0f5-4538-80d4-9d17ffe6bed3" (UID: "46fb99a1-d0f5-4538-80d4-9d17ffe6bed3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.837416 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.837449 4886 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.837463 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:19 crc kubenswrapper[4886]: I0314 08:50:19.992737 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.007534 4886 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/watcher-api-0"] Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.044599 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Mar 14 08:50:20 crc kubenswrapper[4886]: E0314 08:50:20.049728 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46fb99a1-d0f5-4538-80d4-9d17ffe6bed3" containerName="watcher-api" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.057272 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="46fb99a1-d0f5-4538-80d4-9d17ffe6bed3" containerName="watcher-api" Mar 14 08:50:20 crc kubenswrapper[4886]: E0314 08:50:20.057596 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46fb99a1-d0f5-4538-80d4-9d17ffe6bed3" containerName="watcher-api-log" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.057680 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="46fb99a1-d0f5-4538-80d4-9d17ffe6bed3" containerName="watcher-api-log" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.058207 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="46fb99a1-d0f5-4538-80d4-9d17ffe6bed3" containerName="watcher-api-log" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.058295 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="46fb99a1-d0f5-4538-80d4-9d17ffe6bed3" containerName="watcher-api" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.059590 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.063272 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.064200 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.065165 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.094896 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.104341 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.108179 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-dc48c5bd6-xmnxc"] Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.146263 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf474a1d-5a57-4669-8ced-7d0c8decbd70-public-tls-certs\") pod \"watcher-api-0\" (UID: \"cf474a1d-5a57-4669-8ced-7d0c8decbd70\") " pod="openstack/watcher-api-0" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.146849 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkz9x\" (UniqueName: \"kubernetes.io/projected/cf474a1d-5a57-4669-8ced-7d0c8decbd70-kube-api-access-pkz9x\") pod \"watcher-api-0\" (UID: \"cf474a1d-5a57-4669-8ced-7d0c8decbd70\") " pod="openstack/watcher-api-0" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.146900 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/cf474a1d-5a57-4669-8ced-7d0c8decbd70-config-data\") pod \"watcher-api-0\" (UID: \"cf474a1d-5a57-4669-8ced-7d0c8decbd70\") " pod="openstack/watcher-api-0" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.146947 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf474a1d-5a57-4669-8ced-7d0c8decbd70-logs\") pod \"watcher-api-0\" (UID: \"cf474a1d-5a57-4669-8ced-7d0c8decbd70\") " pod="openstack/watcher-api-0" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.147027 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf474a1d-5a57-4669-8ced-7d0c8decbd70-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"cf474a1d-5a57-4669-8ced-7d0c8decbd70\") " pod="openstack/watcher-api-0" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.147050 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/cf474a1d-5a57-4669-8ced-7d0c8decbd70-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"cf474a1d-5a57-4669-8ced-7d0c8decbd70\") " pod="openstack/watcher-api-0" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.147090 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf474a1d-5a57-4669-8ced-7d0c8decbd70-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"cf474a1d-5a57-4669-8ced-7d0c8decbd70\") " pod="openstack/watcher-api-0" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.151364 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"8c271fab-7815-4aab-86c5-3e3919077e2e","Type":"ContainerStarted","Data":"631c2e8c5eeadab20304b857dacd13e38e482252eba22abb961e0ca93f650019"} Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.153984 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64b7bfc49d-sz57z" event={"ID":"281ddf53-b107-4afd-b616-101cd308433b","Type":"ContainerStarted","Data":"176bbbad339dc424dbd60e2b953e6aea13c9455cbe5452528488a5e7bb697004"} Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.154324 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-64b7bfc49d-sz57z" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.154354 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-64b7bfc49d-sz57z" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.154345 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-64b7bfc49d-sz57z" podUID="281ddf53-b107-4afd-b616-101cd308433b" containerName="barbican-api" containerID="cri-o://176bbbad339dc424dbd60e2b953e6aea13c9455cbe5452528488a5e7bb697004" gracePeriod=30 Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.154255 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-64b7bfc49d-sz57z" podUID="281ddf53-b107-4afd-b616-101cd308433b" containerName="barbican-api-log" containerID="cri-o://2b84ba9e4b2f9ea2f1d5a951f9191bd98a5058bc92bdf171d98b0de0a8b3c821" gracePeriod=30 Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.160395 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fd5f87754-lf26d" event={"ID":"00edb071-fbb5-4e88-8370-a2c76ad13a6c","Type":"ContainerStarted","Data":"f8b727d0a646738d088eb194933142dbeb13d0d1b0d4952cb79c4814e9c4193e"} Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.175529 4886 generic.go:334] "Generic (PLEG): container finished" 
podID="777c1eca-e276-4de3-9523-ee72e0891b05" containerID="b72ccf6553947605f5774dcf6cf52a379782aa9a503fecbf75403c346a256df5" exitCode=0 Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.175687 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-4bfks" event={"ID":"777c1eca-e276-4de3-9523-ee72e0891b05","Type":"ContainerDied","Data":"b72ccf6553947605f5774dcf6cf52a379782aa9a503fecbf75403c346a256df5"} Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.187890 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-64b7bfc49d-sz57z" podStartSLOduration=6.187866689 podStartE2EDuration="6.187866689s" podCreationTimestamp="2026-03-14 08:50:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:50:20.175523976 +0000 UTC m=+1355.423975613" watchObservedRunningTime="2026-03-14 08:50:20.187866689 +0000 UTC m=+1355.436318326" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.248830 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf474a1d-5a57-4669-8ced-7d0c8decbd70-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"cf474a1d-5a57-4669-8ced-7d0c8decbd70\") " pod="openstack/watcher-api-0" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.248887 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/cf474a1d-5a57-4669-8ced-7d0c8decbd70-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"cf474a1d-5a57-4669-8ced-7d0c8decbd70\") " pod="openstack/watcher-api-0" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.248953 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cf474a1d-5a57-4669-8ced-7d0c8decbd70-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"cf474a1d-5a57-4669-8ced-7d0c8decbd70\") " pod="openstack/watcher-api-0" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.249000 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf474a1d-5a57-4669-8ced-7d0c8decbd70-public-tls-certs\") pod \"watcher-api-0\" (UID: \"cf474a1d-5a57-4669-8ced-7d0c8decbd70\") " pod="openstack/watcher-api-0" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.249021 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkz9x\" (UniqueName: \"kubernetes.io/projected/cf474a1d-5a57-4669-8ced-7d0c8decbd70-kube-api-access-pkz9x\") pod \"watcher-api-0\" (UID: \"cf474a1d-5a57-4669-8ced-7d0c8decbd70\") " pod="openstack/watcher-api-0" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.249053 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf474a1d-5a57-4669-8ced-7d0c8decbd70-config-data\") pod \"watcher-api-0\" (UID: \"cf474a1d-5a57-4669-8ced-7d0c8decbd70\") " pod="openstack/watcher-api-0" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.249105 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf474a1d-5a57-4669-8ced-7d0c8decbd70-logs\") pod \"watcher-api-0\" (UID: \"cf474a1d-5a57-4669-8ced-7d0c8decbd70\") " pod="openstack/watcher-api-0" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.250562 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf474a1d-5a57-4669-8ced-7d0c8decbd70-logs\") pod \"watcher-api-0\" (UID: \"cf474a1d-5a57-4669-8ced-7d0c8decbd70\") " pod="openstack/watcher-api-0" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.253697 
4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf474a1d-5a57-4669-8ced-7d0c8decbd70-public-tls-certs\") pod \"watcher-api-0\" (UID: \"cf474a1d-5a57-4669-8ced-7d0c8decbd70\") " pod="openstack/watcher-api-0" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.254431 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf474a1d-5a57-4669-8ced-7d0c8decbd70-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"cf474a1d-5a57-4669-8ced-7d0c8decbd70\") " pod="openstack/watcher-api-0" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.254641 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf474a1d-5a57-4669-8ced-7d0c8decbd70-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"cf474a1d-5a57-4669-8ced-7d0c8decbd70\") " pod="openstack/watcher-api-0" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.271001 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkz9x\" (UniqueName: \"kubernetes.io/projected/cf474a1d-5a57-4669-8ced-7d0c8decbd70-kube-api-access-pkz9x\") pod \"watcher-api-0\" (UID: \"cf474a1d-5a57-4669-8ced-7d0c8decbd70\") " pod="openstack/watcher-api-0" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.271079 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/cf474a1d-5a57-4669-8ced-7d0c8decbd70-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"cf474a1d-5a57-4669-8ced-7d0c8decbd70\") " pod="openstack/watcher-api-0" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.273915 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf474a1d-5a57-4669-8ced-7d0c8decbd70-config-data\") pod \"watcher-api-0\" (UID: 
\"cf474a1d-5a57-4669-8ced-7d0c8decbd70\") " pod="openstack/watcher-api-0" Mar 14 08:50:20 crc kubenswrapper[4886]: I0314 08:50:20.397089 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Mar 14 08:50:21 crc kubenswrapper[4886]: I0314 08:50:21.034671 4886 scope.go:117] "RemoveContainer" containerID="26048e0b3b800cf71b02ff1de10775ae3cc126327fcb934d7363c76de88f7810" Mar 14 08:50:21 crc kubenswrapper[4886]: I0314 08:50:21.190826 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-dc48c5bd6-xmnxc" event={"ID":"309604ae-1d2f-4f0a-9fa3-1960efc340b6","Type":"ContainerStarted","Data":"a616373ab59ef398c01351d155107c6c9b01b3aed0b945246ebcccc123ae1ce9"} Mar 14 08:50:21 crc kubenswrapper[4886]: I0314 08:50:21.192954 4886 generic.go:334] "Generic (PLEG): container finished" podID="281ddf53-b107-4afd-b616-101cd308433b" containerID="176bbbad339dc424dbd60e2b953e6aea13c9455cbe5452528488a5e7bb697004" exitCode=0 Mar 14 08:50:21 crc kubenswrapper[4886]: I0314 08:50:21.192984 4886 generic.go:334] "Generic (PLEG): container finished" podID="281ddf53-b107-4afd-b616-101cd308433b" containerID="2b84ba9e4b2f9ea2f1d5a951f9191bd98a5058bc92bdf171d98b0de0a8b3c821" exitCode=143 Mar 14 08:50:21 crc kubenswrapper[4886]: I0314 08:50:21.193022 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64b7bfc49d-sz57z" event={"ID":"281ddf53-b107-4afd-b616-101cd308433b","Type":"ContainerDied","Data":"176bbbad339dc424dbd60e2b953e6aea13c9455cbe5452528488a5e7bb697004"} Mar 14 08:50:21 crc kubenswrapper[4886]: I0314 08:50:21.193044 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-64b7bfc49d-sz57z" event={"ID":"281ddf53-b107-4afd-b616-101cd308433b","Type":"ContainerDied","Data":"2b84ba9e4b2f9ea2f1d5a951f9191bd98a5058bc92bdf171d98b0de0a8b3c821"} Mar 14 08:50:21 crc kubenswrapper[4886]: I0314 08:50:21.196465 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-869f94f9b-5hmcl" event={"ID":"78d6b750-e5ce-4784-a7ee-3930cb52b4c1","Type":"ContainerStarted","Data":"b03b475543435b63f8d37a03f85618180c40d36cb07385b6de5660bbd94b70e2"} Mar 14 08:50:21 crc kubenswrapper[4886]: I0314 08:50:21.196945 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-869f94f9b-5hmcl" Mar 14 08:50:21 crc kubenswrapper[4886]: I0314 08:50:21.196975 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-869f94f9b-5hmcl" Mar 14 08:50:21 crc kubenswrapper[4886]: I0314 08:50:21.215261 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-869f94f9b-5hmcl" podStartSLOduration=7.215242944 podStartE2EDuration="7.215242944s" podCreationTimestamp="2026-03-14 08:50:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:50:21.214574465 +0000 UTC m=+1356.463026102" watchObservedRunningTime="2026-03-14 08:50:21.215242944 +0000 UTC m=+1356.463694581" Mar 14 08:50:21 crc kubenswrapper[4886]: I0314 08:50:21.432447 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46fb99a1-d0f5-4538-80d4-9d17ffe6bed3" path="/var/lib/kubelet/pods/46fb99a1-d0f5-4538-80d4-9d17ffe6bed3/volumes" Mar 14 08:50:22 crc kubenswrapper[4886]: I0314 08:50:22.598322 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-64b7bfc49d-sz57z" Mar 14 08:50:22 crc kubenswrapper[4886]: I0314 08:50:22.700377 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/281ddf53-b107-4afd-b616-101cd308433b-config-data\") pod \"281ddf53-b107-4afd-b616-101cd308433b\" (UID: \"281ddf53-b107-4afd-b616-101cd308433b\") " Mar 14 08:50:22 crc kubenswrapper[4886]: I0314 08:50:22.700803 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/281ddf53-b107-4afd-b616-101cd308433b-logs\") pod \"281ddf53-b107-4afd-b616-101cd308433b\" (UID: \"281ddf53-b107-4afd-b616-101cd308433b\") " Mar 14 08:50:22 crc kubenswrapper[4886]: I0314 08:50:22.700914 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/281ddf53-b107-4afd-b616-101cd308433b-combined-ca-bundle\") pod \"281ddf53-b107-4afd-b616-101cd308433b\" (UID: \"281ddf53-b107-4afd-b616-101cd308433b\") " Mar 14 08:50:22 crc kubenswrapper[4886]: I0314 08:50:22.700974 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/281ddf53-b107-4afd-b616-101cd308433b-config-data-custom\") pod \"281ddf53-b107-4afd-b616-101cd308433b\" (UID: \"281ddf53-b107-4afd-b616-101cd308433b\") " Mar 14 08:50:22 crc kubenswrapper[4886]: I0314 08:50:22.701005 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfnct\" (UniqueName: \"kubernetes.io/projected/281ddf53-b107-4afd-b616-101cd308433b-kube-api-access-wfnct\") pod \"281ddf53-b107-4afd-b616-101cd308433b\" (UID: \"281ddf53-b107-4afd-b616-101cd308433b\") " Mar 14 08:50:22 crc kubenswrapper[4886]: I0314 08:50:22.702065 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/281ddf53-b107-4afd-b616-101cd308433b-logs" (OuterVolumeSpecName: "logs") pod "281ddf53-b107-4afd-b616-101cd308433b" (UID: "281ddf53-b107-4afd-b616-101cd308433b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:50:22 crc kubenswrapper[4886]: I0314 08:50:22.706578 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/281ddf53-b107-4afd-b616-101cd308433b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "281ddf53-b107-4afd-b616-101cd308433b" (UID: "281ddf53-b107-4afd-b616-101cd308433b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:22 crc kubenswrapper[4886]: I0314 08:50:22.707469 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/281ddf53-b107-4afd-b616-101cd308433b-kube-api-access-wfnct" (OuterVolumeSpecName: "kube-api-access-wfnct") pod "281ddf53-b107-4afd-b616-101cd308433b" (UID: "281ddf53-b107-4afd-b616-101cd308433b"). InnerVolumeSpecName "kube-api-access-wfnct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:50:22 crc kubenswrapper[4886]: I0314 08:50:22.731258 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/281ddf53-b107-4afd-b616-101cd308433b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "281ddf53-b107-4afd-b616-101cd308433b" (UID: "281ddf53-b107-4afd-b616-101cd308433b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:22 crc kubenswrapper[4886]: I0314 08:50:22.778770 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/281ddf53-b107-4afd-b616-101cd308433b-config-data" (OuterVolumeSpecName: "config-data") pod "281ddf53-b107-4afd-b616-101cd308433b" (UID: "281ddf53-b107-4afd-b616-101cd308433b"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:22 crc kubenswrapper[4886]: I0314 08:50:22.804692 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/281ddf53-b107-4afd-b616-101cd308433b-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:22 crc kubenswrapper[4886]: I0314 08:50:22.804735 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/281ddf53-b107-4afd-b616-101cd308433b-logs\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:22 crc kubenswrapper[4886]: I0314 08:50:22.804748 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/281ddf53-b107-4afd-b616-101cd308433b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:22 crc kubenswrapper[4886]: I0314 08:50:22.804765 4886 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/281ddf53-b107-4afd-b616-101cd308433b-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:22 crc kubenswrapper[4886]: I0314 08:50:22.804778 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfnct\" (UniqueName: \"kubernetes.io/projected/281ddf53-b107-4afd-b616-101cd308433b-kube-api-access-wfnct\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:22 crc kubenswrapper[4886]: I0314 08:50:22.984749 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Mar 14 08:50:23 crc kubenswrapper[4886]: I0314 08:50:23.216267 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8c271fab-7815-4aab-86c5-3e3919077e2e","Type":"ContainerStarted","Data":"1b4bce9c22bd8d147e8d50bf51f21a60bbb27d8a53b19b69ed2ada0d4426195e"} Mar 14 08:50:23 crc kubenswrapper[4886]: I0314 08:50:23.220378 4886 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/barbican-api-64b7bfc49d-sz57z" event={"ID":"281ddf53-b107-4afd-b616-101cd308433b","Type":"ContainerDied","Data":"e1059e46606c1061cbf2fbae1ddcab6f6c0cd4762909cc9fcd5795a5c24d570f"} Mar 14 08:50:23 crc kubenswrapper[4886]: I0314 08:50:23.220429 4886 scope.go:117] "RemoveContainer" containerID="176bbbad339dc424dbd60e2b953e6aea13c9455cbe5452528488a5e7bb697004" Mar 14 08:50:23 crc kubenswrapper[4886]: I0314 08:50:23.220707 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-64b7bfc49d-sz57z" Mar 14 08:50:23 crc kubenswrapper[4886]: I0314 08:50:23.225088 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fd5f87754-lf26d" event={"ID":"00edb071-fbb5-4e88-8370-a2c76ad13a6c","Type":"ContainerStarted","Data":"33ef06fc60d6075241b9c87a3f787f8ce5d96bfd1cd8bd009555ebed64f9a14e"} Mar 14 08:50:23 crc kubenswrapper[4886]: I0314 08:50:23.228686 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-57dfd898bd-kzdvs" event={"ID":"c7681da8-2f5d-4ac8-ae5e-6549a3d3f764","Type":"ContainerStarted","Data":"50b561675a82fbdacea8fe79cf5da5c983dcf0e849a6bde2312fb121d312d6f1"} Mar 14 08:50:23 crc kubenswrapper[4886]: I0314 08:50:23.253446 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.253424232 podStartE2EDuration="9.253424232s" podCreationTimestamp="2026-03-14 08:50:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:50:23.246677654 +0000 UTC m=+1358.495129301" watchObservedRunningTime="2026-03-14 08:50:23.253424232 +0000 UTC m=+1358.501875879" Mar 14 08:50:23 crc kubenswrapper[4886]: I0314 08:50:23.279013 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-64b7bfc49d-sz57z"] Mar 14 08:50:23 crc kubenswrapper[4886]: I0314 
08:50:23.288556 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-64b7bfc49d-sz57z"] Mar 14 08:50:23 crc kubenswrapper[4886]: I0314 08:50:23.434595 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="281ddf53-b107-4afd-b616-101cd308433b" path="/var/lib/kubelet/pods/281ddf53-b107-4afd-b616-101cd308433b/volumes" Mar 14 08:50:23 crc kubenswrapper[4886]: I0314 08:50:23.499784 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 14 08:50:23 crc kubenswrapper[4886]: I0314 08:50:23.499871 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 14 08:50:23 crc kubenswrapper[4886]: I0314 08:50:23.551247 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 14 08:50:23 crc kubenswrapper[4886]: I0314 08:50:23.552692 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 14 08:50:24 crc kubenswrapper[4886]: I0314 08:50:24.248287 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 14 08:50:24 crc kubenswrapper[4886]: I0314 08:50:24.248334 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 14 08:50:24 crc kubenswrapper[4886]: I0314 08:50:24.528166 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-869f94f9b-5hmcl" Mar 14 08:50:25 crc kubenswrapper[4886]: I0314 08:50:25.537535 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Mar 14 08:50:25 crc kubenswrapper[4886]: I0314 08:50:25.537571 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" 
Mar 14 08:50:25 crc kubenswrapper[4886]: I0314 08:50:25.537582 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 14 08:50:25 crc kubenswrapper[4886]: I0314 08:50:25.537648 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Mar 14 08:50:25 crc kubenswrapper[4886]: I0314 08:50:25.537696 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 14 08:50:25 crc kubenswrapper[4886]: I0314 08:50:25.538668 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 14 08:50:25 crc kubenswrapper[4886]: I0314 08:50:25.578888 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 14 08:50:25 crc kubenswrapper[4886]: I0314 08:50:25.596594 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Mar 14 08:50:26 crc kubenswrapper[4886]: I0314 08:50:26.066796 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:50:26 crc kubenswrapper[4886]: I0314 08:50:26.066903 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:50:26 crc kubenswrapper[4886]: I0314 08:50:26.067576 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 08:50:26 crc kubenswrapper[4886]: I0314 08:50:26.069740 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"55e54dc7fbdd549134d3b20dfd9642dda565b3dd8cfe4e3b853534c01d92f8db"} pod="openshift-machine-config-operator/machine-config-daemon-ddctv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 08:50:26 crc kubenswrapper[4886]: I0314 08:50:26.069845 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" containerID="cri-o://55e54dc7fbdd549134d3b20dfd9642dda565b3dd8cfe4e3b853534c01d92f8db" gracePeriod=600 Mar 14 08:50:26 crc kubenswrapper[4886]: I0314 08:50:26.505268 4886 generic.go:334] "Generic (PLEG): container finished" podID="14718224-eaad-4caf-b13b-a60a9c2a9460" containerID="8e54e288669ae4cdf21748f2471236e2b5b8a8da57d9ac2835880115c4de79e7" exitCode=0 Mar 14 08:50:26 crc kubenswrapper[4886]: I0314 08:50:26.505500 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6frh2" event={"ID":"14718224-eaad-4caf-b13b-a60a9c2a9460","Type":"ContainerDied","Data":"8e54e288669ae4cdf21748f2471236e2b5b8a8da57d9ac2835880115c4de79e7"} Mar 14 08:50:26 crc kubenswrapper[4886]: I0314 08:50:26.509002 4886 generic.go:334] "Generic (PLEG): container finished" podID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerID="55e54dc7fbdd549134d3b20dfd9642dda565b3dd8cfe4e3b853534c01d92f8db" exitCode=0 Mar 14 08:50:26 crc kubenswrapper[4886]: I0314 08:50:26.509066 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" 
event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerDied","Data":"55e54dc7fbdd549134d3b20dfd9642dda565b3dd8cfe4e3b853534c01d92f8db"} Mar 14 08:50:26 crc kubenswrapper[4886]: I0314 08:50:26.510274 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 14 08:50:27 crc kubenswrapper[4886]: W0314 08:50:27.122043 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf474a1d_5a57_4669_8ced_7d0c8decbd70.slice/crio-5891c4c534b94e7f0ee88a8d61901537a6d31d46c1b8b4123c300db812e3797d WatchSource:0}: Error finding container 5891c4c534b94e7f0ee88a8d61901537a6d31d46c1b8b4123c300db812e3797d: Status 404 returned error can't find the container with id 5891c4c534b94e7f0ee88a8d61901537a6d31d46c1b8b4123c300db812e3797d Mar 14 08:50:27 crc kubenswrapper[4886]: I0314 08:50:27.144388 4886 scope.go:117] "RemoveContainer" containerID="2b84ba9e4b2f9ea2f1d5a951f9191bd98a5058bc92bdf171d98b0de0a8b3c821" Mar 14 08:50:27 crc kubenswrapper[4886]: I0314 08:50:27.524565 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"cf474a1d-5a57-4669-8ced-7d0c8decbd70","Type":"ContainerStarted","Data":"5891c4c534b94e7f0ee88a8d61901537a6d31d46c1b8b4123c300db812e3797d"} Mar 14 08:50:27 crc kubenswrapper[4886]: I0314 08:50:27.528168 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-4bfks" event={"ID":"777c1eca-e276-4de3-9523-ee72e0891b05","Type":"ContainerStarted","Data":"7a836cec3b9c03df1808d537afe3bf262731f5c48a27049eae8803aa4347762d"} Mar 14 08:50:27 crc kubenswrapper[4886]: I0314 08:50:27.528235 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-688c87cc99-4bfks" Mar 14 08:50:27 crc kubenswrapper[4886]: I0314 08:50:27.537006 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-57dfd898bd-kzdvs" 
event={"ID":"c7681da8-2f5d-4ac8-ae5e-6549a3d3f764","Type":"ContainerStarted","Data":"aec86215401013775fbcd84eb2cb3f6a9d8557876412b47974da407597a297a4"} Mar 14 08:50:27 crc kubenswrapper[4886]: I0314 08:50:27.537283 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-57dfd898bd-kzdvs" Mar 14 08:50:27 crc kubenswrapper[4886]: I0314 08:50:27.539005 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-57dfd898bd-kzdvs" podUID="c7681da8-2f5d-4ac8-ae5e-6549a3d3f764" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.188:8778/\": dial tcp 10.217.0.188:8778: connect: connection refused" Mar 14 08:50:27 crc kubenswrapper[4886]: I0314 08:50:27.539397 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-dc48c5bd6-xmnxc" event={"ID":"309604ae-1d2f-4f0a-9fa3-1960efc340b6","Type":"ContainerStarted","Data":"493a54b99c632237400fdd30b29c6199b0d73e1108a43f11cc2a4360e1fde41c"} Mar 14 08:50:27 crc kubenswrapper[4886]: I0314 08:50:27.539461 4886 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 08:50:27 crc kubenswrapper[4886]: I0314 08:50:27.556972 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-688c87cc99-4bfks" podStartSLOduration=13.556951993 podStartE2EDuration="13.556951993s" podCreationTimestamp="2026-03-14 08:50:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:50:27.550389326 +0000 UTC m=+1362.798840983" watchObservedRunningTime="2026-03-14 08:50:27.556951993 +0000 UTC m=+1362.805403630" Mar 14 08:50:27 crc kubenswrapper[4886]: I0314 08:50:27.584868 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-57dfd898bd-kzdvs" podStartSLOduration=12.584850685 podStartE2EDuration="12.584850685s" podCreationTimestamp="2026-03-14 
08:50:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:50:27.577181797 +0000 UTC m=+1362.825633434" watchObservedRunningTime="2026-03-14 08:50:27.584850685 +0000 UTC m=+1362.833302322" Mar 14 08:50:28 crc kubenswrapper[4886]: I0314 08:50:28.146663 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-66c6bc56b6-25jn4" podUID="3f8100ac-c606-4eb3-afd6-07be9de44f42" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.168:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.168:8443: connect: connection refused" Mar 14 08:50:28 crc kubenswrapper[4886]: E0314 08:50:28.183640 4886 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e1778bedcc348d4d67175df99f9347a63ee4172f4ef0229c5b4f4bc1b3a5f4d6" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Mar 14 08:50:28 crc kubenswrapper[4886]: E0314 08:50:28.184855 4886 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e1778bedcc348d4d67175df99f9347a63ee4172f4ef0229c5b4f4bc1b3a5f4d6" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Mar 14 08:50:28 crc kubenswrapper[4886]: E0314 08:50:28.186534 4886 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e1778bedcc348d4d67175df99f9347a63ee4172f4ef0229c5b4f4bc1b3a5f4d6" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Mar 14 08:50:28 crc kubenswrapper[4886]: E0314 08:50:28.186568 4886 prober.go:104] "Probe errored" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-decision-engine-0" podUID="b41275dd-03d8-40b8-9f06-0dc67ecb12e6" containerName="watcher-decision-engine" Mar 14 08:50:28 crc kubenswrapper[4886]: I0314 08:50:28.268074 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7769c88f5b-8gr9x" podUID="46272ed5-a9f5-45eb-b9ba-58289ed822a7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.169:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.169:8443: connect: connection refused" Mar 14 08:50:28 crc kubenswrapper[4886]: I0314 08:50:28.362697 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 14 08:50:28 crc kubenswrapper[4886]: I0314 08:50:28.375258 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 14 08:50:28 crc kubenswrapper[4886]: I0314 08:50:28.375373 4886 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 08:50:28 crc kubenswrapper[4886]: I0314 08:50:28.376767 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 14 08:50:28 crc kubenswrapper[4886]: I0314 08:50:28.566508 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-57dfd898bd-kzdvs" Mar 14 08:50:29 crc kubenswrapper[4886]: I0314 08:50:29.569482 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fd5f87754-lf26d" event={"ID":"00edb071-fbb5-4e88-8370-a2c76ad13a6c","Type":"ContainerStarted","Data":"8c743e44789d946fe248b9fb8b21d0de73cd5e05715b311be36c45268e7aad22"} Mar 14 08:50:29 crc kubenswrapper[4886]: I0314 08:50:29.569959 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/barbican-api-6fd5f87754-lf26d" Mar 14 08:50:29 crc kubenswrapper[4886]: I0314 08:50:29.570348 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6fd5f87754-lf26d" Mar 14 08:50:29 crc kubenswrapper[4886]: I0314 08:50:29.596771 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6fd5f87754-lf26d" podStartSLOduration=14.596745912 podStartE2EDuration="14.596745912s" podCreationTimestamp="2026-03-14 08:50:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:50:29.591646637 +0000 UTC m=+1364.840098284" watchObservedRunningTime="2026-03-14 08:50:29.596745912 +0000 UTC m=+1364.845197549" Mar 14 08:50:29 crc kubenswrapper[4886]: I0314 08:50:29.697622 4886 scope.go:117] "RemoveContainer" containerID="7738a099ca236f81766457fa9d5c5fb3f046ded018935c8fbb545666a40042f4" Mar 14 08:50:29 crc kubenswrapper[4886]: I0314 08:50:29.831885 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-6frh2" Mar 14 08:50:29 crc kubenswrapper[4886]: I0314 08:50:29.937170 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4cjt\" (UniqueName: \"kubernetes.io/projected/14718224-eaad-4caf-b13b-a60a9c2a9460-kube-api-access-d4cjt\") pod \"14718224-eaad-4caf-b13b-a60a9c2a9460\" (UID: \"14718224-eaad-4caf-b13b-a60a9c2a9460\") " Mar 14 08:50:29 crc kubenswrapper[4886]: I0314 08:50:29.937337 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14718224-eaad-4caf-b13b-a60a9c2a9460-etc-machine-id\") pod \"14718224-eaad-4caf-b13b-a60a9c2a9460\" (UID: \"14718224-eaad-4caf-b13b-a60a9c2a9460\") " Mar 14 08:50:29 crc kubenswrapper[4886]: I0314 08:50:29.937413 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14718224-eaad-4caf-b13b-a60a9c2a9460-config-data\") pod \"14718224-eaad-4caf-b13b-a60a9c2a9460\" (UID: \"14718224-eaad-4caf-b13b-a60a9c2a9460\") " Mar 14 08:50:29 crc kubenswrapper[4886]: I0314 08:50:29.937466 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14718224-eaad-4caf-b13b-a60a9c2a9460-combined-ca-bundle\") pod \"14718224-eaad-4caf-b13b-a60a9c2a9460\" (UID: \"14718224-eaad-4caf-b13b-a60a9c2a9460\") " Mar 14 08:50:29 crc kubenswrapper[4886]: I0314 08:50:29.937495 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14718224-eaad-4caf-b13b-a60a9c2a9460-db-sync-config-data\") pod \"14718224-eaad-4caf-b13b-a60a9c2a9460\" (UID: \"14718224-eaad-4caf-b13b-a60a9c2a9460\") " Mar 14 08:50:29 crc kubenswrapper[4886]: I0314 08:50:29.937510 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/14718224-eaad-4caf-b13b-a60a9c2a9460-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "14718224-eaad-4caf-b13b-a60a9c2a9460" (UID: "14718224-eaad-4caf-b13b-a60a9c2a9460"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:50:29 crc kubenswrapper[4886]: I0314 08:50:29.937596 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14718224-eaad-4caf-b13b-a60a9c2a9460-scripts\") pod \"14718224-eaad-4caf-b13b-a60a9c2a9460\" (UID: \"14718224-eaad-4caf-b13b-a60a9c2a9460\") " Mar 14 08:50:29 crc kubenswrapper[4886]: I0314 08:50:29.938198 4886 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14718224-eaad-4caf-b13b-a60a9c2a9460-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:29 crc kubenswrapper[4886]: I0314 08:50:29.944237 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14718224-eaad-4caf-b13b-a60a9c2a9460-kube-api-access-d4cjt" (OuterVolumeSpecName: "kube-api-access-d4cjt") pod "14718224-eaad-4caf-b13b-a60a9c2a9460" (UID: "14718224-eaad-4caf-b13b-a60a9c2a9460"). InnerVolumeSpecName "kube-api-access-d4cjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:50:29 crc kubenswrapper[4886]: I0314 08:50:29.944284 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14718224-eaad-4caf-b13b-a60a9c2a9460-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "14718224-eaad-4caf-b13b-a60a9c2a9460" (UID: "14718224-eaad-4caf-b13b-a60a9c2a9460"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:29 crc kubenswrapper[4886]: I0314 08:50:29.953257 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14718224-eaad-4caf-b13b-a60a9c2a9460-scripts" (OuterVolumeSpecName: "scripts") pod "14718224-eaad-4caf-b13b-a60a9c2a9460" (UID: "14718224-eaad-4caf-b13b-a60a9c2a9460"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:29 crc kubenswrapper[4886]: I0314 08:50:29.984215 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14718224-eaad-4caf-b13b-a60a9c2a9460-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14718224-eaad-4caf-b13b-a60a9c2a9460" (UID: "14718224-eaad-4caf-b13b-a60a9c2a9460"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:29 crc kubenswrapper[4886]: I0314 08:50:29.987180 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-57dfd898bd-kzdvs" Mar 14 08:50:30 crc kubenswrapper[4886]: I0314 08:50:30.012653 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14718224-eaad-4caf-b13b-a60a9c2a9460-config-data" (OuterVolumeSpecName: "config-data") pod "14718224-eaad-4caf-b13b-a60a9c2a9460" (UID: "14718224-eaad-4caf-b13b-a60a9c2a9460"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:30 crc kubenswrapper[4886]: I0314 08:50:30.039590 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14718224-eaad-4caf-b13b-a60a9c2a9460-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:30 crc kubenswrapper[4886]: I0314 08:50:30.039618 4886 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/14718224-eaad-4caf-b13b-a60a9c2a9460-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:30 crc kubenswrapper[4886]: I0314 08:50:30.039627 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14718224-eaad-4caf-b13b-a60a9c2a9460-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:30 crc kubenswrapper[4886]: I0314 08:50:30.039635 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4cjt\" (UniqueName: \"kubernetes.io/projected/14718224-eaad-4caf-b13b-a60a9c2a9460-kube-api-access-d4cjt\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:30 crc kubenswrapper[4886]: I0314 08:50:30.039648 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14718224-eaad-4caf-b13b-a60a9c2a9460-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:30 crc kubenswrapper[4886]: I0314 08:50:30.244223 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 14 08:50:30 crc kubenswrapper[4886]: E0314 08:50:30.532286 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="a9eb9137-a021-4ea6-a4a4-871cf81af732" Mar 14 08:50:30 crc kubenswrapper[4886]: I0314 08:50:30.609389 4886 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-d4575995d-lfmv5" event={"ID":"7a298745-dd74-4ed3-b21b-648f2adb47dc","Type":"ContainerStarted","Data":"a774d7ecff810007d347300dd5019b4603ff3a9eff33e055d7e88e41b6cbfc9e"} Mar 14 08:50:30 crc kubenswrapper[4886]: I0314 08:50:30.640163 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-dc48c5bd6-xmnxc" event={"ID":"309604ae-1d2f-4f0a-9fa3-1960efc340b6","Type":"ContainerStarted","Data":"790d942e1a607da735e1147982af3c43fef25e2a6e99e960842bf9db3ee3d400"} Mar 14 08:50:30 crc kubenswrapper[4886]: I0314 08:50:30.641289 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-dc48c5bd6-xmnxc" Mar 14 08:50:30 crc kubenswrapper[4886]: I0314 08:50:30.641337 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-dc48c5bd6-xmnxc" Mar 14 08:50:30 crc kubenswrapper[4886]: I0314 08:50:30.644442 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-64796d9fb-7nw8p" Mar 14 08:50:30 crc kubenswrapper[4886]: I0314 08:50:30.653535 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8557ccd47-58ztp" event={"ID":"42ce024b-4e1d-4f45-9faa-5f637e5a8466","Type":"ContainerStarted","Data":"3c3b22c4b34d631cc4c1bddaea5888c8222acee92ab516e18e6e44de1b6dbe6b"} Mar 14 08:50:30 crc kubenswrapper[4886]: I0314 08:50:30.656972 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6frh2" event={"ID":"14718224-eaad-4caf-b13b-a60a9c2a9460","Type":"ContainerDied","Data":"ff144da160e7ce05c94bd894f4fbbd4be1875b6704b538dee35c710ce46e4f39"} Mar 14 08:50:30 crc kubenswrapper[4886]: I0314 08:50:30.657010 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff144da160e7ce05c94bd894f4fbbd4be1875b6704b538dee35c710ce46e4f39" Mar 14 08:50:30 crc kubenswrapper[4886]: I0314 
08:50:30.657090 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6frh2" Mar 14 08:50:30 crc kubenswrapper[4886]: I0314 08:50:30.676367 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-dc48c5bd6-xmnxc" podStartSLOduration=12.676352493 podStartE2EDuration="12.676352493s" podCreationTimestamp="2026-03-14 08:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:50:30.669966891 +0000 UTC m=+1365.918418528" watchObservedRunningTime="2026-03-14 08:50:30.676352493 +0000 UTC m=+1365.924804130" Mar 14 08:50:30 crc kubenswrapper[4886]: I0314 08:50:30.688057 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-756657585d-2x84b" event={"ID":"46e268aa-326d-42d7-936d-3e4d120dfeb6","Type":"ContainerStarted","Data":"0b7f38a51956000b487b00e275de17ffce9c3929647848886d4805a773614208"} Mar 14 08:50:30 crc kubenswrapper[4886]: I0314 08:50:30.712386 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"cf474a1d-5a57-4669-8ced-7d0c8decbd70","Type":"ContainerStarted","Data":"842516f8993b621525ead22d1440f518692cb3e0b21b0bef4a6268bef8532042"} Mar 14 08:50:30 crc kubenswrapper[4886]: I0314 08:50:30.719658 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5c6c468c99-bvnb4" event={"ID":"1e54e03a-ce8b-4f7d-a664-cc11daa6c786","Type":"ContainerStarted","Data":"e7961e21a3760933ecc66f9afc836f2931d46adf93c83f5459e2aedfe0f1cd2a"} Mar 14 08:50:30 crc kubenswrapper[4886]: I0314 08:50:30.743451 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerStarted","Data":"7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc"} Mar 
14 08:50:30 crc kubenswrapper[4886]: I0314 08:50:30.753674 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a9eb9137-a021-4ea6-a4a4-871cf81af732" containerName="ceilometer-notification-agent" containerID="cri-o://c8926f738347aa1d97166c7961400e4ac904b525b9b82069b40c04bfe819735d" gracePeriod=30
Mar 14 08:50:30 crc kubenswrapper[4886]: I0314 08:50:30.754071 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9eb9137-a021-4ea6-a4a4-871cf81af732","Type":"ContainerStarted","Data":"9522fad26482fa71d5234f7635d40931c03b9bdf32d02478e001e4b872eac5ce"}
Mar 14 08:50:30 crc kubenswrapper[4886]: I0314 08:50:30.754806 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 14 08:50:30 crc kubenswrapper[4886]: I0314 08:50:30.754908 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a9eb9137-a021-4ea6-a4a4-871cf81af732" containerName="proxy-httpd" containerID="cri-o://9522fad26482fa71d5234f7635d40931c03b9bdf32d02478e001e4b872eac5ce" gracePeriod=30
Mar 14 08:50:30 crc kubenswrapper[4886]: I0314 08:50:30.754987 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a9eb9137-a021-4ea6-a4a4-871cf81af732" containerName="sg-core" containerID="cri-o://367f53ab04c0e461f4791a68acf7e0404c669ac72803a129efad6ed8cc70f869" gracePeriod=30
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.090673 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7c5789dc8f-4vv5f"]
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.091193 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7c5789dc8f-4vv5f" podUID="9f9329a3-9a35-49a2-86ce-435b98d280f3" containerName="neutron-api" containerID="cri-o://ddfe2022d1041feed4b3664b5ffa4eac881559b738c1046c39be568571c63d8f" gracePeriod=30
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.092044 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7c5789dc8f-4vv5f" podUID="9f9329a3-9a35-49a2-86ce-435b98d280f3" containerName="neutron-httpd" containerID="cri-o://34737c15ff4a6a3340677a2fd809063ddbd01fc9f3a607e718c5b2480379f86d" gracePeriod=30
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.109170 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5694fd5cb9-r8nzz"]
Mar 14 08:50:31 crc kubenswrapper[4886]: E0314 08:50:31.109898 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="281ddf53-b107-4afd-b616-101cd308433b" containerName="barbican-api-log"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.109978 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="281ddf53-b107-4afd-b616-101cd308433b" containerName="barbican-api-log"
Mar 14 08:50:31 crc kubenswrapper[4886]: E0314 08:50:31.110095 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14718224-eaad-4caf-b13b-a60a9c2a9460" containerName="cinder-db-sync"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.110199 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="14718224-eaad-4caf-b13b-a60a9c2a9460" containerName="cinder-db-sync"
Mar 14 08:50:31 crc kubenswrapper[4886]: E0314 08:50:31.110253 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="281ddf53-b107-4afd-b616-101cd308433b" containerName="barbican-api"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.110308 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="281ddf53-b107-4afd-b616-101cd308433b" containerName="barbican-api"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.110553 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="281ddf53-b107-4afd-b616-101cd308433b" containerName="barbican-api"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.110620 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="281ddf53-b107-4afd-b616-101cd308433b" containerName="barbican-api-log"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.110685 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="14718224-eaad-4caf-b13b-a60a9c2a9460" containerName="cinder-db-sync"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.111792 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5694fd5cb9-r8nzz"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.131693 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5694fd5cb9-r8nzz"]
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.163285 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7c5789dc8f-4vv5f" podUID="9f9329a3-9a35-49a2-86ce-435b98d280f3" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.175:9696/\": EOF"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.183220 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.184944 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.192761 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.193013 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.193051 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-wkpjr"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.193195 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.195693 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5055b057-f745-43b5-8bd2-937ed8b29743-ovndb-tls-certs\") pod \"neutron-5694fd5cb9-r8nzz\" (UID: \"5055b057-f745-43b5-8bd2-937ed8b29743\") " pod="openstack/neutron-5694fd5cb9-r8nzz"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.195749 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5055b057-f745-43b5-8bd2-937ed8b29743-internal-tls-certs\") pod \"neutron-5694fd5cb9-r8nzz\" (UID: \"5055b057-f745-43b5-8bd2-937ed8b29743\") " pod="openstack/neutron-5694fd5cb9-r8nzz"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.195809 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5055b057-f745-43b5-8bd2-937ed8b29743-httpd-config\") pod \"neutron-5694fd5cb9-r8nzz\" (UID: \"5055b057-f745-43b5-8bd2-937ed8b29743\") " pod="openstack/neutron-5694fd5cb9-r8nzz"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.195930 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5055b057-f745-43b5-8bd2-937ed8b29743-public-tls-certs\") pod \"neutron-5694fd5cb9-r8nzz\" (UID: \"5055b057-f745-43b5-8bd2-937ed8b29743\") " pod="openstack/neutron-5694fd5cb9-r8nzz"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.195983 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5055b057-f745-43b5-8bd2-937ed8b29743-combined-ca-bundle\") pod \"neutron-5694fd5cb9-r8nzz\" (UID: \"5055b057-f745-43b5-8bd2-937ed8b29743\") " pod="openstack/neutron-5694fd5cb9-r8nzz"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.196001 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8hgd\" (UniqueName: \"kubernetes.io/projected/5055b057-f745-43b5-8bd2-937ed8b29743-kube-api-access-q8hgd\") pod \"neutron-5694fd5cb9-r8nzz\" (UID: \"5055b057-f745-43b5-8bd2-937ed8b29743\") " pod="openstack/neutron-5694fd5cb9-r8nzz"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.196098 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5055b057-f745-43b5-8bd2-937ed8b29743-config\") pod \"neutron-5694fd5cb9-r8nzz\" (UID: \"5055b057-f745-43b5-8bd2-937ed8b29743\") " pod="openstack/neutron-5694fd5cb9-r8nzz"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.240956 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.299370 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-4bfks"]
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.299586 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-688c87cc99-4bfks" podUID="777c1eca-e276-4de3-9523-ee72e0891b05" containerName="dnsmasq-dns" containerID="cri-o://7a836cec3b9c03df1808d537afe3bf262731f5c48a27049eae8803aa4347762d" gracePeriod=10
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.301231 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5055b057-f745-43b5-8bd2-937ed8b29743-httpd-config\") pod \"neutron-5694fd5cb9-r8nzz\" (UID: \"5055b057-f745-43b5-8bd2-937ed8b29743\") " pod="openstack/neutron-5694fd5cb9-r8nzz"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.301352 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5055b057-f745-43b5-8bd2-937ed8b29743-public-tls-certs\") pod \"neutron-5694fd5cb9-r8nzz\" (UID: \"5055b057-f745-43b5-8bd2-937ed8b29743\") " pod="openstack/neutron-5694fd5cb9-r8nzz"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.301383 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a745438-cb17-4626-96ed-51c7de75a976-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7a745438-cb17-4626-96ed-51c7de75a976\") " pod="openstack/cinder-scheduler-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.301407 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sgn9\" (UniqueName: \"kubernetes.io/projected/7a745438-cb17-4626-96ed-51c7de75a976-kube-api-access-2sgn9\") pod \"cinder-scheduler-0\" (UID: \"7a745438-cb17-4626-96ed-51c7de75a976\") " pod="openstack/cinder-scheduler-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.301438 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a745438-cb17-4626-96ed-51c7de75a976-config-data\") pod \"cinder-scheduler-0\" (UID: \"7a745438-cb17-4626-96ed-51c7de75a976\") " pod="openstack/cinder-scheduler-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.301467 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5055b057-f745-43b5-8bd2-937ed8b29743-combined-ca-bundle\") pod \"neutron-5694fd5cb9-r8nzz\" (UID: \"5055b057-f745-43b5-8bd2-937ed8b29743\") " pod="openstack/neutron-5694fd5cb9-r8nzz"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.301489 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8hgd\" (UniqueName: \"kubernetes.io/projected/5055b057-f745-43b5-8bd2-937ed8b29743-kube-api-access-q8hgd\") pod \"neutron-5694fd5cb9-r8nzz\" (UID: \"5055b057-f745-43b5-8bd2-937ed8b29743\") " pod="openstack/neutron-5694fd5cb9-r8nzz"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.301519 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a745438-cb17-4626-96ed-51c7de75a976-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7a745438-cb17-4626-96ed-51c7de75a976\") " pod="openstack/cinder-scheduler-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.301566 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5055b057-f745-43b5-8bd2-937ed8b29743-config\") pod \"neutron-5694fd5cb9-r8nzz\" (UID: \"5055b057-f745-43b5-8bd2-937ed8b29743\") " pod="openstack/neutron-5694fd5cb9-r8nzz"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.301585 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a745438-cb17-4626-96ed-51c7de75a976-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7a745438-cb17-4626-96ed-51c7de75a976\") " pod="openstack/cinder-scheduler-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.301599 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a745438-cb17-4626-96ed-51c7de75a976-scripts\") pod \"cinder-scheduler-0\" (UID: \"7a745438-cb17-4626-96ed-51c7de75a976\") " pod="openstack/cinder-scheduler-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.301632 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5055b057-f745-43b5-8bd2-937ed8b29743-ovndb-tls-certs\") pod \"neutron-5694fd5cb9-r8nzz\" (UID: \"5055b057-f745-43b5-8bd2-937ed8b29743\") " pod="openstack/neutron-5694fd5cb9-r8nzz"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.301654 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5055b057-f745-43b5-8bd2-937ed8b29743-internal-tls-certs\") pod \"neutron-5694fd5cb9-r8nzz\" (UID: \"5055b057-f745-43b5-8bd2-937ed8b29743\") " pod="openstack/neutron-5694fd5cb9-r8nzz"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.318328 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5055b057-f745-43b5-8bd2-937ed8b29743-public-tls-certs\") pod \"neutron-5694fd5cb9-r8nzz\" (UID: \"5055b057-f745-43b5-8bd2-937ed8b29743\") " pod="openstack/neutron-5694fd5cb9-r8nzz"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.350227 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5055b057-f745-43b5-8bd2-937ed8b29743-httpd-config\") pod \"neutron-5694fd5cb9-r8nzz\" (UID: \"5055b057-f745-43b5-8bd2-937ed8b29743\") " pod="openstack/neutron-5694fd5cb9-r8nzz"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.354779 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5055b057-f745-43b5-8bd2-937ed8b29743-ovndb-tls-certs\") pod \"neutron-5694fd5cb9-r8nzz\" (UID: \"5055b057-f745-43b5-8bd2-937ed8b29743\") " pod="openstack/neutron-5694fd5cb9-r8nzz"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.355891 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8hgd\" (UniqueName: \"kubernetes.io/projected/5055b057-f745-43b5-8bd2-937ed8b29743-kube-api-access-q8hgd\") pod \"neutron-5694fd5cb9-r8nzz\" (UID: \"5055b057-f745-43b5-8bd2-937ed8b29743\") " pod="openstack/neutron-5694fd5cb9-r8nzz"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.366896 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5055b057-f745-43b5-8bd2-937ed8b29743-combined-ca-bundle\") pod \"neutron-5694fd5cb9-r8nzz\" (UID: \"5055b057-f745-43b5-8bd2-937ed8b29743\") " pod="openstack/neutron-5694fd5cb9-r8nzz"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.392006 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-477j6"]
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.393528 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5055b057-f745-43b5-8bd2-937ed8b29743-config\") pod \"neutron-5694fd5cb9-r8nzz\" (UID: \"5055b057-f745-43b5-8bd2-937ed8b29743\") " pod="openstack/neutron-5694fd5cb9-r8nzz"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.393666 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-477j6"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.398595 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5055b057-f745-43b5-8bd2-937ed8b29743-internal-tls-certs\") pod \"neutron-5694fd5cb9-r8nzz\" (UID: \"5055b057-f745-43b5-8bd2-937ed8b29743\") " pod="openstack/neutron-5694fd5cb9-r8nzz"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.411660 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-477j6"]
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.414170 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a745438-cb17-4626-96ed-51c7de75a976-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7a745438-cb17-4626-96ed-51c7de75a976\") " pod="openstack/cinder-scheduler-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.414221 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sgn9\" (UniqueName: \"kubernetes.io/projected/7a745438-cb17-4626-96ed-51c7de75a976-kube-api-access-2sgn9\") pod \"cinder-scheduler-0\" (UID: \"7a745438-cb17-4626-96ed-51c7de75a976\") " pod="openstack/cinder-scheduler-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.414261 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a745438-cb17-4626-96ed-51c7de75a976-config-data\") pod \"cinder-scheduler-0\" (UID: \"7a745438-cb17-4626-96ed-51c7de75a976\") " pod="openstack/cinder-scheduler-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.414300 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a745438-cb17-4626-96ed-51c7de75a976-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7a745438-cb17-4626-96ed-51c7de75a976\") " pod="openstack/cinder-scheduler-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.414366 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a745438-cb17-4626-96ed-51c7de75a976-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7a745438-cb17-4626-96ed-51c7de75a976\") " pod="openstack/cinder-scheduler-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.414391 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a745438-cb17-4626-96ed-51c7de75a976-scripts\") pod \"cinder-scheduler-0\" (UID: \"7a745438-cb17-4626-96ed-51c7de75a976\") " pod="openstack/cinder-scheduler-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.422672 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a745438-cb17-4626-96ed-51c7de75a976-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7a745438-cb17-4626-96ed-51c7de75a976\") " pod="openstack/cinder-scheduler-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.431013 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a745438-cb17-4626-96ed-51c7de75a976-config-data\") pod \"cinder-scheduler-0\" (UID: \"7a745438-cb17-4626-96ed-51c7de75a976\") " pod="openstack/cinder-scheduler-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.451808 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a745438-cb17-4626-96ed-51c7de75a976-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7a745438-cb17-4626-96ed-51c7de75a976\") " pod="openstack/cinder-scheduler-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.475340 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sgn9\" (UniqueName: \"kubernetes.io/projected/7a745438-cb17-4626-96ed-51c7de75a976-kube-api-access-2sgn9\") pod \"cinder-scheduler-0\" (UID: \"7a745438-cb17-4626-96ed-51c7de75a976\") " pod="openstack/cinder-scheduler-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.476622 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a745438-cb17-4626-96ed-51c7de75a976-scripts\") pod \"cinder-scheduler-0\" (UID: \"7a745438-cb17-4626-96ed-51c7de75a976\") " pod="openstack/cinder-scheduler-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.476881 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5694fd5cb9-r8nzz"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.493013 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.510112 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a745438-cb17-4626-96ed-51c7de75a976-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7a745438-cb17-4626-96ed-51c7de75a976\") " pod="openstack/cinder-scheduler-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.512633 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.512751 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.516706 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.516741 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.517893 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-config\") pod \"dnsmasq-dns-6bb4fc677f-477j6\" (UID: \"711130ba-065d-49aa-92cd-f687292f1674\") " pod="openstack/dnsmasq-dns-6bb4fc677f-477j6"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.517955 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-477j6\" (UID: \"711130ba-065d-49aa-92cd-f687292f1674\") " pod="openstack/dnsmasq-dns-6bb4fc677f-477j6"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.517983 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-477j6\" (UID: \"711130ba-065d-49aa-92cd-f687292f1674\") " pod="openstack/dnsmasq-dns-6bb4fc677f-477j6"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.518011 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xz55\" (UniqueName: \"kubernetes.io/projected/711130ba-065d-49aa-92cd-f687292f1674-kube-api-access-9xz55\") pod \"dnsmasq-dns-6bb4fc677f-477j6\" (UID: \"711130ba-065d-49aa-92cd-f687292f1674\") " pod="openstack/dnsmasq-dns-6bb4fc677f-477j6"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.518057 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-477j6\" (UID: \"711130ba-065d-49aa-92cd-f687292f1674\") " pod="openstack/dnsmasq-dns-6bb4fc677f-477j6"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.518087 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-477j6\" (UID: \"711130ba-065d-49aa-92cd-f687292f1674\") " pod="openstack/dnsmasq-dns-6bb4fc677f-477j6"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.622676 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f0a0921-723c-4964-86e6-f69c2c44ca41-config-data\") pod \"cinder-api-0\" (UID: \"8f0a0921-723c-4964-86e6-f69c2c44ca41\") " pod="openstack/cinder-api-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.622942 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f0a0921-723c-4964-86e6-f69c2c44ca41-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8f0a0921-723c-4964-86e6-f69c2c44ca41\") " pod="openstack/cinder-api-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.622967 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-477j6\" (UID: \"711130ba-065d-49aa-92cd-f687292f1674\") " pod="openstack/dnsmasq-dns-6bb4fc677f-477j6"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.622982 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f0a0921-723c-4964-86e6-f69c2c44ca41-logs\") pod \"cinder-api-0\" (UID: \"8f0a0921-723c-4964-86e6-f69c2c44ca41\") " pod="openstack/cinder-api-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.623003 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-477j6\" (UID: \"711130ba-065d-49aa-92cd-f687292f1674\") " pod="openstack/dnsmasq-dns-6bb4fc677f-477j6"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.623033 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xz55\" (UniqueName: \"kubernetes.io/projected/711130ba-065d-49aa-92cd-f687292f1674-kube-api-access-9xz55\") pod \"dnsmasq-dns-6bb4fc677f-477j6\" (UID: \"711130ba-065d-49aa-92cd-f687292f1674\") " pod="openstack/dnsmasq-dns-6bb4fc677f-477j6"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.623051 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f0a0921-723c-4964-86e6-f69c2c44ca41-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8f0a0921-723c-4964-86e6-f69c2c44ca41\") " pod="openstack/cinder-api-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.623095 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-477j6\" (UID: \"711130ba-065d-49aa-92cd-f687292f1674\") " pod="openstack/dnsmasq-dns-6bb4fc677f-477j6"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.623163 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-477j6\" (UID: \"711130ba-065d-49aa-92cd-f687292f1674\") " pod="openstack/dnsmasq-dns-6bb4fc677f-477j6"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.623207 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f0a0921-723c-4964-86e6-f69c2c44ca41-config-data-custom\") pod \"cinder-api-0\" (UID: \"8f0a0921-723c-4964-86e6-f69c2c44ca41\") " pod="openstack/cinder-api-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.623228 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7rrv\" (UniqueName: \"kubernetes.io/projected/8f0a0921-723c-4964-86e6-f69c2c44ca41-kube-api-access-h7rrv\") pod \"cinder-api-0\" (UID: \"8f0a0921-723c-4964-86e6-f69c2c44ca41\") " pod="openstack/cinder-api-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.623263 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f0a0921-723c-4964-86e6-f69c2c44ca41-scripts\") pod \"cinder-api-0\" (UID: \"8f0a0921-723c-4964-86e6-f69c2c44ca41\") " pod="openstack/cinder-api-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.623296 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-config\") pod \"dnsmasq-dns-6bb4fc677f-477j6\" (UID: \"711130ba-065d-49aa-92cd-f687292f1674\") " pod="openstack/dnsmasq-dns-6bb4fc677f-477j6"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.624460 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-config\") pod \"dnsmasq-dns-6bb4fc677f-477j6\" (UID: \"711130ba-065d-49aa-92cd-f687292f1674\") " pod="openstack/dnsmasq-dns-6bb4fc677f-477j6"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.624534 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-477j6\" (UID: \"711130ba-065d-49aa-92cd-f687292f1674\") " pod="openstack/dnsmasq-dns-6bb4fc677f-477j6"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.625337 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-477j6\" (UID: \"711130ba-065d-49aa-92cd-f687292f1674\") " pod="openstack/dnsmasq-dns-6bb4fc677f-477j6"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.625366 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-477j6\" (UID: \"711130ba-065d-49aa-92cd-f687292f1674\") " pod="openstack/dnsmasq-dns-6bb4fc677f-477j6"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.625872 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-477j6\" (UID: \"711130ba-065d-49aa-92cd-f687292f1674\") " pod="openstack/dnsmasq-dns-6bb4fc677f-477j6"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.649739 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xz55\" (UniqueName: \"kubernetes.io/projected/711130ba-065d-49aa-92cd-f687292f1674-kube-api-access-9xz55\") pod \"dnsmasq-dns-6bb4fc677f-477j6\" (UID: \"711130ba-065d-49aa-92cd-f687292f1674\") " pod="openstack/dnsmasq-dns-6bb4fc677f-477j6"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.724995 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f0a0921-723c-4964-86e6-f69c2c44ca41-config-data-custom\") pod \"cinder-api-0\" (UID: \"8f0a0921-723c-4964-86e6-f69c2c44ca41\") " pod="openstack/cinder-api-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.725050 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7rrv\" (UniqueName: \"kubernetes.io/projected/8f0a0921-723c-4964-86e6-f69c2c44ca41-kube-api-access-h7rrv\") pod \"cinder-api-0\" (UID: \"8f0a0921-723c-4964-86e6-f69c2c44ca41\") " pod="openstack/cinder-api-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.725114 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f0a0921-723c-4964-86e6-f69c2c44ca41-scripts\") pod \"cinder-api-0\" (UID: \"8f0a0921-723c-4964-86e6-f69c2c44ca41\") " pod="openstack/cinder-api-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.725163 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f0a0921-723c-4964-86e6-f69c2c44ca41-config-data\") pod \"cinder-api-0\" (UID: \"8f0a0921-723c-4964-86e6-f69c2c44ca41\") " pod="openstack/cinder-api-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.725196 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f0a0921-723c-4964-86e6-f69c2c44ca41-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8f0a0921-723c-4964-86e6-f69c2c44ca41\") " pod="openstack/cinder-api-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.725210 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f0a0921-723c-4964-86e6-f69c2c44ca41-logs\") pod \"cinder-api-0\" (UID: \"8f0a0921-723c-4964-86e6-f69c2c44ca41\") " pod="openstack/cinder-api-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.725247 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f0a0921-723c-4964-86e6-f69c2c44ca41-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8f0a0921-723c-4964-86e6-f69c2c44ca41\") " pod="openstack/cinder-api-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.732215 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f0a0921-723c-4964-86e6-f69c2c44ca41-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8f0a0921-723c-4964-86e6-f69c2c44ca41\") " pod="openstack/cinder-api-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.733471 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f0a0921-723c-4964-86e6-f69c2c44ca41-logs\") pod \"cinder-api-0\" (UID: \"8f0a0921-723c-4964-86e6-f69c2c44ca41\") " pod="openstack/cinder-api-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.741233 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f0a0921-723c-4964-86e6-f69c2c44ca41-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8f0a0921-723c-4964-86e6-f69c2c44ca41\") " pod="openstack/cinder-api-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.741672 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f0a0921-723c-4964-86e6-f69c2c44ca41-config-data-custom\") pod \"cinder-api-0\" (UID: \"8f0a0921-723c-4964-86e6-f69c2c44ca41\") " pod="openstack/cinder-api-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.742193 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f0a0921-723c-4964-86e6-f69c2c44ca41-config-data\") pod \"cinder-api-0\" (UID: \"8f0a0921-723c-4964-86e6-f69c2c44ca41\") " pod="openstack/cinder-api-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.759436 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f0a0921-723c-4964-86e6-f69c2c44ca41-scripts\") pod \"cinder-api-0\" (UID: \"8f0a0921-723c-4964-86e6-f69c2c44ca41\") " pod="openstack/cinder-api-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.802328 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7rrv\" (UniqueName: \"kubernetes.io/projected/8f0a0921-723c-4964-86e6-f69c2c44ca41-kube-api-access-h7rrv\") pod \"cinder-api-0\" (UID: \"8f0a0921-723c-4964-86e6-f69c2c44ca41\") " pod="openstack/cinder-api-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.825148 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"cf474a1d-5a57-4669-8ced-7d0c8decbd70","Type":"ContainerStarted","Data":"fb81662267ce1cf4e057916fd1de3178ec3f97b00be7448397efc251f609528b"}
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.826663 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.827987 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="cf474a1d-5a57-4669-8ced-7d0c8decbd70" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.190:9322/\": dial tcp 10.217.0.190:9322: connect: connection refused"
Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.855257 4886 generic.go:334] "Generic (PLEG): container finished" podID="777c1eca-e276-4de3-9523-ee72e0891b05"
containerID="7a836cec3b9c03df1808d537afe3bf262731f5c48a27049eae8803aa4347762d" exitCode=0 Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.855319 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-4bfks" event={"ID":"777c1eca-e276-4de3-9523-ee72e0891b05","Type":"ContainerDied","Data":"7a836cec3b9c03df1808d537afe3bf262731f5c48a27049eae8803aa4347762d"} Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.855745 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=12.855735176 podStartE2EDuration="12.855735176s" podCreationTimestamp="2026-03-14 08:50:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:50:31.854741148 +0000 UTC m=+1367.103192785" watchObservedRunningTime="2026-03-14 08:50:31.855735176 +0000 UTC m=+1367.104186803" Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.893518 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8557ccd47-58ztp" event={"ID":"42ce024b-4e1d-4f45-9faa-5f637e5a8466","Type":"ContainerStarted","Data":"954dffc535fa8adad7de9fdaf8b3727b53e7f050b6760195928588c50f477a55"} Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.929730 4886 generic.go:334] "Generic (PLEG): container finished" podID="a9eb9137-a021-4ea6-a4a4-871cf81af732" containerID="9522fad26482fa71d5234f7635d40931c03b9bdf32d02478e001e4b872eac5ce" exitCode=0 Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.929778 4886 generic.go:334] "Generic (PLEG): container finished" podID="a9eb9137-a021-4ea6-a4a4-871cf81af732" containerID="367f53ab04c0e461f4791a68acf7e0404c669ac72803a129efad6ed8cc70f869" exitCode=2 Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.929823 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a9eb9137-a021-4ea6-a4a4-871cf81af732","Type":"ContainerDied","Data":"9522fad26482fa71d5234f7635d40931c03b9bdf32d02478e001e4b872eac5ce"} Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.929850 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9eb9137-a021-4ea6-a4a4-871cf81af732","Type":"ContainerDied","Data":"367f53ab04c0e461f4791a68acf7e0404c669ac72803a129efad6ed8cc70f869"} Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.936256 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-8557ccd47-58ztp" podStartSLOduration=13.243285757 podStartE2EDuration="16.936238851s" podCreationTimestamp="2026-03-14 08:50:15 +0000 UTC" firstStartedPulling="2026-03-14 08:50:18.842229302 +0000 UTC m=+1354.090680929" lastFinishedPulling="2026-03-14 08:50:22.535182386 +0000 UTC m=+1357.783634023" observedRunningTime="2026-03-14 08:50:31.931232909 +0000 UTC m=+1367.179684556" watchObservedRunningTime="2026-03-14 08:50:31.936238851 +0000 UTC m=+1367.184690488" Mar 14 08:50:31 crc kubenswrapper[4886]: I0314 08:50:31.977064 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5c6c468c99-bvnb4" event={"ID":"1e54e03a-ce8b-4f7d-a664-cc11daa6c786","Type":"ContainerStarted","Data":"1bbdf1722fea1dfaef2ac5e6318ccb7639dae43c488b0857ddd5a8ddbf9807c9"} Mar 14 08:50:32 crc kubenswrapper[4886]: I0314 08:50:31.998748 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-5c6c468c99-bvnb4"] Mar 14 08:50:32 crc kubenswrapper[4886]: I0314 08:50:32.029237 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-756657585d-2x84b" event={"ID":"46e268aa-326d-42d7-936d-3e4d120dfeb6","Type":"ContainerStarted","Data":"a61a5adb089a50a78ca6936e3ebd05166bbf036d3542eb8076f6f0c3f5248321"} Mar 14 08:50:32 crc kubenswrapper[4886]: I0314 08:50:32.062574 4886 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/barbican-worker-5c6c468c99-bvnb4" podStartSLOduration=6.545236419 podStartE2EDuration="18.062559688s" podCreationTimestamp="2026-03-14 08:50:14 +0000 UTC" firstStartedPulling="2026-03-14 08:50:16.899812095 +0000 UTC m=+1352.148263732" lastFinishedPulling="2026-03-14 08:50:28.417135374 +0000 UTC m=+1363.665587001" observedRunningTime="2026-03-14 08:50:32.02919456 +0000 UTC m=+1367.277646197" watchObservedRunningTime="2026-03-14 08:50:32.062559688 +0000 UTC m=+1367.311011325" Mar 14 08:50:32 crc kubenswrapper[4886]: I0314 08:50:32.073656 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-756657585d-2x84b" podStartSLOduration=13.161738774 podStartE2EDuration="17.073635922s" podCreationTimestamp="2026-03-14 08:50:15 +0000 UTC" firstStartedPulling="2026-03-14 08:50:18.570539653 +0000 UTC m=+1353.818991290" lastFinishedPulling="2026-03-14 08:50:22.482436801 +0000 UTC m=+1357.730888438" observedRunningTime="2026-03-14 08:50:32.059619314 +0000 UTC m=+1367.308070951" watchObservedRunningTime="2026-03-14 08:50:32.073635922 +0000 UTC m=+1367.322087559" Mar 14 08:50:32 crc kubenswrapper[4886]: I0314 08:50:32.107739 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-d4575995d-lfmv5"] Mar 14 08:50:32 crc kubenswrapper[4886]: I0314 08:50:32.333504 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-477j6" Mar 14 08:50:32 crc kubenswrapper[4886]: I0314 08:50:32.347107 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 14 08:50:32 crc kubenswrapper[4886]: I0314 08:50:32.379716 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-4bfks" Mar 14 08:50:32 crc kubenswrapper[4886]: I0314 08:50:32.449294 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5694fd5cb9-r8nzz"] Mar 14 08:50:32 crc kubenswrapper[4886]: I0314 08:50:32.452916 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-config\") pod \"777c1eca-e276-4de3-9523-ee72e0891b05\" (UID: \"777c1eca-e276-4de3-9523-ee72e0891b05\") " Mar 14 08:50:32 crc kubenswrapper[4886]: I0314 08:50:32.452967 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-dns-svc\") pod \"777c1eca-e276-4de3-9523-ee72e0891b05\" (UID: \"777c1eca-e276-4de3-9523-ee72e0891b05\") " Mar 14 08:50:32 crc kubenswrapper[4886]: I0314 08:50:32.453034 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x84v\" (UniqueName: \"kubernetes.io/projected/777c1eca-e276-4de3-9523-ee72e0891b05-kube-api-access-2x84v\") pod \"777c1eca-e276-4de3-9523-ee72e0891b05\" (UID: \"777c1eca-e276-4de3-9523-ee72e0891b05\") " Mar 14 08:50:32 crc kubenswrapper[4886]: I0314 08:50:32.453087 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-ovsdbserver-nb\") pod \"777c1eca-e276-4de3-9523-ee72e0891b05\" (UID: \"777c1eca-e276-4de3-9523-ee72e0891b05\") " Mar 14 08:50:32 crc kubenswrapper[4886]: I0314 08:50:32.453307 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-dns-swift-storage-0\") pod \"777c1eca-e276-4de3-9523-ee72e0891b05\" (UID: \"777c1eca-e276-4de3-9523-ee72e0891b05\") " Mar 
14 08:50:32 crc kubenswrapper[4886]: I0314 08:50:32.453383 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-ovsdbserver-sb\") pod \"777c1eca-e276-4de3-9523-ee72e0891b05\" (UID: \"777c1eca-e276-4de3-9523-ee72e0891b05\") " Mar 14 08:50:32 crc kubenswrapper[4886]: I0314 08:50:32.500529 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/777c1eca-e276-4de3-9523-ee72e0891b05-kube-api-access-2x84v" (OuterVolumeSpecName: "kube-api-access-2x84v") pod "777c1eca-e276-4de3-9523-ee72e0891b05" (UID: "777c1eca-e276-4de3-9523-ee72e0891b05"). InnerVolumeSpecName "kube-api-access-2x84v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:50:32 crc kubenswrapper[4886]: I0314 08:50:32.555048 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 08:50:32 crc kubenswrapper[4886]: I0314 08:50:32.556259 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x84v\" (UniqueName: \"kubernetes.io/projected/777c1eca-e276-4de3-9523-ee72e0891b05-kube-api-access-2x84v\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:32 crc kubenswrapper[4886]: I0314 08:50:32.564378 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "777c1eca-e276-4de3-9523-ee72e0891b05" (UID: "777c1eca-e276-4de3-9523-ee72e0891b05"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:50:32 crc kubenswrapper[4886]: I0314 08:50:32.589809 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "777c1eca-e276-4de3-9523-ee72e0891b05" (UID: "777c1eca-e276-4de3-9523-ee72e0891b05"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:50:32 crc kubenswrapper[4886]: I0314 08:50:32.590911 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "777c1eca-e276-4de3-9523-ee72e0891b05" (UID: "777c1eca-e276-4de3-9523-ee72e0891b05"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:50:32 crc kubenswrapper[4886]: I0314 08:50:32.592774 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-config" (OuterVolumeSpecName: "config") pod "777c1eca-e276-4de3-9523-ee72e0891b05" (UID: "777c1eca-e276-4de3-9523-ee72e0891b05"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:50:32 crc kubenswrapper[4886]: W0314 08:50:32.642787 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a745438_cb17_4626_96ed_51c7de75a976.slice/crio-34a8e812ae52f0153cc28bb93f8075bc5850e81a6a08c0f6de662f2c45e13fbf WatchSource:0}: Error finding container 34a8e812ae52f0153cc28bb93f8075bc5850e81a6a08c0f6de662f2c45e13fbf: Status 404 returned error can't find the container with id 34a8e812ae52f0153cc28bb93f8075bc5850e81a6a08c0f6de662f2c45e13fbf Mar 14 08:50:32 crc kubenswrapper[4886]: I0314 08:50:32.660445 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:32 crc kubenswrapper[4886]: I0314 08:50:32.660476 4886 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:32 crc kubenswrapper[4886]: I0314 08:50:32.660486 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:32 crc kubenswrapper[4886]: I0314 08:50:32.660497 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:32 crc kubenswrapper[4886]: I0314 08:50:32.730717 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "777c1eca-e276-4de3-9523-ee72e0891b05" (UID: "777c1eca-e276-4de3-9523-ee72e0891b05"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:50:32 crc kubenswrapper[4886]: I0314 08:50:32.766463 4886 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/777c1eca-e276-4de3-9523-ee72e0891b05-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:32 crc kubenswrapper[4886]: I0314 08:50:32.969238 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.076095 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a9eb9137-a021-4ea6-a4a4-871cf81af732-sg-core-conf-yaml\") pod \"a9eb9137-a021-4ea6-a4a4-871cf81af732\" (UID: \"a9eb9137-a021-4ea6-a4a4-871cf81af732\") " Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.076503 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9eb9137-a021-4ea6-a4a4-871cf81af732-log-httpd\") pod \"a9eb9137-a021-4ea6-a4a4-871cf81af732\" (UID: \"a9eb9137-a021-4ea6-a4a4-871cf81af732\") " Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.076526 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9eb9137-a021-4ea6-a4a4-871cf81af732-run-httpd\") pod \"a9eb9137-a021-4ea6-a4a4-871cf81af732\" (UID: \"a9eb9137-a021-4ea6-a4a4-871cf81af732\") " Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.076551 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9eb9137-a021-4ea6-a4a4-871cf81af732-combined-ca-bundle\") pod \"a9eb9137-a021-4ea6-a4a4-871cf81af732\" (UID: \"a9eb9137-a021-4ea6-a4a4-871cf81af732\") " Mar 14 08:50:33 crc kubenswrapper[4886]: 
I0314 08:50:33.076588 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj2b4\" (UniqueName: \"kubernetes.io/projected/a9eb9137-a021-4ea6-a4a4-871cf81af732-kube-api-access-nj2b4\") pod \"a9eb9137-a021-4ea6-a4a4-871cf81af732\" (UID: \"a9eb9137-a021-4ea6-a4a4-871cf81af732\") " Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.076667 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9eb9137-a021-4ea6-a4a4-871cf81af732-config-data\") pod \"a9eb9137-a021-4ea6-a4a4-871cf81af732\" (UID: \"a9eb9137-a021-4ea6-a4a4-871cf81af732\") " Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.076715 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9eb9137-a021-4ea6-a4a4-871cf81af732-scripts\") pod \"a9eb9137-a021-4ea6-a4a4-871cf81af732\" (UID: \"a9eb9137-a021-4ea6-a4a4-871cf81af732\") " Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.080484 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9eb9137-a021-4ea6-a4a4-871cf81af732-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a9eb9137-a021-4ea6-a4a4-871cf81af732" (UID: "a9eb9137-a021-4ea6-a4a4-871cf81af732"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.084796 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9eb9137-a021-4ea6-a4a4-871cf81af732-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a9eb9137-a021-4ea6-a4a4-871cf81af732" (UID: "a9eb9137-a021-4ea6-a4a4-871cf81af732"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.085057 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9eb9137-a021-4ea6-a4a4-871cf81af732-scripts" (OuterVolumeSpecName: "scripts") pod "a9eb9137-a021-4ea6-a4a4-871cf81af732" (UID: "a9eb9137-a021-4ea6-a4a4-871cf81af732"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.097726 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9eb9137-a021-4ea6-a4a4-871cf81af732-kube-api-access-nj2b4" (OuterVolumeSpecName: "kube-api-access-nj2b4") pod "a9eb9137-a021-4ea6-a4a4-871cf81af732" (UID: "a9eb9137-a021-4ea6-a4a4-871cf81af732"). InnerVolumeSpecName "kube-api-access-nj2b4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.138683 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-477j6"] Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.156568 4886 generic.go:334] "Generic (PLEG): container finished" podID="a9eb9137-a021-4ea6-a4a4-871cf81af732" containerID="c8926f738347aa1d97166c7961400e4ac904b525b9b82069b40c04bfe819735d" exitCode=0 Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.156724 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9eb9137-a021-4ea6-a4a4-871cf81af732","Type":"ContainerDied","Data":"c8926f738347aa1d97166c7961400e4ac904b525b9b82069b40c04bfe819735d"} Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.156810 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a9eb9137-a021-4ea6-a4a4-871cf81af732","Type":"ContainerDied","Data":"e5049349eabc89c825bd59285fb1f4d53dce6f041339804f18c3a3d3c39ca606"} Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 
08:50:33.156898 4886 scope.go:117] "RemoveContainer" containerID="9522fad26482fa71d5234f7635d40931c03b9bdf32d02478e001e4b872eac5ce" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.157169 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.179393 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nj2b4\" (UniqueName: \"kubernetes.io/projected/a9eb9137-a021-4ea6-a4a4-871cf81af732-kube-api-access-nj2b4\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.179551 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9eb9137-a021-4ea6-a4a4-871cf81af732-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.179627 4886 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9eb9137-a021-4ea6-a4a4-871cf81af732-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.179685 4886 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a9eb9137-a021-4ea6-a4a4-871cf81af732-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.181631 4886 generic.go:334] "Generic (PLEG): container finished" podID="9f9329a3-9a35-49a2-86ce-435b98d280f3" containerID="34737c15ff4a6a3340677a2fd809063ddbd01fc9f3a607e718c5b2480379f86d" exitCode=0 Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.181750 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c5789dc8f-4vv5f" event={"ID":"9f9329a3-9a35-49a2-86ce-435b98d280f3","Type":"ContainerDied","Data":"34737c15ff4a6a3340677a2fd809063ddbd01fc9f3a607e718c5b2480379f86d"} Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.199293 4886 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9eb9137-a021-4ea6-a4a4-871cf81af732-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9eb9137-a021-4ea6-a4a4-871cf81af732" (UID: "a9eb9137-a021-4ea6-a4a4-871cf81af732"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.199337 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7496c4d65c-dg8pn" event={"ID":"6624fe29-e15e-4474-a2d9-37489c04e1b6","Type":"ContainerDied","Data":"135ef659ef7adb5ba1080cffc593d8a86b8615ce3f878ff99f49cfd0a0acff14"} Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.199313 4886 generic.go:334] "Generic (PLEG): container finished" podID="6624fe29-e15e-4474-a2d9-37489c04e1b6" containerID="135ef659ef7adb5ba1080cffc593d8a86b8615ce3f878ff99f49cfd0a0acff14" exitCode=137 Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.208564 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.216284 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7a745438-cb17-4626-96ed-51c7de75a976","Type":"ContainerStarted","Data":"34a8e812ae52f0153cc28bb93f8075bc5850e81a6a08c0f6de662f2c45e13fbf"} Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.252327 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9eb9137-a021-4ea6-a4a4-871cf81af732-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a9eb9137-a021-4ea6-a4a4-871cf81af732" (UID: "a9eb9137-a021-4ea6-a4a4-871cf81af732"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.254513 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-d4575995d-lfmv5" event={"ID":"7a298745-dd74-4ed3-b21b-648f2adb47dc","Type":"ContainerStarted","Data":"eb6ed3dbcff207337c0705046c6c84b1904c5d533460cdcfc2aa5b1e1cc12014"} Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.266297 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7c5789dc8f-4vv5f" podUID="9f9329a3-9a35-49a2-86ce-435b98d280f3" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.175:9696/\": dial tcp 10.217.0.175:9696: connect: connection refused" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.266794 4886 scope.go:117] "RemoveContainer" containerID="367f53ab04c0e461f4791a68acf7e0404c669ac72803a129efad6ed8cc70f869" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.281621 4886 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a9eb9137-a021-4ea6-a4a4-871cf81af732-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.281648 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9eb9137-a021-4ea6-a4a4-871cf81af732-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.282286 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5694fd5cb9-r8nzz" event={"ID":"5055b057-f745-43b5-8bd2-937ed8b29743","Type":"ContainerStarted","Data":"d9fbfb939f20ab3e6375233a3262f230c6fd16369a52c9da11c2399e30bfe115"} Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.308457 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-4bfks" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.310570 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-4bfks" event={"ID":"777c1eca-e276-4de3-9523-ee72e0891b05","Type":"ContainerDied","Data":"9bced53d6587b1ff9321d634cde46142add0bf3c3b4269e123eeb54c4971c11a"} Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.319034 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-d4575995d-lfmv5" podStartSLOduration=7.796478935 podStartE2EDuration="19.319008288s" podCreationTimestamp="2026-03-14 08:50:14 +0000 UTC" firstStartedPulling="2026-03-14 08:50:16.910049309 +0000 UTC m=+1352.158500946" lastFinishedPulling="2026-03-14 08:50:28.432578642 +0000 UTC m=+1363.681030299" observedRunningTime="2026-03-14 08:50:33.298689771 +0000 UTC m=+1368.547141408" watchObservedRunningTime="2026-03-14 08:50:33.319008288 +0000 UTC m=+1368.567459925" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.365113 4886 scope.go:117] "RemoveContainer" containerID="c8926f738347aa1d97166c7961400e4ac904b525b9b82069b40c04bfe819735d" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.452457 4886 scope.go:117] "RemoveContainer" containerID="9522fad26482fa71d5234f7635d40931c03b9bdf32d02478e001e4b872eac5ce" Mar 14 08:50:33 crc kubenswrapper[4886]: E0314 08:50:33.453304 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9522fad26482fa71d5234f7635d40931c03b9bdf32d02478e001e4b872eac5ce\": container with ID starting with 9522fad26482fa71d5234f7635d40931c03b9bdf32d02478e001e4b872eac5ce not found: ID does not exist" containerID="9522fad26482fa71d5234f7635d40931c03b9bdf32d02478e001e4b872eac5ce" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.453338 4886 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9522fad26482fa71d5234f7635d40931c03b9bdf32d02478e001e4b872eac5ce"} err="failed to get container status \"9522fad26482fa71d5234f7635d40931c03b9bdf32d02478e001e4b872eac5ce\": rpc error: code = NotFound desc = could not find container \"9522fad26482fa71d5234f7635d40931c03b9bdf32d02478e001e4b872eac5ce\": container with ID starting with 9522fad26482fa71d5234f7635d40931c03b9bdf32d02478e001e4b872eac5ce not found: ID does not exist" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.453363 4886 scope.go:117] "RemoveContainer" containerID="367f53ab04c0e461f4791a68acf7e0404c669ac72803a129efad6ed8cc70f869" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.457591 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-4bfks"] Mar 14 08:50:33 crc kubenswrapper[4886]: E0314 08:50:33.457728 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"367f53ab04c0e461f4791a68acf7e0404c669ac72803a129efad6ed8cc70f869\": container with ID starting with 367f53ab04c0e461f4791a68acf7e0404c669ac72803a129efad6ed8cc70f869 not found: ID does not exist" containerID="367f53ab04c0e461f4791a68acf7e0404c669ac72803a129efad6ed8cc70f869" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.457772 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"367f53ab04c0e461f4791a68acf7e0404c669ac72803a129efad6ed8cc70f869"} err="failed to get container status \"367f53ab04c0e461f4791a68acf7e0404c669ac72803a129efad6ed8cc70f869\": rpc error: code = NotFound desc = could not find container \"367f53ab04c0e461f4791a68acf7e0404c669ac72803a129efad6ed8cc70f869\": container with ID starting with 367f53ab04c0e461f4791a68acf7e0404c669ac72803a129efad6ed8cc70f869 not found: ID does not exist" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.457804 4886 scope.go:117] "RemoveContainer" 
containerID="c8926f738347aa1d97166c7961400e4ac904b525b9b82069b40c04bfe819735d" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.458035 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-4bfks"] Mar 14 08:50:33 crc kubenswrapper[4886]: E0314 08:50:33.474275 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8926f738347aa1d97166c7961400e4ac904b525b9b82069b40c04bfe819735d\": container with ID starting with c8926f738347aa1d97166c7961400e4ac904b525b9b82069b40c04bfe819735d not found: ID does not exist" containerID="c8926f738347aa1d97166c7961400e4ac904b525b9b82069b40c04bfe819735d" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.474333 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8926f738347aa1d97166c7961400e4ac904b525b9b82069b40c04bfe819735d"} err="failed to get container status \"c8926f738347aa1d97166c7961400e4ac904b525b9b82069b40c04bfe819735d\": rpc error: code = NotFound desc = could not find container \"c8926f738347aa1d97166c7961400e4ac904b525b9b82069b40c04bfe819735d\": container with ID starting with c8926f738347aa1d97166c7961400e4ac904b525b9b82069b40c04bfe819735d not found: ID does not exist" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.474366 4886 scope.go:117] "RemoveContainer" containerID="7a836cec3b9c03df1808d537afe3bf262731f5c48a27049eae8803aa4347762d" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.576392 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6fd5f87754-lf26d" Mar 14 08:50:33 crc kubenswrapper[4886]: I0314 08:50:33.773571 4886 scope.go:117] "RemoveContainer" containerID="b72ccf6553947605f5774dcf6cf52a379782aa9a503fecbf75403c346a256df5" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.081367 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/a9eb9137-a021-4ea6-a4a4-871cf81af732-config-data" (OuterVolumeSpecName: "config-data") pod "a9eb9137-a021-4ea6-a4a4-871cf81af732" (UID: "a9eb9137-a021-4ea6-a4a4-871cf81af732"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.149789 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9eb9137-a021-4ea6-a4a4-871cf81af732-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.500158 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7496c4d65c-dg8pn" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.566310 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6624fe29-e15e-4474-a2d9-37489c04e1b6-config-data\") pod \"6624fe29-e15e-4474-a2d9-37489c04e1b6\" (UID: \"6624fe29-e15e-4474-a2d9-37489c04e1b6\") " Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.566415 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6624fe29-e15e-4474-a2d9-37489c04e1b6-scripts\") pod \"6624fe29-e15e-4474-a2d9-37489c04e1b6\" (UID: \"6624fe29-e15e-4474-a2d9-37489c04e1b6\") " Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.566552 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6624fe29-e15e-4474-a2d9-37489c04e1b6-horizon-secret-key\") pod \"6624fe29-e15e-4474-a2d9-37489c04e1b6\" (UID: \"6624fe29-e15e-4474-a2d9-37489c04e1b6\") " Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.566583 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85kdb\" (UniqueName: 
\"kubernetes.io/projected/6624fe29-e15e-4474-a2d9-37489c04e1b6-kube-api-access-85kdb\") pod \"6624fe29-e15e-4474-a2d9-37489c04e1b6\" (UID: \"6624fe29-e15e-4474-a2d9-37489c04e1b6\") " Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.566718 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6624fe29-e15e-4474-a2d9-37489c04e1b6-logs\") pod \"6624fe29-e15e-4474-a2d9-37489c04e1b6\" (UID: \"6624fe29-e15e-4474-a2d9-37489c04e1b6\") " Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.575281 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6624fe29-e15e-4474-a2d9-37489c04e1b6-logs" (OuterVolumeSpecName: "logs") pod "6624fe29-e15e-4474-a2d9-37489c04e1b6" (UID: "6624fe29-e15e-4474-a2d9-37489c04e1b6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.599515 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6624fe29-e15e-4474-a2d9-37489c04e1b6-kube-api-access-85kdb" (OuterVolumeSpecName: "kube-api-access-85kdb") pod "6624fe29-e15e-4474-a2d9-37489c04e1b6" (UID: "6624fe29-e15e-4474-a2d9-37489c04e1b6"). InnerVolumeSpecName "kube-api-access-85kdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.608468 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6624fe29-e15e-4474-a2d9-37489c04e1b6-scripts" (OuterVolumeSpecName: "scripts") pod "6624fe29-e15e-4474-a2d9-37489c04e1b6" (UID: "6624fe29-e15e-4474-a2d9-37489c04e1b6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.619500 4886 generic.go:334] "Generic (PLEG): container finished" podID="6624fe29-e15e-4474-a2d9-37489c04e1b6" containerID="5ac68317c315960ec899ae3f37574d3591926ff9d8914ecc920ad7a5710d5e63" exitCode=137 Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.619897 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7496c4d65c-dg8pn" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.619990 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7496c4d65c-dg8pn" event={"ID":"6624fe29-e15e-4474-a2d9-37489c04e1b6","Type":"ContainerDied","Data":"5ac68317c315960ec899ae3f37574d3591926ff9d8914ecc920ad7a5710d5e63"} Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.620022 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7496c4d65c-dg8pn" event={"ID":"6624fe29-e15e-4474-a2d9-37489c04e1b6","Type":"ContainerDied","Data":"a11d66d46dc65e4e38510e21078f86e2daf413f71c76931aa4b7cc33c8704854"} Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.620039 4886 scope.go:117] "RemoveContainer" containerID="5ac68317c315960ec899ae3f37574d3591926ff9d8914ecc920ad7a5710d5e63" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.631032 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6624fe29-e15e-4474-a2d9-37489c04e1b6-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "6624fe29-e15e-4474-a2d9-37489c04e1b6" (UID: "6624fe29-e15e-4474-a2d9-37489c04e1b6"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.650758 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8f0a0921-723c-4964-86e6-f69c2c44ca41","Type":"ContainerStarted","Data":"064f31c3ef8894655d1050ab693a30bce4736e21dbe7a04ca8baded92d6c7ad6"} Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.662513 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-477j6" event={"ID":"711130ba-065d-49aa-92cd-f687292f1674","Type":"ContainerStarted","Data":"3f5def5dd1a0b6f20ce330368dddd14621c18100a543e8f7a925fed970b85dee"} Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.664874 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6624fe29-e15e-4474-a2d9-37489c04e1b6-config-data" (OuterVolumeSpecName: "config-data") pod "6624fe29-e15e-4474-a2d9-37489c04e1b6" (UID: "6624fe29-e15e-4474-a2d9-37489c04e1b6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.677886 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-d4575995d-lfmv5" podUID="7a298745-dd74-4ed3-b21b-648f2adb47dc" containerName="barbican-keystone-listener-log" containerID="cri-o://a774d7ecff810007d347300dd5019b4603ff3a9eff33e055d7e88e41b6cbfc9e" gracePeriod=30 Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.678138 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5694fd5cb9-r8nzz" event={"ID":"5055b057-f745-43b5-8bd2-937ed8b29743","Type":"ContainerStarted","Data":"7e0082d4623a8cc58cf961d806586d54876e5ed22672834f3517f0a0542d7c91"} Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.678255 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-5c6c468c99-bvnb4" podUID="1e54e03a-ce8b-4f7d-a664-cc11daa6c786" containerName="barbican-worker-log" containerID="cri-o://e7961e21a3760933ecc66f9afc836f2931d46adf93c83f5459e2aedfe0f1cd2a" gracePeriod=30 Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.678304 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-5c6c468c99-bvnb4" podUID="1e54e03a-ce8b-4f7d-a664-cc11daa6c786" containerName="barbican-worker" containerID="cri-o://1bbdf1722fea1dfaef2ac5e6318ccb7639dae43c488b0857ddd5a8ddbf9807c9" gracePeriod=30 Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.678215 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-d4575995d-lfmv5" podUID="7a298745-dd74-4ed3-b21b-648f2adb47dc" containerName="barbican-keystone-listener" containerID="cri-o://eb6ed3dbcff207337c0705046c6c84b1904c5d533460cdcfc2aa5b1e1cc12014" gracePeriod=30 Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.692514 4886 reconciler_common.go:293] "Volume detached for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6624fe29-e15e-4474-a2d9-37489c04e1b6-logs\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.693377 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6624fe29-e15e-4474-a2d9-37489c04e1b6-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.693393 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6624fe29-e15e-4474-a2d9-37489c04e1b6-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.693413 4886 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6624fe29-e15e-4474-a2d9-37489c04e1b6-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.693426 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85kdb\" (UniqueName: \"kubernetes.io/projected/6624fe29-e15e-4474-a2d9-37489c04e1b6-kube-api-access-85kdb\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.697341 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.727292 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.823810 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:50:34 crc kubenswrapper[4886]: E0314 08:50:34.826498 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6624fe29-e15e-4474-a2d9-37489c04e1b6" containerName="horizon-log" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.826527 4886 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6624fe29-e15e-4474-a2d9-37489c04e1b6" containerName="horizon-log" Mar 14 08:50:34 crc kubenswrapper[4886]: E0314 08:50:34.826549 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9eb9137-a021-4ea6-a4a4-871cf81af732" containerName="proxy-httpd" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.827157 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9eb9137-a021-4ea6-a4a4-871cf81af732" containerName="proxy-httpd" Mar 14 08:50:34 crc kubenswrapper[4886]: E0314 08:50:34.827184 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6624fe29-e15e-4474-a2d9-37489c04e1b6" containerName="horizon" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.827194 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="6624fe29-e15e-4474-a2d9-37489c04e1b6" containerName="horizon" Mar 14 08:50:34 crc kubenswrapper[4886]: E0314 08:50:34.827204 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="777c1eca-e276-4de3-9523-ee72e0891b05" containerName="init" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.827210 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="777c1eca-e276-4de3-9523-ee72e0891b05" containerName="init" Mar 14 08:50:34 crc kubenswrapper[4886]: E0314 08:50:34.827218 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9eb9137-a021-4ea6-a4a4-871cf81af732" containerName="ceilometer-notification-agent" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.827223 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9eb9137-a021-4ea6-a4a4-871cf81af732" containerName="ceilometer-notification-agent" Mar 14 08:50:34 crc kubenswrapper[4886]: E0314 08:50:34.827233 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="777c1eca-e276-4de3-9523-ee72e0891b05" containerName="dnsmasq-dns" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.827240 4886 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="777c1eca-e276-4de3-9523-ee72e0891b05" containerName="dnsmasq-dns" Mar 14 08:50:34 crc kubenswrapper[4886]: E0314 08:50:34.827256 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9eb9137-a021-4ea6-a4a4-871cf81af732" containerName="sg-core" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.827262 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9eb9137-a021-4ea6-a4a4-871cf81af732" containerName="sg-core" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.827871 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="6624fe29-e15e-4474-a2d9-37489c04e1b6" containerName="horizon-log" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.827889 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="777c1eca-e276-4de3-9523-ee72e0891b05" containerName="dnsmasq-dns" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.827899 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="6624fe29-e15e-4474-a2d9-37489c04e1b6" containerName="horizon" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.827930 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9eb9137-a021-4ea6-a4a4-871cf81af732" containerName="ceilometer-notification-agent" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.827946 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9eb9137-a021-4ea6-a4a4-871cf81af732" containerName="proxy-httpd" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.827956 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9eb9137-a021-4ea6-a4a4-871cf81af732" containerName="sg-core" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.834025 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.839560 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.839765 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.856157 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.905991 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e151f3d-c603-4f97-92ba-a079b9a5ad49-config-data\") pod \"ceilometer-0\" (UID: \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\") " pod="openstack/ceilometer-0" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.906033 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e151f3d-c603-4f97-92ba-a079b9a5ad49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\") " pod="openstack/ceilometer-0" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.906105 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e151f3d-c603-4f97-92ba-a079b9a5ad49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\") " pod="openstack/ceilometer-0" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.906247 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxtdt\" (UniqueName: \"kubernetes.io/projected/3e151f3d-c603-4f97-92ba-a079b9a5ad49-kube-api-access-kxtdt\") pod \"ceilometer-0\" (UID: 
\"3e151f3d-c603-4f97-92ba-a079b9a5ad49\") " pod="openstack/ceilometer-0" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.906424 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e151f3d-c603-4f97-92ba-a079b9a5ad49-log-httpd\") pod \"ceilometer-0\" (UID: \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\") " pod="openstack/ceilometer-0" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.906612 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e151f3d-c603-4f97-92ba-a079b9a5ad49-run-httpd\") pod \"ceilometer-0\" (UID: \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\") " pod="openstack/ceilometer-0" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.906705 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e151f3d-c603-4f97-92ba-a079b9a5ad49-scripts\") pod \"ceilometer-0\" (UID: \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\") " pod="openstack/ceilometer-0" Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.990880 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7496c4d65c-dg8pn"] Mar 14 08:50:34 crc kubenswrapper[4886]: I0314 08:50:34.999497 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7496c4d65c-dg8pn"] Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.008956 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e151f3d-c603-4f97-92ba-a079b9a5ad49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\") " pod="openstack/ceilometer-0" Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.009018 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kxtdt\" (UniqueName: \"kubernetes.io/projected/3e151f3d-c603-4f97-92ba-a079b9a5ad49-kube-api-access-kxtdt\") pod \"ceilometer-0\" (UID: \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\") " pod="openstack/ceilometer-0" Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.009070 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e151f3d-c603-4f97-92ba-a079b9a5ad49-log-httpd\") pod \"ceilometer-0\" (UID: \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\") " pod="openstack/ceilometer-0" Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.009425 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e151f3d-c603-4f97-92ba-a079b9a5ad49-run-httpd\") pod \"ceilometer-0\" (UID: \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\") " pod="openstack/ceilometer-0" Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.009476 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e151f3d-c603-4f97-92ba-a079b9a5ad49-scripts\") pod \"ceilometer-0\" (UID: \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\") " pod="openstack/ceilometer-0" Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.009545 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e151f3d-c603-4f97-92ba-a079b9a5ad49-config-data\") pod \"ceilometer-0\" (UID: \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\") " pod="openstack/ceilometer-0" Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.009566 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e151f3d-c603-4f97-92ba-a079b9a5ad49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\") " pod="openstack/ceilometer-0" Mar 14 08:50:35 crc 
kubenswrapper[4886]: I0314 08:50:35.009694 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e151f3d-c603-4f97-92ba-a079b9a5ad49-log-httpd\") pod \"ceilometer-0\" (UID: \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\") " pod="openstack/ceilometer-0" Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.010392 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e151f3d-c603-4f97-92ba-a079b9a5ad49-run-httpd\") pod \"ceilometer-0\" (UID: \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\") " pod="openstack/ceilometer-0" Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.015987 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e151f3d-c603-4f97-92ba-a079b9a5ad49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\") " pod="openstack/ceilometer-0" Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.018332 4886 scope.go:117] "RemoveContainer" containerID="135ef659ef7adb5ba1080cffc593d8a86b8615ce3f878ff99f49cfd0a0acff14" Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.018803 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e151f3d-c603-4f97-92ba-a079b9a5ad49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\") " pod="openstack/ceilometer-0" Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.025594 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e151f3d-c603-4f97-92ba-a079b9a5ad49-scripts\") pod \"ceilometer-0\" (UID: \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\") " pod="openstack/ceilometer-0" Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.026340 4886 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e151f3d-c603-4f97-92ba-a079b9a5ad49-config-data\") pod \"ceilometer-0\" (UID: \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\") " pod="openstack/ceilometer-0" Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.034452 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxtdt\" (UniqueName: \"kubernetes.io/projected/3e151f3d-c603-4f97-92ba-a079b9a5ad49-kube-api-access-kxtdt\") pod \"ceilometer-0\" (UID: \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\") " pod="openstack/ceilometer-0" Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.173764 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.246469 4886 scope.go:117] "RemoveContainer" containerID="5ac68317c315960ec899ae3f37574d3591926ff9d8914ecc920ad7a5710d5e63" Mar 14 08:50:35 crc kubenswrapper[4886]: E0314 08:50:35.269572 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ac68317c315960ec899ae3f37574d3591926ff9d8914ecc920ad7a5710d5e63\": container with ID starting with 5ac68317c315960ec899ae3f37574d3591926ff9d8914ecc920ad7a5710d5e63 not found: ID does not exist" containerID="5ac68317c315960ec899ae3f37574d3591926ff9d8914ecc920ad7a5710d5e63" Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.269631 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ac68317c315960ec899ae3f37574d3591926ff9d8914ecc920ad7a5710d5e63"} err="failed to get container status \"5ac68317c315960ec899ae3f37574d3591926ff9d8914ecc920ad7a5710d5e63\": rpc error: code = NotFound desc = could not find container \"5ac68317c315960ec899ae3f37574d3591926ff9d8914ecc920ad7a5710d5e63\": container with ID starting with 5ac68317c315960ec899ae3f37574d3591926ff9d8914ecc920ad7a5710d5e63 not found: ID does not exist" Mar 14 08:50:35 
crc kubenswrapper[4886]: I0314 08:50:35.269661 4886 scope.go:117] "RemoveContainer" containerID="135ef659ef7adb5ba1080cffc593d8a86b8615ce3f878ff99f49cfd0a0acff14" Mar 14 08:50:35 crc kubenswrapper[4886]: E0314 08:50:35.309312 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"135ef659ef7adb5ba1080cffc593d8a86b8615ce3f878ff99f49cfd0a0acff14\": container with ID starting with 135ef659ef7adb5ba1080cffc593d8a86b8615ce3f878ff99f49cfd0a0acff14 not found: ID does not exist" containerID="135ef659ef7adb5ba1080cffc593d8a86b8615ce3f878ff99f49cfd0a0acff14" Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.309398 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"135ef659ef7adb5ba1080cffc593d8a86b8615ce3f878ff99f49cfd0a0acff14"} err="failed to get container status \"135ef659ef7adb5ba1080cffc593d8a86b8615ce3f878ff99f49cfd0a0acff14\": rpc error: code = NotFound desc = could not find container \"135ef659ef7adb5ba1080cffc593d8a86b8615ce3f878ff99f49cfd0a0acff14\": container with ID starting with 135ef659ef7adb5ba1080cffc593d8a86b8615ce3f878ff99f49cfd0a0acff14 not found: ID does not exist" Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.335649 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.405681 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.443423 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6624fe29-e15e-4474-a2d9-37489c04e1b6" path="/var/lib/kubelet/pods/6624fe29-e15e-4474-a2d9-37489c04e1b6/volumes" Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.444555 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="777c1eca-e276-4de3-9523-ee72e0891b05" 
path="/var/lib/kubelet/pods/777c1eca-e276-4de3-9523-ee72e0891b05/volumes" Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.445554 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9eb9137-a021-4ea6-a4a4-871cf81af732" path="/var/lib/kubelet/pods/a9eb9137-a021-4ea6-a4a4-871cf81af732/volumes" Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.723501 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5694fd5cb9-r8nzz" event={"ID":"5055b057-f745-43b5-8bd2-937ed8b29743","Type":"ContainerStarted","Data":"c448c75b64d57262c74ba8a6f06e09383c05dd98ecac3da89d529d50fff017ed"} Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.723993 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5694fd5cb9-r8nzz" Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.727933 4886 generic.go:334] "Generic (PLEG): container finished" podID="9f9329a3-9a35-49a2-86ce-435b98d280f3" containerID="ddfe2022d1041feed4b3664b5ffa4eac881559b738c1046c39be568571c63d8f" exitCode=0 Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.727985 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c5789dc8f-4vv5f" event={"ID":"9f9329a3-9a35-49a2-86ce-435b98d280f3","Type":"ContainerDied","Data":"ddfe2022d1041feed4b3664b5ffa4eac881559b738c1046c39be568571c63d8f"} Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.757441 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5694fd5cb9-r8nzz" podStartSLOduration=4.757416385 podStartE2EDuration="4.757416385s" podCreationTimestamp="2026-03-14 08:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:50:35.743698556 +0000 UTC m=+1370.992150193" watchObservedRunningTime="2026-03-14 08:50:35.757416385 +0000 UTC m=+1371.005868022" Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 
08:50:35.758648 4886 generic.go:334] "Generic (PLEG): container finished" podID="1e54e03a-ce8b-4f7d-a664-cc11daa6c786" containerID="1bbdf1722fea1dfaef2ac5e6318ccb7639dae43c488b0857ddd5a8ddbf9807c9" exitCode=0
Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.758681 4886 generic.go:334] "Generic (PLEG): container finished" podID="1e54e03a-ce8b-4f7d-a664-cc11daa6c786" containerID="e7961e21a3760933ecc66f9afc836f2931d46adf93c83f5459e2aedfe0f1cd2a" exitCode=143
Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.758725 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5c6c468c99-bvnb4" event={"ID":"1e54e03a-ce8b-4f7d-a664-cc11daa6c786","Type":"ContainerDied","Data":"1bbdf1722fea1dfaef2ac5e6318ccb7639dae43c488b0857ddd5a8ddbf9807c9"}
Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.758752 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5c6c468c99-bvnb4" event={"ID":"1e54e03a-ce8b-4f7d-a664-cc11daa6c786","Type":"ContainerDied","Data":"e7961e21a3760933ecc66f9afc836f2931d46adf93c83f5459e2aedfe0f1cd2a"}
Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.761395 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c5789dc8f-4vv5f"
Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.767841 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7a745438-cb17-4626-96ed-51c7de75a976","Type":"ContainerStarted","Data":"a79d7e08fb9701975b941f1adf5799da7d4f33f7174b85754097a714859e03c1"}
Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.774204 4886 generic.go:334] "Generic (PLEG): container finished" podID="7a298745-dd74-4ed3-b21b-648f2adb47dc" containerID="a774d7ecff810007d347300dd5019b4603ff3a9eff33e055d7e88e41b6cbfc9e" exitCode=143
Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.774276 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-d4575995d-lfmv5" event={"ID":"7a298745-dd74-4ed3-b21b-648f2adb47dc","Type":"ContainerDied","Data":"a774d7ecff810007d347300dd5019b4603ff3a9eff33e055d7e88e41b6cbfc9e"}
Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.789741 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8f0a0921-723c-4964-86e6-f69c2c44ca41","Type":"ContainerStarted","Data":"3fbead376e9159365c2ca47876c14a0d4dd1edf86bd8a8d952d4ed35ae29a352"}
Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.836031 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xlzx\" (UniqueName: \"kubernetes.io/projected/9f9329a3-9a35-49a2-86ce-435b98d280f3-kube-api-access-2xlzx\") pod \"9f9329a3-9a35-49a2-86ce-435b98d280f3\" (UID: \"9f9329a3-9a35-49a2-86ce-435b98d280f3\") "
Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.836480 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-combined-ca-bundle\") pod \"9f9329a3-9a35-49a2-86ce-435b98d280f3\" (UID: \"9f9329a3-9a35-49a2-86ce-435b98d280f3\") "
Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.836514 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-httpd-config\") pod \"9f9329a3-9a35-49a2-86ce-435b98d280f3\" (UID: \"9f9329a3-9a35-49a2-86ce-435b98d280f3\") "
Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.836536 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-internal-tls-certs\") pod \"9f9329a3-9a35-49a2-86ce-435b98d280f3\" (UID: \"9f9329a3-9a35-49a2-86ce-435b98d280f3\") "
Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.836628 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-public-tls-certs\") pod \"9f9329a3-9a35-49a2-86ce-435b98d280f3\" (UID: \"9f9329a3-9a35-49a2-86ce-435b98d280f3\") "
Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.836673 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-ovndb-tls-certs\") pod \"9f9329a3-9a35-49a2-86ce-435b98d280f3\" (UID: \"9f9329a3-9a35-49a2-86ce-435b98d280f3\") "
Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.836704 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-config\") pod \"9f9329a3-9a35-49a2-86ce-435b98d280f3\" (UID: \"9f9329a3-9a35-49a2-86ce-435b98d280f3\") "
Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.841912 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-dc48c5bd6-xmnxc"
Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.845604 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f9329a3-9a35-49a2-86ce-435b98d280f3-kube-api-access-2xlzx" (OuterVolumeSpecName: "kube-api-access-2xlzx") pod "9f9329a3-9a35-49a2-86ce-435b98d280f3" (UID: "9f9329a3-9a35-49a2-86ce-435b98d280f3"). InnerVolumeSpecName "kube-api-access-2xlzx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.861413 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "9f9329a3-9a35-49a2-86ce-435b98d280f3" (UID: "9f9329a3-9a35-49a2-86ce-435b98d280f3"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.899540 4886 generic.go:334] "Generic (PLEG): container finished" podID="711130ba-065d-49aa-92cd-f687292f1674" containerID="eff5d350e42bd30e64d20a4da3dc344260b52b9e8f8d8fb3c6c68d8c16fb31e8" exitCode=0
Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.899677 4886 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.901757 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-477j6" event={"ID":"711130ba-065d-49aa-92cd-f687292f1674","Type":"ContainerDied","Data":"eff5d350e42bd30e64d20a4da3dc344260b52b9e8f8d8fb3c6c68d8c16fb31e8"}
Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.961514 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xlzx\" (UniqueName: \"kubernetes.io/projected/9f9329a3-9a35-49a2-86ce-435b98d280f3-kube-api-access-2xlzx\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:35 crc kubenswrapper[4886]: I0314 08:50:35.961546 4886 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.072482 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9f9329a3-9a35-49a2-86ce-435b98d280f3" (UID: "9f9329a3-9a35-49a2-86ce-435b98d280f3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.133359 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f9329a3-9a35-49a2-86ce-435b98d280f3" (UID: "9f9329a3-9a35-49a2-86ce-435b98d280f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.197576 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9f9329a3-9a35-49a2-86ce-435b98d280f3" (UID: "9f9329a3-9a35-49a2-86ce-435b98d280f3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.205903 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.205948 4886 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.205963 4886 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.227419 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5c6c468c99-bvnb4"
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.308066 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-config-data\") pod \"1e54e03a-ce8b-4f7d-a664-cc11daa6c786\" (UID: \"1e54e03a-ce8b-4f7d-a664-cc11daa6c786\") "
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.308109 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-logs\") pod \"1e54e03a-ce8b-4f7d-a664-cc11daa6c786\" (UID: \"1e54e03a-ce8b-4f7d-a664-cc11daa6c786\") "
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.308152 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-combined-ca-bundle\") pod \"1e54e03a-ce8b-4f7d-a664-cc11daa6c786\" (UID: \"1e54e03a-ce8b-4f7d-a664-cc11daa6c786\") "
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.308304 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phdg4\" (UniqueName: \"kubernetes.io/projected/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-kube-api-access-phdg4\") pod \"1e54e03a-ce8b-4f7d-a664-cc11daa6c786\" (UID: \"1e54e03a-ce8b-4f7d-a664-cc11daa6c786\") "
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.308363 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-config-data-custom\") pod \"1e54e03a-ce8b-4f7d-a664-cc11daa6c786\" (UID: \"1e54e03a-ce8b-4f7d-a664-cc11daa6c786\") "
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.311290 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-logs" (OuterVolumeSpecName: "logs") pod "1e54e03a-ce8b-4f7d-a664-cc11daa6c786" (UID: "1e54e03a-ce8b-4f7d-a664-cc11daa6c786"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.315321 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-config" (OuterVolumeSpecName: "config") pod "9f9329a3-9a35-49a2-86ce-435b98d280f3" (UID: "9f9329a3-9a35-49a2-86ce-435b98d280f3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.331233 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1e54e03a-ce8b-4f7d-a664-cc11daa6c786" (UID: "1e54e03a-ce8b-4f7d-a664-cc11daa6c786"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.382442 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-kube-api-access-phdg4" (OuterVolumeSpecName: "kube-api-access-phdg4") pod "1e54e03a-ce8b-4f7d-a664-cc11daa6c786" (UID: "1e54e03a-ce8b-4f7d-a664-cc11daa6c786"). InnerVolumeSpecName "kube-api-access-phdg4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.409306 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e54e03a-ce8b-4f7d-a664-cc11daa6c786" (UID: "1e54e03a-ce8b-4f7d-a664-cc11daa6c786"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.410193 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-combined-ca-bundle\") pod \"1e54e03a-ce8b-4f7d-a664-cc11daa6c786\" (UID: \"1e54e03a-ce8b-4f7d-a664-cc11daa6c786\") "
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.410758 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-logs\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.410776 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-config\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.410788 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phdg4\" (UniqueName: \"kubernetes.io/projected/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-kube-api-access-phdg4\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.410801 4886 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:36 crc kubenswrapper[4886]: W0314 08:50:36.410889 4886 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/1e54e03a-ce8b-4f7d-a664-cc11daa6c786/volumes/kubernetes.io~secret/combined-ca-bundle
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.410901 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e54e03a-ce8b-4f7d-a664-cc11daa6c786" (UID: "1e54e03a-ce8b-4f7d-a664-cc11daa6c786"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.424354 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "9f9329a3-9a35-49a2-86ce-435b98d280f3" (UID: "9f9329a3-9a35-49a2-86ce-435b98d280f3"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.488361 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-config-data" (OuterVolumeSpecName: "config-data") pod "1e54e03a-ce8b-4f7d-a664-cc11daa6c786" (UID: "1e54e03a-ce8b-4f7d-a664-cc11daa6c786"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.507185 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.512704 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.512729 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e54e03a-ce8b-4f7d-a664-cc11daa6c786-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.512754 4886 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f9329a3-9a35-49a2-86ce-435b98d280f3-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:36 crc kubenswrapper[4886]: W0314 08:50:36.584968 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e151f3d_c603_4f97_92ba_a079b9a5ad49.slice/crio-45bd4230080c0567cadfbe0d0d8a34fc639ae001052947c532822a38c13c70bc WatchSource:0}: Error finding container 45bd4230080c0567cadfbe0d0d8a34fc639ae001052947c532822a38c13c70bc: Status 404 returned error can't find the container with id 45bd4230080c0567cadfbe0d0d8a34fc639ae001052947c532822a38c13c70bc
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.931355 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c5789dc8f-4vv5f" event={"ID":"9f9329a3-9a35-49a2-86ce-435b98d280f3","Type":"ContainerDied","Data":"bdde15edc1885ed8336cb76b4a687705dffefc5b3ad001c2ed65a520ba273633"}
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.932799 4886 scope.go:117] "RemoveContainer" containerID="34737c15ff4a6a3340677a2fd809063ddbd01fc9f3a607e718c5b2480379f86d"
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.932280 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c5789dc8f-4vv5f"
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.945008 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5c6c468c99-bvnb4" event={"ID":"1e54e03a-ce8b-4f7d-a664-cc11daa6c786","Type":"ContainerDied","Data":"6144141fc2fab83234ee0c4180769c7a492d27ca9997aa532e9154cf117e6433"}
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.945090 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5c6c468c99-bvnb4"
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.961380 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e151f3d-c603-4f97-92ba-a079b9a5ad49","Type":"ContainerStarted","Data":"45bd4230080c0567cadfbe0d0d8a34fc639ae001052947c532822a38c13c70bc"}
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.981895 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7c5789dc8f-4vv5f"]
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.989596 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7c5789dc8f-4vv5f"]
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.992324 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8f0a0921-723c-4964-86e6-f69c2c44ca41","Type":"ContainerStarted","Data":"2029cb6fd066ddae3ca6e6efb4965355e0231952c7520f53c43aed8e700e69ef"}
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.992471 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="8f0a0921-723c-4964-86e6-f69c2c44ca41" containerName="cinder-api-log" containerID="cri-o://3fbead376e9159365c2ca47876c14a0d4dd1edf86bd8a8d952d4ed35ae29a352" gracePeriod=30
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.992652 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 14 08:50:36 crc kubenswrapper[4886]: I0314 08:50:36.992689 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="8f0a0921-723c-4964-86e6-f69c2c44ca41" containerName="cinder-api" containerID="cri-o://2029cb6fd066ddae3ca6e6efb4965355e0231952c7520f53c43aed8e700e69ef" gracePeriod=30
Mar 14 08:50:37 crc kubenswrapper[4886]: I0314 08:50:37.034757 4886 scope.go:117] "RemoveContainer" containerID="ddfe2022d1041feed4b3664b5ffa4eac881559b738c1046c39be568571c63d8f"
Mar 14 08:50:37 crc kubenswrapper[4886]: I0314 08:50:37.040752 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-477j6" event={"ID":"711130ba-065d-49aa-92cd-f687292f1674","Type":"ContainerStarted","Data":"5b06985dc1e48ba1f2f263f41b90df4b6f2b22d82ca1bbec83afddbc5e424d2a"}
Mar 14 08:50:37 crc kubenswrapper[4886]: I0314 08:50:37.040787 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-5c6c468c99-bvnb4"]
Mar 14 08:50:37 crc kubenswrapper[4886]: I0314 08:50:37.041924 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb4fc677f-477j6"
Mar 14 08:50:37 crc kubenswrapper[4886]: I0314 08:50:37.079374 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-5c6c468c99-bvnb4"]
Mar 14 08:50:37 crc kubenswrapper[4886]: I0314 08:50:37.079491 4886 scope.go:117] "RemoveContainer" containerID="1bbdf1722fea1dfaef2ac5e6318ccb7639dae43c488b0857ddd5a8ddbf9807c9"
Mar 14 08:50:37 crc kubenswrapper[4886]: I0314 08:50:37.090722 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.090701487 podStartE2EDuration="6.090701487s" podCreationTimestamp="2026-03-14 08:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:50:37.056332051 +0000 UTC m=+1372.304783688" watchObservedRunningTime="2026-03-14 08:50:37.090701487 +0000 UTC m=+1372.339153124"
Mar 14 08:50:37 crc kubenswrapper[4886]: I0314 08:50:37.099851 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb4fc677f-477j6" podStartSLOduration=6.099829836 podStartE2EDuration="6.099829836s" podCreationTimestamp="2026-03-14 08:50:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:50:37.08130237 +0000 UTC m=+1372.329754007" watchObservedRunningTime="2026-03-14 08:50:37.099829836 +0000 UTC m=+1372.348281473"
Mar 14 08:50:37 crc kubenswrapper[4886]: I0314 08:50:37.136355 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6fd5f87754-lf26d"
Mar 14 08:50:37 crc kubenswrapper[4886]: I0314 08:50:37.156531 4886 scope.go:117] "RemoveContainer" containerID="e7961e21a3760933ecc66f9afc836f2931d46adf93c83f5459e2aedfe0f1cd2a"
Mar 14 08:50:37 crc kubenswrapper[4886]: I0314 08:50:37.459797 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e54e03a-ce8b-4f7d-a664-cc11daa6c786" path="/var/lib/kubelet/pods/1e54e03a-ce8b-4f7d-a664-cc11daa6c786/volumes"
Mar 14 08:50:37 crc kubenswrapper[4886]: I0314 08:50:37.460859 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f9329a3-9a35-49a2-86ce-435b98d280f3" path="/var/lib/kubelet/pods/9f9329a3-9a35-49a2-86ce-435b98d280f3/volumes"
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.081854 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7a745438-cb17-4626-96ed-51c7de75a976","Type":"ContainerStarted","Data":"e833dbe9b3ae7596bf3e91ad7f9d57b44d6d40bced33f3c4bcc37011132b92f9"}
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.087522 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e151f3d-c603-4f97-92ba-a079b9a5ad49","Type":"ContainerStarted","Data":"dd5387f87bdc471975f7312c9c25f8508cbf38d02865eb3750cdce2f2673637d"}
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.094467 4886 generic.go:334] "Generic (PLEG): container finished" podID="8f0a0921-723c-4964-86e6-f69c2c44ca41" containerID="2029cb6fd066ddae3ca6e6efb4965355e0231952c7520f53c43aed8e700e69ef" exitCode=0
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.094503 4886 generic.go:334] "Generic (PLEG): container finished" podID="8f0a0921-723c-4964-86e6-f69c2c44ca41" containerID="3fbead376e9159365c2ca47876c14a0d4dd1edf86bd8a8d952d4ed35ae29a352" exitCode=143
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.094531 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8f0a0921-723c-4964-86e6-f69c2c44ca41","Type":"ContainerDied","Data":"2029cb6fd066ddae3ca6e6efb4965355e0231952c7520f53c43aed8e700e69ef"}
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.094564 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8f0a0921-723c-4964-86e6-f69c2c44ca41","Type":"ContainerDied","Data":"3fbead376e9159365c2ca47876c14a0d4dd1edf86bd8a8d952d4ed35ae29a352"}
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.160048 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.175863 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.186879 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.822293938 podStartE2EDuration="7.186849407s" podCreationTimestamp="2026-03-14 08:50:31 +0000 UTC" firstStartedPulling="2026-03-14 08:50:32.674279135 +0000 UTC m=+1367.922730772" lastFinishedPulling="2026-03-14 08:50:34.038834604 +0000 UTC m=+1369.287286241" observedRunningTime="2026-03-14 08:50:38.116552991 +0000 UTC m=+1373.365004628" watchObservedRunningTime="2026-03-14 08:50:38.186849407 +0000 UTC m=+1373.435301044"
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.279075 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7rrv\" (UniqueName: \"kubernetes.io/projected/8f0a0921-723c-4964-86e6-f69c2c44ca41-kube-api-access-h7rrv\") pod \"8f0a0921-723c-4964-86e6-f69c2c44ca41\" (UID: \"8f0a0921-723c-4964-86e6-f69c2c44ca41\") "
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.279160 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f0a0921-723c-4964-86e6-f69c2c44ca41-logs\") pod \"8f0a0921-723c-4964-86e6-f69c2c44ca41\" (UID: \"8f0a0921-723c-4964-86e6-f69c2c44ca41\") "
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.279215 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f0a0921-723c-4964-86e6-f69c2c44ca41-etc-machine-id\") pod \"8f0a0921-723c-4964-86e6-f69c2c44ca41\" (UID: \"8f0a0921-723c-4964-86e6-f69c2c44ca41\") "
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.279366 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f0a0921-723c-4964-86e6-f69c2c44ca41-scripts\") pod \"8f0a0921-723c-4964-86e6-f69c2c44ca41\" (UID: \"8f0a0921-723c-4964-86e6-f69c2c44ca41\") "
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.279481 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f0a0921-723c-4964-86e6-f69c2c44ca41-config-data-custom\") pod \"8f0a0921-723c-4964-86e6-f69c2c44ca41\" (UID: \"8f0a0921-723c-4964-86e6-f69c2c44ca41\") "
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.279523 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f0a0921-723c-4964-86e6-f69c2c44ca41-config-data\") pod \"8f0a0921-723c-4964-86e6-f69c2c44ca41\" (UID: \"8f0a0921-723c-4964-86e6-f69c2c44ca41\") "
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.279549 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f0a0921-723c-4964-86e6-f69c2c44ca41-combined-ca-bundle\") pod \"8f0a0921-723c-4964-86e6-f69c2c44ca41\" (UID: \"8f0a0921-723c-4964-86e6-f69c2c44ca41\") "
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.284225 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f0a0921-723c-4964-86e6-f69c2c44ca41-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8f0a0921-723c-4964-86e6-f69c2c44ca41" (UID: "8f0a0921-723c-4964-86e6-f69c2c44ca41"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.284467 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f0a0921-723c-4964-86e6-f69c2c44ca41-logs" (OuterVolumeSpecName: "logs") pod "8f0a0921-723c-4964-86e6-f69c2c44ca41" (UID: "8f0a0921-723c-4964-86e6-f69c2c44ca41"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.292546 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f0a0921-723c-4964-86e6-f69c2c44ca41-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8f0a0921-723c-4964-86e6-f69c2c44ca41" (UID: "8f0a0921-723c-4964-86e6-f69c2c44ca41"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.326497 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f0a0921-723c-4964-86e6-f69c2c44ca41-kube-api-access-h7rrv" (OuterVolumeSpecName: "kube-api-access-h7rrv") pod "8f0a0921-723c-4964-86e6-f69c2c44ca41" (UID: "8f0a0921-723c-4964-86e6-f69c2c44ca41"). InnerVolumeSpecName "kube-api-access-h7rrv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.328244 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f0a0921-723c-4964-86e6-f69c2c44ca41-scripts" (OuterVolumeSpecName: "scripts") pod "8f0a0921-723c-4964-86e6-f69c2c44ca41" (UID: "8f0a0921-723c-4964-86e6-f69c2c44ca41"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.383019 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f0a0921-723c-4964-86e6-f69c2c44ca41-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.383047 4886 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f0a0921-723c-4964-86e6-f69c2c44ca41-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.383064 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7rrv\" (UniqueName: \"kubernetes.io/projected/8f0a0921-723c-4964-86e6-f69c2c44ca41-kube-api-access-h7rrv\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.383075 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f0a0921-723c-4964-86e6-f69c2c44ca41-logs\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.383085 4886 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f0a0921-723c-4964-86e6-f69c2c44ca41-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.395445 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f0a0921-723c-4964-86e6-f69c2c44ca41-config-data" (OuterVolumeSpecName: "config-data") pod "8f0a0921-723c-4964-86e6-f69c2c44ca41" (UID: "8f0a0921-723c-4964-86e6-f69c2c44ca41"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.410266 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f0a0921-723c-4964-86e6-f69c2c44ca41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f0a0921-723c-4964-86e6-f69c2c44ca41" (UID: "8f0a0921-723c-4964-86e6-f69c2c44ca41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.485182 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f0a0921-723c-4964-86e6-f69c2c44ca41-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:38 crc kubenswrapper[4886]: I0314 08:50:38.485214 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f0a0921-723c-4964-86e6-f69c2c44ca41-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.113354 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8f0a0921-723c-4964-86e6-f69c2c44ca41","Type":"ContainerDied","Data":"064f31c3ef8894655d1050ab693a30bce4736e21dbe7a04ca8baded92d6c7ad6"}
Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.113774 4886 scope.go:117] "RemoveContainer" containerID="2029cb6fd066ddae3ca6e6efb4965355e0231952c7520f53c43aed8e700e69ef"
Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.113683 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.148733 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e151f3d-c603-4f97-92ba-a079b9a5ad49","Type":"ContainerStarted","Data":"c16075b44bb57be8e34f305e8246a2bec54c6a3ff58abe943614ba44ba38929c"}
Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.244823 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.250669 4886 scope.go:117] "RemoveContainer" containerID="3fbead376e9159365c2ca47876c14a0d4dd1edf86bd8a8d952d4ed35ae29a352"
Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.257717 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.271996 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Mar 14 08:50:39 crc kubenswrapper[4886]: E0314 08:50:39.272413 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0a0921-723c-4964-86e6-f69c2c44ca41" containerName="cinder-api-log"
Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.272429 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0a0921-723c-4964-86e6-f69c2c44ca41" containerName="cinder-api-log"
Mar 14 08:50:39 crc kubenswrapper[4886]: E0314 08:50:39.272443 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f9329a3-9a35-49a2-86ce-435b98d280f3" containerName="neutron-httpd"
Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.272448 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f9329a3-9a35-49a2-86ce-435b98d280f3" containerName="neutron-httpd"
Mar 14 08:50:39 crc kubenswrapper[4886]: E0314 08:50:39.272460 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f9329a3-9a35-49a2-86ce-435b98d280f3" containerName="neutron-api"
Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.272467 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f9329a3-9a35-49a2-86ce-435b98d280f3" containerName="neutron-api"
Mar 14 08:50:39 crc kubenswrapper[4886]: E0314 08:50:39.272484 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e54e03a-ce8b-4f7d-a664-cc11daa6c786" containerName="barbican-worker"
Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.272490 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e54e03a-ce8b-4f7d-a664-cc11daa6c786" containerName="barbican-worker"
Mar 14 08:50:39 crc kubenswrapper[4886]: E0314 08:50:39.272503 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e54e03a-ce8b-4f7d-a664-cc11daa6c786" containerName="barbican-worker-log"
Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.272510 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e54e03a-ce8b-4f7d-a664-cc11daa6c786" containerName="barbican-worker-log"
Mar 14 08:50:39 crc kubenswrapper[4886]: E0314 08:50:39.272524 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0a0921-723c-4964-86e6-f69c2c44ca41" containerName="cinder-api"
Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.272529 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0a0921-723c-4964-86e6-f69c2c44ca41" containerName="cinder-api"
Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.272698 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f0a0921-723c-4964-86e6-f69c2c44ca41" containerName="cinder-api-log"
Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.272709 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f9329a3-9a35-49a2-86ce-435b98d280f3" containerName="neutron-httpd"
Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.272720 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f0a0921-723c-4964-86e6-f69c2c44ca41" containerName="cinder-api"
Mar 14 08:50:39 crc kubenswrapper[4886]:
I0314 08:50:39.272734 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e54e03a-ce8b-4f7d-a664-cc11daa6c786" containerName="barbican-worker-log" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.272746 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e54e03a-ce8b-4f7d-a664-cc11daa6c786" containerName="barbican-worker" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.272758 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f9329a3-9a35-49a2-86ce-435b98d280f3" containerName="neutron-api" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.273791 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.276802 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.277080 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.277250 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.288593 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.316670 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0496ade0-0884-4a88-a226-5145b6396213-logs\") pod \"cinder-api-0\" (UID: \"0496ade0-0884-4a88-a226-5145b6396213\") " pod="openstack/cinder-api-0" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.321635 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/0496ade0-0884-4a88-a226-5145b6396213-config-data-custom\") pod \"cinder-api-0\" (UID: \"0496ade0-0884-4a88-a226-5145b6396213\") " pod="openstack/cinder-api-0" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.321787 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0496ade0-0884-4a88-a226-5145b6396213-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"0496ade0-0884-4a88-a226-5145b6396213\") " pod="openstack/cinder-api-0" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.321902 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0496ade0-0884-4a88-a226-5145b6396213-config-data\") pod \"cinder-api-0\" (UID: \"0496ade0-0884-4a88-a226-5145b6396213\") " pod="openstack/cinder-api-0" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.322007 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0496ade0-0884-4a88-a226-5145b6396213-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0496ade0-0884-4a88-a226-5145b6396213\") " pod="openstack/cinder-api-0" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.322094 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0496ade0-0884-4a88-a226-5145b6396213-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0496ade0-0884-4a88-a226-5145b6396213\") " pod="openstack/cinder-api-0" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.322256 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0496ade0-0884-4a88-a226-5145b6396213-public-tls-certs\") pod \"cinder-api-0\" (UID: 
\"0496ade0-0884-4a88-a226-5145b6396213\") " pod="openstack/cinder-api-0" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.322403 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0496ade0-0884-4a88-a226-5145b6396213-scripts\") pod \"cinder-api-0\" (UID: \"0496ade0-0884-4a88-a226-5145b6396213\") " pod="openstack/cinder-api-0" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.322618 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr4qm\" (UniqueName: \"kubernetes.io/projected/0496ade0-0884-4a88-a226-5145b6396213-kube-api-access-lr4qm\") pod \"cinder-api-0\" (UID: \"0496ade0-0884-4a88-a226-5145b6396213\") " pod="openstack/cinder-api-0" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.424452 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0496ade0-0884-4a88-a226-5145b6396213-scripts\") pod \"cinder-api-0\" (UID: \"0496ade0-0884-4a88-a226-5145b6396213\") " pod="openstack/cinder-api-0" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.424844 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr4qm\" (UniqueName: \"kubernetes.io/projected/0496ade0-0884-4a88-a226-5145b6396213-kube-api-access-lr4qm\") pod \"cinder-api-0\" (UID: \"0496ade0-0884-4a88-a226-5145b6396213\") " pod="openstack/cinder-api-0" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.424887 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0496ade0-0884-4a88-a226-5145b6396213-logs\") pod \"cinder-api-0\" (UID: \"0496ade0-0884-4a88-a226-5145b6396213\") " pod="openstack/cinder-api-0" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.424942 4886 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0496ade0-0884-4a88-a226-5145b6396213-config-data-custom\") pod \"cinder-api-0\" (UID: \"0496ade0-0884-4a88-a226-5145b6396213\") " pod="openstack/cinder-api-0" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.424981 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0496ade0-0884-4a88-a226-5145b6396213-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"0496ade0-0884-4a88-a226-5145b6396213\") " pod="openstack/cinder-api-0" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.425017 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0496ade0-0884-4a88-a226-5145b6396213-config-data\") pod \"cinder-api-0\" (UID: \"0496ade0-0884-4a88-a226-5145b6396213\") " pod="openstack/cinder-api-0" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.425046 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0496ade0-0884-4a88-a226-5145b6396213-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0496ade0-0884-4a88-a226-5145b6396213\") " pod="openstack/cinder-api-0" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.425104 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0496ade0-0884-4a88-a226-5145b6396213-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0496ade0-0884-4a88-a226-5145b6396213\") " pod="openstack/cinder-api-0" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.425184 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0496ade0-0884-4a88-a226-5145b6396213-public-tls-certs\") pod \"cinder-api-0\" (UID: \"0496ade0-0884-4a88-a226-5145b6396213\") " 
pod="openstack/cinder-api-0" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.428718 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0496ade0-0884-4a88-a226-5145b6396213-logs\") pod \"cinder-api-0\" (UID: \"0496ade0-0884-4a88-a226-5145b6396213\") " pod="openstack/cinder-api-0" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.428782 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0496ade0-0884-4a88-a226-5145b6396213-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0496ade0-0884-4a88-a226-5145b6396213\") " pod="openstack/cinder-api-0" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.438492 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f0a0921-723c-4964-86e6-f69c2c44ca41" path="/var/lib/kubelet/pods/8f0a0921-723c-4964-86e6-f69c2c44ca41/volumes" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.440873 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0496ade0-0884-4a88-a226-5145b6396213-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"0496ade0-0884-4a88-a226-5145b6396213\") " pod="openstack/cinder-api-0" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.441903 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0496ade0-0884-4a88-a226-5145b6396213-config-data\") pod \"cinder-api-0\" (UID: \"0496ade0-0884-4a88-a226-5145b6396213\") " pod="openstack/cinder-api-0" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.444616 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0496ade0-0884-4a88-a226-5145b6396213-scripts\") pod \"cinder-api-0\" (UID: \"0496ade0-0884-4a88-a226-5145b6396213\") " pod="openstack/cinder-api-0" Mar 14 08:50:39 
crc kubenswrapper[4886]: I0314 08:50:39.445020 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0496ade0-0884-4a88-a226-5145b6396213-public-tls-certs\") pod \"cinder-api-0\" (UID: \"0496ade0-0884-4a88-a226-5145b6396213\") " pod="openstack/cinder-api-0" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.445213 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0496ade0-0884-4a88-a226-5145b6396213-config-data-custom\") pod \"cinder-api-0\" (UID: \"0496ade0-0884-4a88-a226-5145b6396213\") " pod="openstack/cinder-api-0" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.447705 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0496ade0-0884-4a88-a226-5145b6396213-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0496ade0-0884-4a88-a226-5145b6396213\") " pod="openstack/cinder-api-0" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.452675 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr4qm\" (UniqueName: \"kubernetes.io/projected/0496ade0-0884-4a88-a226-5145b6396213-kube-api-access-lr4qm\") pod \"cinder-api-0\" (UID: \"0496ade0-0884-4a88-a226-5145b6396213\") " pod="openstack/cinder-api-0" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.452958 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-dc48c5bd6-xmnxc" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.526618 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6fd5f87754-lf26d"] Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.526850 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6fd5f87754-lf26d" podUID="00edb071-fbb5-4e88-8370-a2c76ad13a6c" 
containerName="barbican-api-log" containerID="cri-o://33ef06fc60d6075241b9c87a3f787f8ce5d96bfd1cd8bd009555ebed64f9a14e" gracePeriod=30 Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.527288 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6fd5f87754-lf26d" podUID="00edb071-fbb5-4e88-8370-a2c76ad13a6c" containerName="barbican-api" containerID="cri-o://8c743e44789d946fe248b9fb8b21d0de73cd5e05715b311be36c45268e7aad22" gracePeriod=30 Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.538468 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6fd5f87754-lf26d" podUID="00edb071-fbb5-4e88-8370-a2c76ad13a6c" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.187:9311/healthcheck\": EOF" Mar 14 08:50:39 crc kubenswrapper[4886]: I0314 08:50:39.665783 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 14 08:50:40 crc kubenswrapper[4886]: I0314 08:50:40.159411 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e151f3d-c603-4f97-92ba-a079b9a5ad49","Type":"ContainerStarted","Data":"0cca71b50bf4f2e2ecb190fb38e893cf978a12c89e0e3f682d31807923123447"} Mar 14 08:50:40 crc kubenswrapper[4886]: I0314 08:50:40.161406 4886 generic.go:334] "Generic (PLEG): container finished" podID="00edb071-fbb5-4e88-8370-a2c76ad13a6c" containerID="33ef06fc60d6075241b9c87a3f787f8ce5d96bfd1cd8bd009555ebed64f9a14e" exitCode=143 Mar 14 08:50:40 crc kubenswrapper[4886]: I0314 08:50:40.161466 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fd5f87754-lf26d" event={"ID":"00edb071-fbb5-4e88-8370-a2c76ad13a6c","Type":"ContainerDied","Data":"33ef06fc60d6075241b9c87a3f787f8ce5d96bfd1cd8bd009555ebed64f9a14e"} Mar 14 08:50:40 crc kubenswrapper[4886]: I0314 08:50:40.186767 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-api-0"] Mar 14 08:50:40 crc kubenswrapper[4886]: W0314 08:50:40.190067 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0496ade0_0884_4a88_a226_5145b6396213.slice/crio-cda81eb58dff85e6d95636666320347c0d12844e5f353c182802902ddff83759 WatchSource:0}: Error finding container cda81eb58dff85e6d95636666320347c0d12844e5f353c182802902ddff83759: Status 404 returned error can't find the container with id cda81eb58dff85e6d95636666320347c0d12844e5f353c182802902ddff83759 Mar 14 08:50:40 crc kubenswrapper[4886]: I0314 08:50:40.398933 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Mar 14 08:50:40 crc kubenswrapper[4886]: I0314 08:50:40.411972 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.053604 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.190432 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-config-data\") pod \"b41275dd-03d8-40b8-9f06-0dc67ecb12e6\" (UID: \"b41275dd-03d8-40b8-9f06-0dc67ecb12e6\") " Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.190474 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7lx7\" (UniqueName: \"kubernetes.io/projected/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-kube-api-access-l7lx7\") pod \"b41275dd-03d8-40b8-9f06-0dc67ecb12e6\" (UID: \"b41275dd-03d8-40b8-9f06-0dc67ecb12e6\") " Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.190744 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-custom-prometheus-ca\") pod \"b41275dd-03d8-40b8-9f06-0dc67ecb12e6\" (UID: \"b41275dd-03d8-40b8-9f06-0dc67ecb12e6\") " Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.190796 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-logs\") pod \"b41275dd-03d8-40b8-9f06-0dc67ecb12e6\" (UID: \"b41275dd-03d8-40b8-9f06-0dc67ecb12e6\") " Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.190881 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-combined-ca-bundle\") pod \"b41275dd-03d8-40b8-9f06-0dc67ecb12e6\" (UID: \"b41275dd-03d8-40b8-9f06-0dc67ecb12e6\") " Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.192850 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-logs" (OuterVolumeSpecName: "logs") pod "b41275dd-03d8-40b8-9f06-0dc67ecb12e6" (UID: "b41275dd-03d8-40b8-9f06-0dc67ecb12e6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.195212 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-logs\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.200280 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-kube-api-access-l7lx7" (OuterVolumeSpecName: "kube-api-access-l7lx7") pod "b41275dd-03d8-40b8-9f06-0dc67ecb12e6" (UID: "b41275dd-03d8-40b8-9f06-0dc67ecb12e6"). InnerVolumeSpecName "kube-api-access-l7lx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.218252 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0496ade0-0884-4a88-a226-5145b6396213","Type":"ContainerStarted","Data":"7cd69311d9466a9f0ce1da3c32a298fedf35105f1cf4a5838f42b421aaf4218b"} Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.218318 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0496ade0-0884-4a88-a226-5145b6396213","Type":"ContainerStarted","Data":"cda81eb58dff85e6d95636666320347c0d12844e5f353c182802902ddff83759"} Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.224910 4886 generic.go:334] "Generic (PLEG): container finished" podID="b41275dd-03d8-40b8-9f06-0dc67ecb12e6" containerID="e1778bedcc348d4d67175df99f9347a63ee4172f4ef0229c5b4f4bc1b3a5f4d6" exitCode=137 Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.225427 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" 
event={"ID":"b41275dd-03d8-40b8-9f06-0dc67ecb12e6","Type":"ContainerDied","Data":"e1778bedcc348d4d67175df99f9347a63ee4172f4ef0229c5b4f4bc1b3a5f4d6"} Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.225495 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"b41275dd-03d8-40b8-9f06-0dc67ecb12e6","Type":"ContainerDied","Data":"3218e5ea026ad53abd6ba2970afb27d44ad4d20400d9589d591ae4d8fdac89a6"} Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.225514 4886 scope.go:117] "RemoveContainer" containerID="e1778bedcc348d4d67175df99f9347a63ee4172f4ef0229c5b4f4bc1b3a5f4d6" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.228846 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.245595 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b41275dd-03d8-40b8-9f06-0dc67ecb12e6" (UID: "b41275dd-03d8-40b8-9f06-0dc67ecb12e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.246250 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "b41275dd-03d8-40b8-9f06-0dc67ecb12e6" (UID: "b41275dd-03d8-40b8-9f06-0dc67ecb12e6"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.250026 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.297585 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-config-data" (OuterVolumeSpecName: "config-data") pod "b41275dd-03d8-40b8-9f06-0dc67ecb12e6" (UID: "b41275dd-03d8-40b8-9f06-0dc67ecb12e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.306536 4886 scope.go:117] "RemoveContainer" containerID="e1778bedcc348d4d67175df99f9347a63ee4172f4ef0229c5b4f4bc1b3a5f4d6" Mar 14 08:50:41 crc kubenswrapper[4886]: E0314 08:50:41.307580 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1778bedcc348d4d67175df99f9347a63ee4172f4ef0229c5b4f4bc1b3a5f4d6\": container with ID starting with e1778bedcc348d4d67175df99f9347a63ee4172f4ef0229c5b4f4bc1b3a5f4d6 not found: ID does not exist" containerID="e1778bedcc348d4d67175df99f9347a63ee4172f4ef0229c5b4f4bc1b3a5f4d6" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.307614 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1778bedcc348d4d67175df99f9347a63ee4172f4ef0229c5b4f4bc1b3a5f4d6"} err="failed to get container status \"e1778bedcc348d4d67175df99f9347a63ee4172f4ef0229c5b4f4bc1b3a5f4d6\": rpc error: code = NotFound desc = could not find container \"e1778bedcc348d4d67175df99f9347a63ee4172f4ef0229c5b4f4bc1b3a5f4d6\": container with ID starting with e1778bedcc348d4d67175df99f9347a63ee4172f4ef0229c5b4f4bc1b3a5f4d6 not found: ID does not exist" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.307728 4886 reconciler_common.go:293] "Volume detached for 
volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.307761 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.307773 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.307784 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7lx7\" (UniqueName: \"kubernetes.io/projected/b41275dd-03d8-40b8-9f06-0dc67ecb12e6-kube-api-access-l7lx7\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.517349 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.561165 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.588728 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.650272 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 14 08:50:41 crc kubenswrapper[4886]: E0314 08:50:41.650949 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b41275dd-03d8-40b8-9f06-0dc67ecb12e6" containerName="watcher-decision-engine" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.650973 4886 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b41275dd-03d8-40b8-9f06-0dc67ecb12e6" containerName="watcher-decision-engine" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.651308 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="b41275dd-03d8-40b8-9f06-0dc67ecb12e6" containerName="watcher-decision-engine" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.652337 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.656610 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.662917 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.726653 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf9e5273-6d41-439e-98a2-263c64a3b39b-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"cf9e5273-6d41-439e-98a2-263c64a3b39b\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.726849 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/cf9e5273-6d41-439e-98a2-263c64a3b39b-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"cf9e5273-6d41-439e-98a2-263c64a3b39b\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.726987 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf9e5273-6d41-439e-98a2-263c64a3b39b-config-data\") pod \"watcher-decision-engine-0\" (UID: \"cf9e5273-6d41-439e-98a2-263c64a3b39b\") " 
pod="openstack/watcher-decision-engine-0" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.727162 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76fgt\" (UniqueName: \"kubernetes.io/projected/cf9e5273-6d41-439e-98a2-263c64a3b39b-kube-api-access-76fgt\") pod \"watcher-decision-engine-0\" (UID: \"cf9e5273-6d41-439e-98a2-263c64a3b39b\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.727296 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf9e5273-6d41-439e-98a2-263c64a3b39b-logs\") pod \"watcher-decision-engine-0\" (UID: \"cf9e5273-6d41-439e-98a2-263c64a3b39b\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.778324 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.828858 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf9e5273-6d41-439e-98a2-263c64a3b39b-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"cf9e5273-6d41-439e-98a2-263c64a3b39b\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.828915 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/cf9e5273-6d41-439e-98a2-263c64a3b39b-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"cf9e5273-6d41-439e-98a2-263c64a3b39b\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.828934 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cf9e5273-6d41-439e-98a2-263c64a3b39b-config-data\") pod \"watcher-decision-engine-0\" (UID: \"cf9e5273-6d41-439e-98a2-263c64a3b39b\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.828962 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76fgt\" (UniqueName: \"kubernetes.io/projected/cf9e5273-6d41-439e-98a2-263c64a3b39b-kube-api-access-76fgt\") pod \"watcher-decision-engine-0\" (UID: \"cf9e5273-6d41-439e-98a2-263c64a3b39b\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.828998 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf9e5273-6d41-439e-98a2-263c64a3b39b-logs\") pod \"watcher-decision-engine-0\" (UID: \"cf9e5273-6d41-439e-98a2-263c64a3b39b\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.829558 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf9e5273-6d41-439e-98a2-263c64a3b39b-logs\") pod \"watcher-decision-engine-0\" (UID: \"cf9e5273-6d41-439e-98a2-263c64a3b39b\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.833390 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/cf9e5273-6d41-439e-98a2-263c64a3b39b-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"cf9e5273-6d41-439e-98a2-263c64a3b39b\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.833465 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf9e5273-6d41-439e-98a2-263c64a3b39b-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: 
\"cf9e5273-6d41-439e-98a2-263c64a3b39b\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.834357 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf9e5273-6d41-439e-98a2-263c64a3b39b-config-data\") pod \"watcher-decision-engine-0\" (UID: \"cf9e5273-6d41-439e-98a2-263c64a3b39b\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.837865 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7769c88f5b-8gr9x" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.837940 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-66c6bc56b6-25jn4" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.849370 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76fgt\" (UniqueName: \"kubernetes.io/projected/cf9e5273-6d41-439e-98a2-263c64a3b39b-kube-api-access-76fgt\") pod \"watcher-decision-engine-0\" (UID: \"cf9e5273-6d41-439e-98a2-263c64a3b39b\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:50:41 crc kubenswrapper[4886]: I0314 08:50:41.975419 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Mar 14 08:50:42 crc kubenswrapper[4886]: I0314 08:50:42.257889 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0496ade0-0884-4a88-a226-5145b6396213","Type":"ContainerStarted","Data":"5f60beb89cb235d2abf8126dbecff51c72c6ef7c3dc40d936af3590210c08afd"} Mar 14 08:50:42 crc kubenswrapper[4886]: I0314 08:50:42.260005 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 14 08:50:42 crc kubenswrapper[4886]: I0314 08:50:42.267499 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e151f3d-c603-4f97-92ba-a079b9a5ad49","Type":"ContainerStarted","Data":"d1b23c5bf8ac377569ba717f02febd51f3743e72ecf41af7708420755eb691cd"} Mar 14 08:50:42 crc kubenswrapper[4886]: I0314 08:50:42.268919 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 14 08:50:42 crc kubenswrapper[4886]: I0314 08:50:42.339255 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb4fc677f-477j6" Mar 14 08:50:42 crc kubenswrapper[4886]: I0314 08:50:42.345834 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.34580595 podStartE2EDuration="3.34580595s" podCreationTimestamp="2026-03-14 08:50:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:50:42.301187214 +0000 UTC m=+1377.549638851" watchObservedRunningTime="2026-03-14 08:50:42.34580595 +0000 UTC m=+1377.594257587" Mar 14 08:50:42 crc kubenswrapper[4886]: I0314 08:50:42.417476 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 08:50:42 crc kubenswrapper[4886]: I0314 08:50:42.431059 4886 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/ceilometer-0" podStartSLOduration=3.87346638 podStartE2EDuration="8.43102747s" podCreationTimestamp="2026-03-14 08:50:34 +0000 UTC" firstStartedPulling="2026-03-14 08:50:36.594241942 +0000 UTC m=+1371.842693579" lastFinishedPulling="2026-03-14 08:50:41.151803042 +0000 UTC m=+1376.400254669" observedRunningTime="2026-03-14 08:50:42.37255617 +0000 UTC m=+1377.621007807" watchObservedRunningTime="2026-03-14 08:50:42.43102747 +0000 UTC m=+1377.679479107" Mar 14 08:50:42 crc kubenswrapper[4886]: I0314 08:50:42.456702 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-m899h"] Mar 14 08:50:42 crc kubenswrapper[4886]: I0314 08:50:42.456940 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57c957c4ff-m899h" podUID="9a3ebe21-0432-4b40-8e80-5369de346831" containerName="dnsmasq-dns" containerID="cri-o://92840dcb3f9dd224bfb38b1476258236f280a5f4b9633eed06d3b2e17caeddc9" gracePeriod=10 Mar 14 08:50:42 crc kubenswrapper[4886]: I0314 08:50:42.719626 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 14 08:50:42 crc kubenswrapper[4886]: I0314 08:50:42.982816 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6fd5f87754-lf26d" podUID="00edb071-fbb5-4e88-8370-a2c76ad13a6c" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.187:9311/healthcheck\": read tcp 10.217.0.2:39740->10.217.0.187:9311: read: connection reset by peer" Mar 14 08:50:42 crc kubenswrapper[4886]: I0314 08:50:42.983216 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6fd5f87754-lf26d" podUID="00edb071-fbb5-4e88-8370-a2c76ad13a6c" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.187:9311/healthcheck\": read tcp 10.217.0.2:39726->10.217.0.187:9311: read: connection reset by peer" Mar 14 08:50:43 crc kubenswrapper[4886]: 
I0314 08:50:43.175751 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-m899h" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.261508 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-config\") pod \"9a3ebe21-0432-4b40-8e80-5369de346831\" (UID: \"9a3ebe21-0432-4b40-8e80-5369de346831\") " Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.261671 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-ovsdbserver-nb\") pod \"9a3ebe21-0432-4b40-8e80-5369de346831\" (UID: \"9a3ebe21-0432-4b40-8e80-5369de346831\") " Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.261707 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-dns-svc\") pod \"9a3ebe21-0432-4b40-8e80-5369de346831\" (UID: \"9a3ebe21-0432-4b40-8e80-5369de346831\") " Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.261750 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-ovsdbserver-sb\") pod \"9a3ebe21-0432-4b40-8e80-5369de346831\" (UID: \"9a3ebe21-0432-4b40-8e80-5369de346831\") " Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.262030 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-dns-swift-storage-0\") pod \"9a3ebe21-0432-4b40-8e80-5369de346831\" (UID: \"9a3ebe21-0432-4b40-8e80-5369de346831\") " Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.262064 4886 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-8pvbh\" (UniqueName: \"kubernetes.io/projected/9a3ebe21-0432-4b40-8e80-5369de346831-kube-api-access-8pvbh\") pod \"9a3ebe21-0432-4b40-8e80-5369de346831\" (UID: \"9a3ebe21-0432-4b40-8e80-5369de346831\") " Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.277597 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a3ebe21-0432-4b40-8e80-5369de346831-kube-api-access-8pvbh" (OuterVolumeSpecName: "kube-api-access-8pvbh") pod "9a3ebe21-0432-4b40-8e80-5369de346831" (UID: "9a3ebe21-0432-4b40-8e80-5369de346831"). InnerVolumeSpecName "kube-api-access-8pvbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.334431 4886 generic.go:334] "Generic (PLEG): container finished" podID="9a3ebe21-0432-4b40-8e80-5369de346831" containerID="92840dcb3f9dd224bfb38b1476258236f280a5f4b9633eed06d3b2e17caeddc9" exitCode=0 Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.334555 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-m899h" event={"ID":"9a3ebe21-0432-4b40-8e80-5369de346831","Type":"ContainerDied","Data":"92840dcb3f9dd224bfb38b1476258236f280a5f4b9633eed06d3b2e17caeddc9"} Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.334589 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-m899h" event={"ID":"9a3ebe21-0432-4b40-8e80-5369de346831","Type":"ContainerDied","Data":"aaf1e8077c9835a1b104157a36cf8544d0fb0152ff1f2c03566426e11c98b7f5"} Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.334626 4886 scope.go:117] "RemoveContainer" containerID="92840dcb3f9dd224bfb38b1476258236f280a5f4b9633eed06d3b2e17caeddc9" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.334843 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-m899h" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.337002 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-config" (OuterVolumeSpecName: "config") pod "9a3ebe21-0432-4b40-8e80-5369de346831" (UID: "9a3ebe21-0432-4b40-8e80-5369de346831"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.365227 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pvbh\" (UniqueName: \"kubernetes.io/projected/9a3ebe21-0432-4b40-8e80-5369de346831-kube-api-access-8pvbh\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.365265 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.391460 4886 scope.go:117] "RemoveContainer" containerID="86486c9f958fd6f5cf344f52aa0002d88494ee76f66bcff3448ce6ec92deec01" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.400304 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9a3ebe21-0432-4b40-8e80-5369de346831" (UID: "9a3ebe21-0432-4b40-8e80-5369de346831"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.409411 4886 generic.go:334] "Generic (PLEG): container finished" podID="00edb071-fbb5-4e88-8370-a2c76ad13a6c" containerID="8c743e44789d946fe248b9fb8b21d0de73cd5e05715b311be36c45268e7aad22" exitCode=0 Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.409482 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fd5f87754-lf26d" event={"ID":"00edb071-fbb5-4e88-8370-a2c76ad13a6c","Type":"ContainerDied","Data":"8c743e44789d946fe248b9fb8b21d0de73cd5e05715b311be36c45268e7aad22"} Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.411811 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9a3ebe21-0432-4b40-8e80-5369de346831" (UID: "9a3ebe21-0432-4b40-8e80-5369de346831"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.416923 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"cf9e5273-6d41-439e-98a2-263c64a3b39b","Type":"ContainerStarted","Data":"86378c1e127706a7e9da49d37044a06cd0930fa54daf3cab70a3d9e7c6d19316"} Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.416979 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"cf9e5273-6d41-439e-98a2-263c64a3b39b","Type":"ContainerStarted","Data":"fbcc930d9cc58764687ece8f172f12d2771b34a57db49adb3c11e62f5d2b5723"} Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.417168 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7a745438-cb17-4626-96ed-51c7de75a976" containerName="cinder-scheduler" 
containerID="cri-o://a79d7e08fb9701975b941f1adf5799da7d4f33f7174b85754097a714859e03c1" gracePeriod=30 Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.417365 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7a745438-cb17-4626-96ed-51c7de75a976" containerName="probe" containerID="cri-o://e833dbe9b3ae7596bf3e91ad7f9d57b44d6d40bced33f3c4bcc37011132b92f9" gracePeriod=30 Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.430625 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9a3ebe21-0432-4b40-8e80-5369de346831" (UID: "9a3ebe21-0432-4b40-8e80-5369de346831"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.442521 4886 scope.go:117] "RemoveContainer" containerID="92840dcb3f9dd224bfb38b1476258236f280a5f4b9633eed06d3b2e17caeddc9" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.443536 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b41275dd-03d8-40b8-9f06-0dc67ecb12e6" path="/var/lib/kubelet/pods/b41275dd-03d8-40b8-9f06-0dc67ecb12e6/volumes" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.443536 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9a3ebe21-0432-4b40-8e80-5369de346831" (UID: "9a3ebe21-0432-4b40-8e80-5369de346831"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:50:43 crc kubenswrapper[4886]: E0314 08:50:43.444622 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92840dcb3f9dd224bfb38b1476258236f280a5f4b9633eed06d3b2e17caeddc9\": container with ID starting with 92840dcb3f9dd224bfb38b1476258236f280a5f4b9633eed06d3b2e17caeddc9 not found: ID does not exist" containerID="92840dcb3f9dd224bfb38b1476258236f280a5f4b9633eed06d3b2e17caeddc9" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.444646 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92840dcb3f9dd224bfb38b1476258236f280a5f4b9633eed06d3b2e17caeddc9"} err="failed to get container status \"92840dcb3f9dd224bfb38b1476258236f280a5f4b9633eed06d3b2e17caeddc9\": rpc error: code = NotFound desc = could not find container \"92840dcb3f9dd224bfb38b1476258236f280a5f4b9633eed06d3b2e17caeddc9\": container with ID starting with 92840dcb3f9dd224bfb38b1476258236f280a5f4b9633eed06d3b2e17caeddc9 not found: ID does not exist" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.444664 4886 scope.go:117] "RemoveContainer" containerID="86486c9f958fd6f5cf344f52aa0002d88494ee76f66bcff3448ce6ec92deec01" Mar 14 08:50:43 crc kubenswrapper[4886]: E0314 08:50:43.444936 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86486c9f958fd6f5cf344f52aa0002d88494ee76f66bcff3448ce6ec92deec01\": container with ID starting with 86486c9f958fd6f5cf344f52aa0002d88494ee76f66bcff3448ce6ec92deec01 not found: ID does not exist" containerID="86486c9f958fd6f5cf344f52aa0002d88494ee76f66bcff3448ce6ec92deec01" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.444963 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86486c9f958fd6f5cf344f52aa0002d88494ee76f66bcff3448ce6ec92deec01"} 
err="failed to get container status \"86486c9f958fd6f5cf344f52aa0002d88494ee76f66bcff3448ce6ec92deec01\": rpc error: code = NotFound desc = could not find container \"86486c9f958fd6f5cf344f52aa0002d88494ee76f66bcff3448ce6ec92deec01\": container with ID starting with 86486c9f958fd6f5cf344f52aa0002d88494ee76f66bcff3448ce6ec92deec01 not found: ID does not exist" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.444970 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.444954566 podStartE2EDuration="2.444954566s" podCreationTimestamp="2026-03-14 08:50:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:50:43.442482096 +0000 UTC m=+1378.690933733" watchObservedRunningTime="2026-03-14 08:50:43.444954566 +0000 UTC m=+1378.693406203" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.466569 4886 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.466599 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.466609 4886 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.466617 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a3ebe21-0432-4b40-8e80-5369de346831-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" 
Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.470087 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6fd5f87754-lf26d" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.570705 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00edb071-fbb5-4e88-8370-a2c76ad13a6c-config-data\") pod \"00edb071-fbb5-4e88-8370-a2c76ad13a6c\" (UID: \"00edb071-fbb5-4e88-8370-a2c76ad13a6c\") " Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.571163 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00edb071-fbb5-4e88-8370-a2c76ad13a6c-combined-ca-bundle\") pod \"00edb071-fbb5-4e88-8370-a2c76ad13a6c\" (UID: \"00edb071-fbb5-4e88-8370-a2c76ad13a6c\") " Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.571334 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxtkk\" (UniqueName: \"kubernetes.io/projected/00edb071-fbb5-4e88-8370-a2c76ad13a6c-kube-api-access-mxtkk\") pod \"00edb071-fbb5-4e88-8370-a2c76ad13a6c\" (UID: \"00edb071-fbb5-4e88-8370-a2c76ad13a6c\") " Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.571478 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00edb071-fbb5-4e88-8370-a2c76ad13a6c-config-data-custom\") pod \"00edb071-fbb5-4e88-8370-a2c76ad13a6c\" (UID: \"00edb071-fbb5-4e88-8370-a2c76ad13a6c\") " Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.571578 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00edb071-fbb5-4e88-8370-a2c76ad13a6c-logs\") pod \"00edb071-fbb5-4e88-8370-a2c76ad13a6c\" (UID: \"00edb071-fbb5-4e88-8370-a2c76ad13a6c\") " Mar 14 08:50:43 crc kubenswrapper[4886]: 
I0314 08:50:43.584405 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00edb071-fbb5-4e88-8370-a2c76ad13a6c-kube-api-access-mxtkk" (OuterVolumeSpecName: "kube-api-access-mxtkk") pod "00edb071-fbb5-4e88-8370-a2c76ad13a6c" (UID: "00edb071-fbb5-4e88-8370-a2c76ad13a6c"). InnerVolumeSpecName "kube-api-access-mxtkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.584535 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00edb071-fbb5-4e88-8370-a2c76ad13a6c-logs" (OuterVolumeSpecName: "logs") pod "00edb071-fbb5-4e88-8370-a2c76ad13a6c" (UID: "00edb071-fbb5-4e88-8370-a2c76ad13a6c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.592766 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00edb071-fbb5-4e88-8370-a2c76ad13a6c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "00edb071-fbb5-4e88-8370-a2c76ad13a6c" (UID: "00edb071-fbb5-4e88-8370-a2c76ad13a6c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.617941 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00edb071-fbb5-4e88-8370-a2c76ad13a6c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00edb071-fbb5-4e88-8370-a2c76ad13a6c" (UID: "00edb071-fbb5-4e88-8370-a2c76ad13a6c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.646399 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00edb071-fbb5-4e88-8370-a2c76ad13a6c-config-data" (OuterVolumeSpecName: "config-data") pod "00edb071-fbb5-4e88-8370-a2c76ad13a6c" (UID: "00edb071-fbb5-4e88-8370-a2c76ad13a6c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.674283 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00edb071-fbb5-4e88-8370-a2c76ad13a6c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.674744 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxtkk\" (UniqueName: \"kubernetes.io/projected/00edb071-fbb5-4e88-8370-a2c76ad13a6c-kube-api-access-mxtkk\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.674762 4886 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00edb071-fbb5-4e88-8370-a2c76ad13a6c-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.674774 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00edb071-fbb5-4e88-8370-a2c76ad13a6c-logs\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.674785 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00edb071-fbb5-4e88-8370-a2c76ad13a6c-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:43 crc kubenswrapper[4886]: I0314 08:50:43.678188 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-m899h"] Mar 14 08:50:43 crc 
kubenswrapper[4886]: I0314 08:50:43.685951 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-m899h"] Mar 14 08:50:44 crc kubenswrapper[4886]: I0314 08:50:44.169645 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7769c88f5b-8gr9x" Mar 14 08:50:44 crc kubenswrapper[4886]: I0314 08:50:44.262285 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-66c6bc56b6-25jn4"] Mar 14 08:50:44 crc kubenswrapper[4886]: I0314 08:50:44.262535 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-66c6bc56b6-25jn4" podUID="3f8100ac-c606-4eb3-afd6-07be9de44f42" containerName="horizon-log" containerID="cri-o://d5c20c7ccfa0dac83302b656a5fa7355d00d8cae8b6c3a4e27263c26c15716cc" gracePeriod=30 Mar 14 08:50:44 crc kubenswrapper[4886]: I0314 08:50:44.263054 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-66c6bc56b6-25jn4" podUID="3f8100ac-c606-4eb3-afd6-07be9de44f42" containerName="horizon" containerID="cri-o://844e2a0189883ab734eb2b66c54a57447201e522d5837d8fbe5fc822ee14df57" gracePeriod=30 Mar 14 08:50:44 crc kubenswrapper[4886]: I0314 08:50:44.277537 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-66c6bc56b6-25jn4" podUID="3f8100ac-c606-4eb3-afd6-07be9de44f42" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.168:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Mar 14 08:50:44 crc kubenswrapper[4886]: I0314 08:50:44.430539 4886 generic.go:334] "Generic (PLEG): container finished" podID="7a745438-cb17-4626-96ed-51c7de75a976" containerID="e833dbe9b3ae7596bf3e91ad7f9d57b44d6d40bced33f3c4bcc37011132b92f9" exitCode=0 Mar 14 08:50:44 crc kubenswrapper[4886]: I0314 08:50:44.430597 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"7a745438-cb17-4626-96ed-51c7de75a976","Type":"ContainerDied","Data":"e833dbe9b3ae7596bf3e91ad7f9d57b44d6d40bced33f3c4bcc37011132b92f9"} Mar 14 08:50:44 crc kubenswrapper[4886]: I0314 08:50:44.433621 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6fd5f87754-lf26d" Mar 14 08:50:44 crc kubenswrapper[4886]: I0314 08:50:44.442198 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fd5f87754-lf26d" event={"ID":"00edb071-fbb5-4e88-8370-a2c76ad13a6c","Type":"ContainerDied","Data":"f8b727d0a646738d088eb194933142dbeb13d0d1b0d4952cb79c4814e9c4193e"} Mar 14 08:50:44 crc kubenswrapper[4886]: I0314 08:50:44.442244 4886 scope.go:117] "RemoveContainer" containerID="8c743e44789d946fe248b9fb8b21d0de73cd5e05715b311be36c45268e7aad22" Mar 14 08:50:44 crc kubenswrapper[4886]: I0314 08:50:44.463683 4886 scope.go:117] "RemoveContainer" containerID="33ef06fc60d6075241b9c87a3f787f8ce5d96bfd1cd8bd009555ebed64f9a14e" Mar 14 08:50:44 crc kubenswrapper[4886]: I0314 08:50:44.475907 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6fd5f87754-lf26d"] Mar 14 08:50:44 crc kubenswrapper[4886]: I0314 08:50:44.492181 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6fd5f87754-lf26d"] Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.127290 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.207732 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a745438-cb17-4626-96ed-51c7de75a976-config-data\") pod \"7a745438-cb17-4626-96ed-51c7de75a976\" (UID: \"7a745438-cb17-4626-96ed-51c7de75a976\") " Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.207781 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sgn9\" (UniqueName: \"kubernetes.io/projected/7a745438-cb17-4626-96ed-51c7de75a976-kube-api-access-2sgn9\") pod \"7a745438-cb17-4626-96ed-51c7de75a976\" (UID: \"7a745438-cb17-4626-96ed-51c7de75a976\") " Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.207848 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a745438-cb17-4626-96ed-51c7de75a976-combined-ca-bundle\") pod \"7a745438-cb17-4626-96ed-51c7de75a976\" (UID: \"7a745438-cb17-4626-96ed-51c7de75a976\") " Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.207966 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a745438-cb17-4626-96ed-51c7de75a976-etc-machine-id\") pod \"7a745438-cb17-4626-96ed-51c7de75a976\" (UID: \"7a745438-cb17-4626-96ed-51c7de75a976\") " Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.208056 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a745438-cb17-4626-96ed-51c7de75a976-config-data-custom\") pod \"7a745438-cb17-4626-96ed-51c7de75a976\" (UID: \"7a745438-cb17-4626-96ed-51c7de75a976\") " Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.208109 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/7a745438-cb17-4626-96ed-51c7de75a976-scripts\") pod \"7a745438-cb17-4626-96ed-51c7de75a976\" (UID: \"7a745438-cb17-4626-96ed-51c7de75a976\") " Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.208130 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a745438-cb17-4626-96ed-51c7de75a976-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7a745438-cb17-4626-96ed-51c7de75a976" (UID: "7a745438-cb17-4626-96ed-51c7de75a976"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.208590 4886 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a745438-cb17-4626-96ed-51c7de75a976-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.214910 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a745438-cb17-4626-96ed-51c7de75a976-scripts" (OuterVolumeSpecName: "scripts") pod "7a745438-cb17-4626-96ed-51c7de75a976" (UID: "7a745438-cb17-4626-96ed-51c7de75a976"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.218357 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a745438-cb17-4626-96ed-51c7de75a976-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7a745438-cb17-4626-96ed-51c7de75a976" (UID: "7a745438-cb17-4626-96ed-51c7de75a976"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.219474 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a745438-cb17-4626-96ed-51c7de75a976-kube-api-access-2sgn9" (OuterVolumeSpecName: "kube-api-access-2sgn9") pod "7a745438-cb17-4626-96ed-51c7de75a976" (UID: "7a745438-cb17-4626-96ed-51c7de75a976"). InnerVolumeSpecName "kube-api-access-2sgn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.273609 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a745438-cb17-4626-96ed-51c7de75a976-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a745438-cb17-4626-96ed-51c7de75a976" (UID: "7a745438-cb17-4626-96ed-51c7de75a976"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.310152 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sgn9\" (UniqueName: \"kubernetes.io/projected/7a745438-cb17-4626-96ed-51c7de75a976-kube-api-access-2sgn9\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.310922 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a745438-cb17-4626-96ed-51c7de75a976-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.311114 4886 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a745438-cb17-4626-96ed-51c7de75a976-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.311260 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7a745438-cb17-4626-96ed-51c7de75a976-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.311136 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a745438-cb17-4626-96ed-51c7de75a976-config-data" (OuterVolumeSpecName: "config-data") pod "7a745438-cb17-4626-96ed-51c7de75a976" (UID: "7a745438-cb17-4626-96ed-51c7de75a976"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.413294 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a745438-cb17-4626-96ed-51c7de75a976-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.437907 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00edb071-fbb5-4e88-8370-a2c76ad13a6c" path="/var/lib/kubelet/pods/00edb071-fbb5-4e88-8370-a2c76ad13a6c/volumes" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.439135 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a3ebe21-0432-4b40-8e80-5369de346831" path="/var/lib/kubelet/pods/9a3ebe21-0432-4b40-8e80-5369de346831/volumes" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.443043 4886 generic.go:334] "Generic (PLEG): container finished" podID="7a745438-cb17-4626-96ed-51c7de75a976" containerID="a79d7e08fb9701975b941f1adf5799da7d4f33f7174b85754097a714859e03c1" exitCode=0 Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.443108 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.443113 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7a745438-cb17-4626-96ed-51c7de75a976","Type":"ContainerDied","Data":"a79d7e08fb9701975b941f1adf5799da7d4f33f7174b85754097a714859e03c1"} Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.443278 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7a745438-cb17-4626-96ed-51c7de75a976","Type":"ContainerDied","Data":"34a8e812ae52f0153cc28bb93f8075bc5850e81a6a08c0f6de662f2c45e13fbf"} Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.443301 4886 scope.go:117] "RemoveContainer" containerID="e833dbe9b3ae7596bf3e91ad7f9d57b44d6d40bced33f3c4bcc37011132b92f9" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.469411 4886 scope.go:117] "RemoveContainer" containerID="a79d7e08fb9701975b941f1adf5799da7d4f33f7174b85754097a714859e03c1" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.494601 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.521339 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.528753 4886 scope.go:117] "RemoveContainer" containerID="e833dbe9b3ae7596bf3e91ad7f9d57b44d6d40bced33f3c4bcc37011132b92f9" Mar 14 08:50:45 crc kubenswrapper[4886]: E0314 08:50:45.529392 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e833dbe9b3ae7596bf3e91ad7f9d57b44d6d40bced33f3c4bcc37011132b92f9\": container with ID starting with e833dbe9b3ae7596bf3e91ad7f9d57b44d6d40bced33f3c4bcc37011132b92f9 not found: ID does not exist" containerID="e833dbe9b3ae7596bf3e91ad7f9d57b44d6d40bced33f3c4bcc37011132b92f9" Mar 14 08:50:45 crc 
kubenswrapper[4886]: I0314 08:50:45.529506 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e833dbe9b3ae7596bf3e91ad7f9d57b44d6d40bced33f3c4bcc37011132b92f9"} err="failed to get container status \"e833dbe9b3ae7596bf3e91ad7f9d57b44d6d40bced33f3c4bcc37011132b92f9\": rpc error: code = NotFound desc = could not find container \"e833dbe9b3ae7596bf3e91ad7f9d57b44d6d40bced33f3c4bcc37011132b92f9\": container with ID starting with e833dbe9b3ae7596bf3e91ad7f9d57b44d6d40bced33f3c4bcc37011132b92f9 not found: ID does not exist" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.529599 4886 scope.go:117] "RemoveContainer" containerID="a79d7e08fb9701975b941f1adf5799da7d4f33f7174b85754097a714859e03c1" Mar 14 08:50:45 crc kubenswrapper[4886]: E0314 08:50:45.529894 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a79d7e08fb9701975b941f1adf5799da7d4f33f7174b85754097a714859e03c1\": container with ID starting with a79d7e08fb9701975b941f1adf5799da7d4f33f7174b85754097a714859e03c1 not found: ID does not exist" containerID="a79d7e08fb9701975b941f1adf5799da7d4f33f7174b85754097a714859e03c1" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.529972 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a79d7e08fb9701975b941f1adf5799da7d4f33f7174b85754097a714859e03c1"} err="failed to get container status \"a79d7e08fb9701975b941f1adf5799da7d4f33f7174b85754097a714859e03c1\": rpc error: code = NotFound desc = could not find container \"a79d7e08fb9701975b941f1adf5799da7d4f33f7174b85754097a714859e03c1\": container with ID starting with a79d7e08fb9701975b941f1adf5799da7d4f33f7174b85754097a714859e03c1 not found: ID does not exist" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.591829 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 08:50:45 crc 
kubenswrapper[4886]: E0314 08:50:45.592430 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a3ebe21-0432-4b40-8e80-5369de346831" containerName="dnsmasq-dns" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.592448 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a3ebe21-0432-4b40-8e80-5369de346831" containerName="dnsmasq-dns" Mar 14 08:50:45 crc kubenswrapper[4886]: E0314 08:50:45.592471 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a745438-cb17-4626-96ed-51c7de75a976" containerName="probe" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.592477 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a745438-cb17-4626-96ed-51c7de75a976" containerName="probe" Mar 14 08:50:45 crc kubenswrapper[4886]: E0314 08:50:45.592511 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a3ebe21-0432-4b40-8e80-5369de346831" containerName="init" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.592518 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a3ebe21-0432-4b40-8e80-5369de346831" containerName="init" Mar 14 08:50:45 crc kubenswrapper[4886]: E0314 08:50:45.592527 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a745438-cb17-4626-96ed-51c7de75a976" containerName="cinder-scheduler" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.592533 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a745438-cb17-4626-96ed-51c7de75a976" containerName="cinder-scheduler" Mar 14 08:50:45 crc kubenswrapper[4886]: E0314 08:50:45.592549 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00edb071-fbb5-4e88-8370-a2c76ad13a6c" containerName="barbican-api" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.592555 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="00edb071-fbb5-4e88-8370-a2c76ad13a6c" containerName="barbican-api" Mar 14 08:50:45 crc kubenswrapper[4886]: E0314 08:50:45.592590 4886 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="00edb071-fbb5-4e88-8370-a2c76ad13a6c" containerName="barbican-api-log" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.592598 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="00edb071-fbb5-4e88-8370-a2c76ad13a6c" containerName="barbican-api-log" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.592771 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="00edb071-fbb5-4e88-8370-a2c76ad13a6c" containerName="barbican-api-log" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.592788 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a745438-cb17-4626-96ed-51c7de75a976" containerName="cinder-scheduler" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.592797 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a745438-cb17-4626-96ed-51c7de75a976" containerName="probe" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.592803 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="00edb071-fbb5-4e88-8370-a2c76ad13a6c" containerName="barbican-api" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.592813 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a3ebe21-0432-4b40-8e80-5369de346831" containerName="dnsmasq-dns" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.593845 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.596013 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.613686 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.728254 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1d37af9-6a33-4358-9c0c-258cb011d3e4-config-data\") pod \"cinder-scheduler-0\" (UID: \"b1d37af9-6a33-4358-9c0c-258cb011d3e4\") " pod="openstack/cinder-scheduler-0" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.728346 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1d37af9-6a33-4358-9c0c-258cb011d3e4-scripts\") pod \"cinder-scheduler-0\" (UID: \"b1d37af9-6a33-4358-9c0c-258cb011d3e4\") " pod="openstack/cinder-scheduler-0" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.728366 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1d37af9-6a33-4358-9c0c-258cb011d3e4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b1d37af9-6a33-4358-9c0c-258cb011d3e4\") " pod="openstack/cinder-scheduler-0" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.728412 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1d37af9-6a33-4358-9c0c-258cb011d3e4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b1d37af9-6a33-4358-9c0c-258cb011d3e4\") " pod="openstack/cinder-scheduler-0" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.728440 4886 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1d37af9-6a33-4358-9c0c-258cb011d3e4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b1d37af9-6a33-4358-9c0c-258cb011d3e4\") " pod="openstack/cinder-scheduler-0" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.728469 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2rkn\" (UniqueName: \"kubernetes.io/projected/b1d37af9-6a33-4358-9c0c-258cb011d3e4-kube-api-access-v2rkn\") pod \"cinder-scheduler-0\" (UID: \"b1d37af9-6a33-4358-9c0c-258cb011d3e4\") " pod="openstack/cinder-scheduler-0" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.830211 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1d37af9-6a33-4358-9c0c-258cb011d3e4-config-data\") pod \"cinder-scheduler-0\" (UID: \"b1d37af9-6a33-4358-9c0c-258cb011d3e4\") " pod="openstack/cinder-scheduler-0" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.830275 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1d37af9-6a33-4358-9c0c-258cb011d3e4-scripts\") pod \"cinder-scheduler-0\" (UID: \"b1d37af9-6a33-4358-9c0c-258cb011d3e4\") " pod="openstack/cinder-scheduler-0" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.830294 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1d37af9-6a33-4358-9c0c-258cb011d3e4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b1d37af9-6a33-4358-9c0c-258cb011d3e4\") " pod="openstack/cinder-scheduler-0" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.830318 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b1d37af9-6a33-4358-9c0c-258cb011d3e4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b1d37af9-6a33-4358-9c0c-258cb011d3e4\") " pod="openstack/cinder-scheduler-0" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.830342 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1d37af9-6a33-4358-9c0c-258cb011d3e4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b1d37af9-6a33-4358-9c0c-258cb011d3e4\") " pod="openstack/cinder-scheduler-0" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.830359 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2rkn\" (UniqueName: \"kubernetes.io/projected/b1d37af9-6a33-4358-9c0c-258cb011d3e4-kube-api-access-v2rkn\") pod \"cinder-scheduler-0\" (UID: \"b1d37af9-6a33-4358-9c0c-258cb011d3e4\") " pod="openstack/cinder-scheduler-0" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.830549 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1d37af9-6a33-4358-9c0c-258cb011d3e4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b1d37af9-6a33-4358-9c0c-258cb011d3e4\") " pod="openstack/cinder-scheduler-0" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.833154 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.835934 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1d37af9-6a33-4358-9c0c-258cb011d3e4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b1d37af9-6a33-4358-9c0c-258cb011d3e4\") " pod="openstack/cinder-scheduler-0" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.835999 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/b1d37af9-6a33-4358-9c0c-258cb011d3e4-scripts\") pod \"cinder-scheduler-0\" (UID: \"b1d37af9-6a33-4358-9c0c-258cb011d3e4\") " pod="openstack/cinder-scheduler-0" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.836935 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1d37af9-6a33-4358-9c0c-258cb011d3e4-config-data\") pod \"cinder-scheduler-0\" (UID: \"b1d37af9-6a33-4358-9c0c-258cb011d3e4\") " pod="openstack/cinder-scheduler-0" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.845608 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1d37af9-6a33-4358-9c0c-258cb011d3e4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b1d37af9-6a33-4358-9c0c-258cb011d3e4\") " pod="openstack/cinder-scheduler-0" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.848426 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2rkn\" (UniqueName: \"kubernetes.io/projected/b1d37af9-6a33-4358-9c0c-258cb011d3e4-kube-api-access-v2rkn\") pod \"cinder-scheduler-0\" (UID: \"b1d37af9-6a33-4358-9c0c-258cb011d3e4\") " pod="openstack/cinder-scheduler-0" Mar 14 08:50:45 crc kubenswrapper[4886]: I0314 08:50:45.914264 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 14 08:50:46 crc kubenswrapper[4886]: I0314 08:50:46.338773 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-57dfd898bd-kzdvs" Mar 14 08:50:46 crc kubenswrapper[4886]: I0314 08:50:46.351520 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-869f94f9b-5hmcl" Mar 14 08:50:46 crc kubenswrapper[4886]: I0314 08:50:46.432182 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 14 08:50:46 crc kubenswrapper[4886]: I0314 08:50:46.452390 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-869f94f9b-5hmcl"] Mar 14 08:50:46 crc kubenswrapper[4886]: I0314 08:50:46.482567 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b1d37af9-6a33-4358-9c0c-258cb011d3e4","Type":"ContainerStarted","Data":"bf222c8cc2561fb5492d307241f0d98137b17b1507602e0dae45b3e54ef52278"} Mar 14 08:50:46 crc kubenswrapper[4886]: I0314 08:50:46.482748 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-869f94f9b-5hmcl" podUID="78d6b750-e5ce-4784-a7ee-3930cb52b4c1" containerName="placement-log" containerID="cri-o://8f4af92f07e0b697cc23024f635bdf83aa24799e68f75e26a2e5351c5f6bd554" gracePeriod=30 Mar 14 08:50:46 crc kubenswrapper[4886]: I0314 08:50:46.482873 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-869f94f9b-5hmcl" podUID="78d6b750-e5ce-4784-a7ee-3930cb52b4c1" containerName="placement-api" containerID="cri-o://b03b475543435b63f8d37a03f85618180c40d36cb07385b6de5660bbd94b70e2" gracePeriod=30 Mar 14 08:50:46 crc kubenswrapper[4886]: I0314 08:50:46.498425 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-869f94f9b-5hmcl" podUID="78d6b750-e5ce-4784-a7ee-3930cb52b4c1" containerName="placement-log" 
probeResult="failure" output="Get \"https://10.217.0.184:8778/\": EOF" Mar 14 08:50:46 crc kubenswrapper[4886]: I0314 08:50:46.498852 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/placement-869f94f9b-5hmcl" podUID="78d6b750-e5ce-4784-a7ee-3930cb52b4c1" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.184:8778/\": EOF" Mar 14 08:50:47 crc kubenswrapper[4886]: I0314 08:50:47.440735 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a745438-cb17-4626-96ed-51c7de75a976" path="/var/lib/kubelet/pods/7a745438-cb17-4626-96ed-51c7de75a976/volumes" Mar 14 08:50:47 crc kubenswrapper[4886]: I0314 08:50:47.496917 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b1d37af9-6a33-4358-9c0c-258cb011d3e4","Type":"ContainerStarted","Data":"4db0d6a3ecc11544b175ba3c1438d7bf35031947b7fd5206ccc00aadcc3f1914"} Mar 14 08:50:47 crc kubenswrapper[4886]: I0314 08:50:47.499176 4886 generic.go:334] "Generic (PLEG): container finished" podID="78d6b750-e5ce-4784-a7ee-3930cb52b4c1" containerID="8f4af92f07e0b697cc23024f635bdf83aa24799e68f75e26a2e5351c5f6bd554" exitCode=143 Mar 14 08:50:47 crc kubenswrapper[4886]: I0314 08:50:47.499258 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-869f94f9b-5hmcl" event={"ID":"78d6b750-e5ce-4784-a7ee-3930cb52b4c1","Type":"ContainerDied","Data":"8f4af92f07e0b697cc23024f635bdf83aa24799e68f75e26a2e5351c5f6bd554"} Mar 14 08:50:47 crc kubenswrapper[4886]: I0314 08:50:47.679827 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-66c6bc56b6-25jn4" podUID="3f8100ac-c606-4eb3-afd6-07be9de44f42" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.168:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:44380->10.217.0.168:8443: read: connection reset by peer" Mar 14 08:50:48 crc kubenswrapper[4886]: I0314 08:50:48.019834 4886 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6d5d4f7d47-v8h75" Mar 14 08:50:48 crc kubenswrapper[4886]: I0314 08:50:48.146269 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-66c6bc56b6-25jn4" podUID="3f8100ac-c606-4eb3-afd6-07be9de44f42" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.168:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.168:8443: connect: connection refused" Mar 14 08:50:48 crc kubenswrapper[4886]: I0314 08:50:48.527069 4886 generic.go:334] "Generic (PLEG): container finished" podID="3f8100ac-c606-4eb3-afd6-07be9de44f42" containerID="844e2a0189883ab734eb2b66c54a57447201e522d5837d8fbe5fc822ee14df57" exitCode=0 Mar 14 08:50:48 crc kubenswrapper[4886]: I0314 08:50:48.527563 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66c6bc56b6-25jn4" event={"ID":"3f8100ac-c606-4eb3-afd6-07be9de44f42","Type":"ContainerDied","Data":"844e2a0189883ab734eb2b66c54a57447201e522d5837d8fbe5fc822ee14df57"} Mar 14 08:50:48 crc kubenswrapper[4886]: I0314 08:50:48.536410 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b1d37af9-6a33-4358-9c0c-258cb011d3e4","Type":"ContainerStarted","Data":"74272fd42d5c9a2c9e0ad06b7be550f6ca37c612aa2330ddb49c5d90dae71217"} Mar 14 08:50:48 crc kubenswrapper[4886]: I0314 08:50:48.571779 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.571742655 podStartE2EDuration="3.571742655s" podCreationTimestamp="2026-03-14 08:50:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:50:48.561128984 +0000 UTC m=+1383.809580621" watchObservedRunningTime="2026-03-14 08:50:48.571742655 +0000 UTC m=+1383.820194292" Mar 14 08:50:49 crc kubenswrapper[4886]: I0314 08:50:49.532005 4886 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 14 08:50:49 crc kubenswrapper[4886]: I0314 08:50:49.533214 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 14 08:50:49 crc kubenswrapper[4886]: I0314 08:50:49.535723 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-28wgq" Mar 14 08:50:49 crc kubenswrapper[4886]: I0314 08:50:49.535956 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 14 08:50:49 crc kubenswrapper[4886]: I0314 08:50:49.536106 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 14 08:50:49 crc kubenswrapper[4886]: I0314 08:50:49.556390 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 14 08:50:49 crc kubenswrapper[4886]: I0314 08:50:49.613896 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/65b2ef9b-5c95-4936-8c8c-2abec26e2595-openstack-config\") pod \"openstackclient\" (UID: \"65b2ef9b-5c95-4936-8c8c-2abec26e2595\") " pod="openstack/openstackclient" Mar 14 08:50:49 crc kubenswrapper[4886]: I0314 08:50:49.613992 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b2ef9b-5c95-4936-8c8c-2abec26e2595-combined-ca-bundle\") pod \"openstackclient\" (UID: \"65b2ef9b-5c95-4936-8c8c-2abec26e2595\") " pod="openstack/openstackclient" Mar 14 08:50:49 crc kubenswrapper[4886]: I0314 08:50:49.614047 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p896\" (UniqueName: \"kubernetes.io/projected/65b2ef9b-5c95-4936-8c8c-2abec26e2595-kube-api-access-7p896\") pod \"openstackclient\" 
(UID: \"65b2ef9b-5c95-4936-8c8c-2abec26e2595\") " pod="openstack/openstackclient" Mar 14 08:50:49 crc kubenswrapper[4886]: I0314 08:50:49.614146 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/65b2ef9b-5c95-4936-8c8c-2abec26e2595-openstack-config-secret\") pod \"openstackclient\" (UID: \"65b2ef9b-5c95-4936-8c8c-2abec26e2595\") " pod="openstack/openstackclient" Mar 14 08:50:49 crc kubenswrapper[4886]: I0314 08:50:49.715841 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/65b2ef9b-5c95-4936-8c8c-2abec26e2595-openstack-config-secret\") pod \"openstackclient\" (UID: \"65b2ef9b-5c95-4936-8c8c-2abec26e2595\") " pod="openstack/openstackclient" Mar 14 08:50:49 crc kubenswrapper[4886]: I0314 08:50:49.716035 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/65b2ef9b-5c95-4936-8c8c-2abec26e2595-openstack-config\") pod \"openstackclient\" (UID: \"65b2ef9b-5c95-4936-8c8c-2abec26e2595\") " pod="openstack/openstackclient" Mar 14 08:50:49 crc kubenswrapper[4886]: I0314 08:50:49.716093 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b2ef9b-5c95-4936-8c8c-2abec26e2595-combined-ca-bundle\") pod \"openstackclient\" (UID: \"65b2ef9b-5c95-4936-8c8c-2abec26e2595\") " pod="openstack/openstackclient" Mar 14 08:50:49 crc kubenswrapper[4886]: I0314 08:50:49.716150 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p896\" (UniqueName: \"kubernetes.io/projected/65b2ef9b-5c95-4936-8c8c-2abec26e2595-kube-api-access-7p896\") pod \"openstackclient\" (UID: \"65b2ef9b-5c95-4936-8c8c-2abec26e2595\") " pod="openstack/openstackclient" Mar 14 08:50:49 crc 
kubenswrapper[4886]: I0314 08:50:49.718776 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/65b2ef9b-5c95-4936-8c8c-2abec26e2595-openstack-config\") pod \"openstackclient\" (UID: \"65b2ef9b-5c95-4936-8c8c-2abec26e2595\") " pod="openstack/openstackclient" Mar 14 08:50:49 crc kubenswrapper[4886]: I0314 08:50:49.722681 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/65b2ef9b-5c95-4936-8c8c-2abec26e2595-openstack-config-secret\") pod \"openstackclient\" (UID: \"65b2ef9b-5c95-4936-8c8c-2abec26e2595\") " pod="openstack/openstackclient" Mar 14 08:50:49 crc kubenswrapper[4886]: I0314 08:50:49.737664 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b2ef9b-5c95-4936-8c8c-2abec26e2595-combined-ca-bundle\") pod \"openstackclient\" (UID: \"65b2ef9b-5c95-4936-8c8c-2abec26e2595\") " pod="openstack/openstackclient" Mar 14 08:50:49 crc kubenswrapper[4886]: I0314 08:50:49.737876 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p896\" (UniqueName: \"kubernetes.io/projected/65b2ef9b-5c95-4936-8c8c-2abec26e2595-kube-api-access-7p896\") pod \"openstackclient\" (UID: \"65b2ef9b-5c95-4936-8c8c-2abec26e2595\") " pod="openstack/openstackclient" Mar 14 08:50:49 crc kubenswrapper[4886]: I0314 08:50:49.874373 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 14 08:50:50 crc kubenswrapper[4886]: I0314 08:50:50.364446 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 14 08:50:50 crc kubenswrapper[4886]: I0314 08:50:50.554813 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"65b2ef9b-5c95-4936-8c8c-2abec26e2595","Type":"ContainerStarted","Data":"7facf092db6e5923ff5a250f2a0184a8691209a20a95be7c0b9c057ebf6b2e30"} Mar 14 08:50:50 crc kubenswrapper[4886]: I0314 08:50:50.915529 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.303502 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-869f94f9b-5hmcl" Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.354832 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-internal-tls-certs\") pod \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\" (UID: \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\") " Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.354877 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-logs\") pod \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\" (UID: \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\") " Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.354912 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-config-data\") pod \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\" (UID: \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\") " Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.354937 4886 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-scripts\") pod \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\" (UID: \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\") "
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.354962 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-public-tls-certs\") pod \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\" (UID: \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\") "
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.354998 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-combined-ca-bundle\") pod \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\" (UID: \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\") "
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.355048 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pf46\" (UniqueName: \"kubernetes.io/projected/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-kube-api-access-4pf46\") pod \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\" (UID: \"78d6b750-e5ce-4784-a7ee-3930cb52b4c1\") "
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.357439 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-logs" (OuterVolumeSpecName: "logs") pod "78d6b750-e5ce-4784-a7ee-3930cb52b4c1" (UID: "78d6b750-e5ce-4784-a7ee-3930cb52b4c1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.364335 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-scripts" (OuterVolumeSpecName: "scripts") pod "78d6b750-e5ce-4784-a7ee-3930cb52b4c1" (UID: "78d6b750-e5ce-4784-a7ee-3930cb52b4c1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.380345 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-kube-api-access-4pf46" (OuterVolumeSpecName: "kube-api-access-4pf46") pod "78d6b750-e5ce-4784-a7ee-3930cb52b4c1" (UID: "78d6b750-e5ce-4784-a7ee-3930cb52b4c1"). InnerVolumeSpecName "kube-api-access-4pf46". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.413160 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-config-data" (OuterVolumeSpecName: "config-data") pod "78d6b750-e5ce-4784-a7ee-3930cb52b4c1" (UID: "78d6b750-e5ce-4784-a7ee-3930cb52b4c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.418462 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78d6b750-e5ce-4784-a7ee-3930cb52b4c1" (UID: "78d6b750-e5ce-4784-a7ee-3930cb52b4c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.457983 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-logs\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.458009 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.458019 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.458028 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.458038 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pf46\" (UniqueName: \"kubernetes.io/projected/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-kube-api-access-4pf46\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.472294 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "78d6b750-e5ce-4784-a7ee-3930cb52b4c1" (UID: "78d6b750-e5ce-4784-a7ee-3930cb52b4c1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.516888 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "78d6b750-e5ce-4784-a7ee-3930cb52b4c1" (UID: "78d6b750-e5ce-4784-a7ee-3930cb52b4c1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.560427 4886 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.560457 4886 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78d6b750-e5ce-4784-a7ee-3930cb52b4c1-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.567890 4886 generic.go:334] "Generic (PLEG): container finished" podID="78d6b750-e5ce-4784-a7ee-3930cb52b4c1" containerID="b03b475543435b63f8d37a03f85618180c40d36cb07385b6de5660bbd94b70e2" exitCode=0
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.567934 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-869f94f9b-5hmcl" event={"ID":"78d6b750-e5ce-4784-a7ee-3930cb52b4c1","Type":"ContainerDied","Data":"b03b475543435b63f8d37a03f85618180c40d36cb07385b6de5660bbd94b70e2"}
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.567947 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-869f94f9b-5hmcl"
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.567976 4886 scope.go:117] "RemoveContainer" containerID="b03b475543435b63f8d37a03f85618180c40d36cb07385b6de5660bbd94b70e2"
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.567965 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-869f94f9b-5hmcl" event={"ID":"78d6b750-e5ce-4784-a7ee-3930cb52b4c1","Type":"ContainerDied","Data":"0a6f9ad3b5f99b8c68d28bad86eead224469b4e9a36507cb66ea9decb77ecefb"}
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.612344 4886 scope.go:117] "RemoveContainer" containerID="8f4af92f07e0b697cc23024f635bdf83aa24799e68f75e26a2e5351c5f6bd554"
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.620075 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-869f94f9b-5hmcl"]
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.632730 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-869f94f9b-5hmcl"]
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.639399 4886 scope.go:117] "RemoveContainer" containerID="b03b475543435b63f8d37a03f85618180c40d36cb07385b6de5660bbd94b70e2"
Mar 14 08:50:51 crc kubenswrapper[4886]: E0314 08:50:51.639890 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b03b475543435b63f8d37a03f85618180c40d36cb07385b6de5660bbd94b70e2\": container with ID starting with b03b475543435b63f8d37a03f85618180c40d36cb07385b6de5660bbd94b70e2 not found: ID does not exist" containerID="b03b475543435b63f8d37a03f85618180c40d36cb07385b6de5660bbd94b70e2"
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.639928 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b03b475543435b63f8d37a03f85618180c40d36cb07385b6de5660bbd94b70e2"} err="failed to get container status \"b03b475543435b63f8d37a03f85618180c40d36cb07385b6de5660bbd94b70e2\": rpc error: code = NotFound desc = could not find container \"b03b475543435b63f8d37a03f85618180c40d36cb07385b6de5660bbd94b70e2\": container with ID starting with b03b475543435b63f8d37a03f85618180c40d36cb07385b6de5660bbd94b70e2 not found: ID does not exist"
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.639954 4886 scope.go:117] "RemoveContainer" containerID="8f4af92f07e0b697cc23024f635bdf83aa24799e68f75e26a2e5351c5f6bd554"
Mar 14 08:50:51 crc kubenswrapper[4886]: E0314 08:50:51.640376 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f4af92f07e0b697cc23024f635bdf83aa24799e68f75e26a2e5351c5f6bd554\": container with ID starting with 8f4af92f07e0b697cc23024f635bdf83aa24799e68f75e26a2e5351c5f6bd554 not found: ID does not exist" containerID="8f4af92f07e0b697cc23024f635bdf83aa24799e68f75e26a2e5351c5f6bd554"
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.640421 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f4af92f07e0b697cc23024f635bdf83aa24799e68f75e26a2e5351c5f6bd554"} err="failed to get container status \"8f4af92f07e0b697cc23024f635bdf83aa24799e68f75e26a2e5351c5f6bd554\": rpc error: code = NotFound desc = could not find container \"8f4af92f07e0b697cc23024f635bdf83aa24799e68f75e26a2e5351c5f6bd554\": container with ID starting with 8f4af92f07e0b697cc23024f635bdf83aa24799e68f75e26a2e5351c5f6bd554 not found: ID does not exist"
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.766571 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Mar 14 08:50:51 crc kubenswrapper[4886]: I0314 08:50:51.975939 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Mar 14 08:50:52 crc kubenswrapper[4886]: I0314 08:50:52.004086 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0"
Mar 14 08:50:52 crc kubenswrapper[4886]: I0314 08:50:52.578453 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0"
Mar 14 08:50:52 crc kubenswrapper[4886]: I0314 08:50:52.623281 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0"
Mar 14 08:50:53 crc kubenswrapper[4886]: I0314 08:50:53.436696 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78d6b750-e5ce-4784-a7ee-3930cb52b4c1" path="/var/lib/kubelet/pods/78d6b750-e5ce-4784-a7ee-3930cb52b4c1/volumes"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.174940 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-768658d555-ttc2f"]
Mar 14 08:50:54 crc kubenswrapper[4886]: E0314 08:50:54.175671 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78d6b750-e5ce-4784-a7ee-3930cb52b4c1" containerName="placement-api"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.175689 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="78d6b750-e5ce-4784-a7ee-3930cb52b4c1" containerName="placement-api"
Mar 14 08:50:54 crc kubenswrapper[4886]: E0314 08:50:54.175712 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78d6b750-e5ce-4784-a7ee-3930cb52b4c1" containerName="placement-log"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.175720 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="78d6b750-e5ce-4784-a7ee-3930cb52b4c1" containerName="placement-log"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.175888 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="78d6b750-e5ce-4784-a7ee-3930cb52b4c1" containerName="placement-api"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.175916 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="78d6b750-e5ce-4784-a7ee-3930cb52b4c1" containerName="placement-log"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.176933 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-768658d555-ttc2f"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.178933 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.179831 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.180447 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.185808 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-768658d555-ttc2f"]
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.212701 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69b57a2a-532a-4728-89d6-090f17edc7a7-config-data\") pod \"swift-proxy-768658d555-ttc2f\" (UID: \"69b57a2a-532a-4728-89d6-090f17edc7a7\") " pod="openstack/swift-proxy-768658d555-ttc2f"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.212766 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/69b57a2a-532a-4728-89d6-090f17edc7a7-etc-swift\") pod \"swift-proxy-768658d555-ttc2f\" (UID: \"69b57a2a-532a-4728-89d6-090f17edc7a7\") " pod="openstack/swift-proxy-768658d555-ttc2f"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.213063 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tswx\" (UniqueName: \"kubernetes.io/projected/69b57a2a-532a-4728-89d6-090f17edc7a7-kube-api-access-7tswx\") pod \"swift-proxy-768658d555-ttc2f\" (UID: \"69b57a2a-532a-4728-89d6-090f17edc7a7\") " pod="openstack/swift-proxy-768658d555-ttc2f"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.213237 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69b57a2a-532a-4728-89d6-090f17edc7a7-log-httpd\") pod \"swift-proxy-768658d555-ttc2f\" (UID: \"69b57a2a-532a-4728-89d6-090f17edc7a7\") " pod="openstack/swift-proxy-768658d555-ttc2f"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.213360 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/69b57a2a-532a-4728-89d6-090f17edc7a7-internal-tls-certs\") pod \"swift-proxy-768658d555-ttc2f\" (UID: \"69b57a2a-532a-4728-89d6-090f17edc7a7\") " pod="openstack/swift-proxy-768658d555-ttc2f"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.213554 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69b57a2a-532a-4728-89d6-090f17edc7a7-combined-ca-bundle\") pod \"swift-proxy-768658d555-ttc2f\" (UID: \"69b57a2a-532a-4728-89d6-090f17edc7a7\") " pod="openstack/swift-proxy-768658d555-ttc2f"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.213756 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69b57a2a-532a-4728-89d6-090f17edc7a7-run-httpd\") pod \"swift-proxy-768658d555-ttc2f\" (UID: \"69b57a2a-532a-4728-89d6-090f17edc7a7\") " pod="openstack/swift-proxy-768658d555-ttc2f"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.213831 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/69b57a2a-532a-4728-89d6-090f17edc7a7-public-tls-certs\") pod \"swift-proxy-768658d555-ttc2f\" (UID: \"69b57a2a-532a-4728-89d6-090f17edc7a7\") " pod="openstack/swift-proxy-768658d555-ttc2f"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.315422 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69b57a2a-532a-4728-89d6-090f17edc7a7-run-httpd\") pod \"swift-proxy-768658d555-ttc2f\" (UID: \"69b57a2a-532a-4728-89d6-090f17edc7a7\") " pod="openstack/swift-proxy-768658d555-ttc2f"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.315490 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/69b57a2a-532a-4728-89d6-090f17edc7a7-public-tls-certs\") pod \"swift-proxy-768658d555-ttc2f\" (UID: \"69b57a2a-532a-4728-89d6-090f17edc7a7\") " pod="openstack/swift-proxy-768658d555-ttc2f"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.315522 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69b57a2a-532a-4728-89d6-090f17edc7a7-config-data\") pod \"swift-proxy-768658d555-ttc2f\" (UID: \"69b57a2a-532a-4728-89d6-090f17edc7a7\") " pod="openstack/swift-proxy-768658d555-ttc2f"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.315560 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/69b57a2a-532a-4728-89d6-090f17edc7a7-etc-swift\") pod \"swift-proxy-768658d555-ttc2f\" (UID: \"69b57a2a-532a-4728-89d6-090f17edc7a7\") " pod="openstack/swift-proxy-768658d555-ttc2f"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.315654 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tswx\" (UniqueName: \"kubernetes.io/projected/69b57a2a-532a-4728-89d6-090f17edc7a7-kube-api-access-7tswx\") pod \"swift-proxy-768658d555-ttc2f\" (UID: \"69b57a2a-532a-4728-89d6-090f17edc7a7\") " pod="openstack/swift-proxy-768658d555-ttc2f"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.315713 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69b57a2a-532a-4728-89d6-090f17edc7a7-log-httpd\") pod \"swift-proxy-768658d555-ttc2f\" (UID: \"69b57a2a-532a-4728-89d6-090f17edc7a7\") " pod="openstack/swift-proxy-768658d555-ttc2f"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.315770 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/69b57a2a-532a-4728-89d6-090f17edc7a7-internal-tls-certs\") pod \"swift-proxy-768658d555-ttc2f\" (UID: \"69b57a2a-532a-4728-89d6-090f17edc7a7\") " pod="openstack/swift-proxy-768658d555-ttc2f"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.315822 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69b57a2a-532a-4728-89d6-090f17edc7a7-combined-ca-bundle\") pod \"swift-proxy-768658d555-ttc2f\" (UID: \"69b57a2a-532a-4728-89d6-090f17edc7a7\") " pod="openstack/swift-proxy-768658d555-ttc2f"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.317667 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69b57a2a-532a-4728-89d6-090f17edc7a7-log-httpd\") pod \"swift-proxy-768658d555-ttc2f\" (UID: \"69b57a2a-532a-4728-89d6-090f17edc7a7\") " pod="openstack/swift-proxy-768658d555-ttc2f"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.318097 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69b57a2a-532a-4728-89d6-090f17edc7a7-run-httpd\") pod \"swift-proxy-768658d555-ttc2f\" (UID: \"69b57a2a-532a-4728-89d6-090f17edc7a7\") " pod="openstack/swift-proxy-768658d555-ttc2f"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.322212 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69b57a2a-532a-4728-89d6-090f17edc7a7-combined-ca-bundle\") pod \"swift-proxy-768658d555-ttc2f\" (UID: \"69b57a2a-532a-4728-89d6-090f17edc7a7\") " pod="openstack/swift-proxy-768658d555-ttc2f"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.323235 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/69b57a2a-532a-4728-89d6-090f17edc7a7-etc-swift\") pod \"swift-proxy-768658d555-ttc2f\" (UID: \"69b57a2a-532a-4728-89d6-090f17edc7a7\") " pod="openstack/swift-proxy-768658d555-ttc2f"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.323343 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69b57a2a-532a-4728-89d6-090f17edc7a7-config-data\") pod \"swift-proxy-768658d555-ttc2f\" (UID: \"69b57a2a-532a-4728-89d6-090f17edc7a7\") " pod="openstack/swift-proxy-768658d555-ttc2f"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.324317 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/69b57a2a-532a-4728-89d6-090f17edc7a7-internal-tls-certs\") pod \"swift-proxy-768658d555-ttc2f\" (UID: \"69b57a2a-532a-4728-89d6-090f17edc7a7\") " pod="openstack/swift-proxy-768658d555-ttc2f"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.331992 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/69b57a2a-532a-4728-89d6-090f17edc7a7-public-tls-certs\") pod \"swift-proxy-768658d555-ttc2f\" (UID: \"69b57a2a-532a-4728-89d6-090f17edc7a7\") " pod="openstack/swift-proxy-768658d555-ttc2f"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.334511 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tswx\" (UniqueName: \"kubernetes.io/projected/69b57a2a-532a-4728-89d6-090f17edc7a7-kube-api-access-7tswx\") pod \"swift-proxy-768658d555-ttc2f\" (UID: \"69b57a2a-532a-4728-89d6-090f17edc7a7\") " pod="openstack/swift-proxy-768658d555-ttc2f"
Mar 14 08:50:54 crc kubenswrapper[4886]: I0314 08:50:54.504767 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-768658d555-ttc2f"
Mar 14 08:50:56 crc kubenswrapper[4886]: I0314 08:50:56.229385 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 14 08:50:56 crc kubenswrapper[4886]: I0314 08:50:56.818003 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 14 08:50:56 crc kubenswrapper[4886]: I0314 08:50:56.818671 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e151f3d-c603-4f97-92ba-a079b9a5ad49" containerName="ceilometer-central-agent" containerID="cri-o://dd5387f87bdc471975f7312c9c25f8508cbf38d02865eb3750cdce2f2673637d" gracePeriod=30
Mar 14 08:50:56 crc kubenswrapper[4886]: I0314 08:50:56.818769 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e151f3d-c603-4f97-92ba-a079b9a5ad49" containerName="proxy-httpd" containerID="cri-o://d1b23c5bf8ac377569ba717f02febd51f3743e72ecf41af7708420755eb691cd" gracePeriod=30
Mar 14 08:50:56 crc kubenswrapper[4886]: I0314 08:50:56.818753 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e151f3d-c603-4f97-92ba-a079b9a5ad49" containerName="sg-core" containerID="cri-o://0cca71b50bf4f2e2ecb190fb38e893cf978a12c89e0e3f682d31807923123447" gracePeriod=30
Mar 14 08:50:56 crc kubenswrapper[4886]: I0314 08:50:56.818770 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e151f3d-c603-4f97-92ba-a079b9a5ad49" containerName="ceilometer-notification-agent" containerID="cri-o://c16075b44bb57be8e34f305e8246a2bec54c6a3ff58abe943614ba44ba38929c" gracePeriod=30
Mar 14 08:50:56 crc kubenswrapper[4886]: I0314 08:50:56.829037 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="3e151f3d-c603-4f97-92ba-a079b9a5ad49" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.195:3000/\": EOF"
Mar 14 08:50:57 crc kubenswrapper[4886]: I0314 08:50:57.626602 4886 generic.go:334] "Generic (PLEG): container finished" podID="3e151f3d-c603-4f97-92ba-a079b9a5ad49" containerID="d1b23c5bf8ac377569ba717f02febd51f3743e72ecf41af7708420755eb691cd" exitCode=0
Mar 14 08:50:57 crc kubenswrapper[4886]: I0314 08:50:57.626632 4886 generic.go:334] "Generic (PLEG): container finished" podID="3e151f3d-c603-4f97-92ba-a079b9a5ad49" containerID="0cca71b50bf4f2e2ecb190fb38e893cf978a12c89e0e3f682d31807923123447" exitCode=2
Mar 14 08:50:57 crc kubenswrapper[4886]: I0314 08:50:57.626641 4886 generic.go:334] "Generic (PLEG): container finished" podID="3e151f3d-c603-4f97-92ba-a079b9a5ad49" containerID="c16075b44bb57be8e34f305e8246a2bec54c6a3ff58abe943614ba44ba38929c" exitCode=0
Mar 14 08:50:57 crc kubenswrapper[4886]: I0314 08:50:57.626648 4886 generic.go:334] "Generic (PLEG): container finished" podID="3e151f3d-c603-4f97-92ba-a079b9a5ad49" containerID="dd5387f87bdc471975f7312c9c25f8508cbf38d02865eb3750cdce2f2673637d" exitCode=0
Mar 14 08:50:57 crc kubenswrapper[4886]: I0314 08:50:57.626667 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e151f3d-c603-4f97-92ba-a079b9a5ad49","Type":"ContainerDied","Data":"d1b23c5bf8ac377569ba717f02febd51f3743e72ecf41af7708420755eb691cd"}
Mar 14 08:50:57 crc kubenswrapper[4886]: I0314 08:50:57.626692 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e151f3d-c603-4f97-92ba-a079b9a5ad49","Type":"ContainerDied","Data":"0cca71b50bf4f2e2ecb190fb38e893cf978a12c89e0e3f682d31807923123447"}
Mar 14 08:50:57 crc kubenswrapper[4886]: I0314 08:50:57.626701 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e151f3d-c603-4f97-92ba-a079b9a5ad49","Type":"ContainerDied","Data":"c16075b44bb57be8e34f305e8246a2bec54c6a3ff58abe943614ba44ba38929c"}
Mar 14 08:50:57 crc kubenswrapper[4886]: I0314 08:50:57.626712 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e151f3d-c603-4f97-92ba-a079b9a5ad49","Type":"ContainerDied","Data":"dd5387f87bdc471975f7312c9c25f8508cbf38d02865eb3750cdce2f2673637d"}
Mar 14 08:50:58 crc kubenswrapper[4886]: I0314 08:50:58.147243 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-66c6bc56b6-25jn4" podUID="3f8100ac-c606-4eb3-afd6-07be9de44f42" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.168:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.168:8443: connect: connection refused"
Mar 14 08:50:59 crc kubenswrapper[4886]: I0314 08:50:59.973461 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.045867 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e151f3d-c603-4f97-92ba-a079b9a5ad49-scripts\") pod \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\" (UID: \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\") "
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.045950 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e151f3d-c603-4f97-92ba-a079b9a5ad49-sg-core-conf-yaml\") pod \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\" (UID: \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\") "
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.045989 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e151f3d-c603-4f97-92ba-a079b9a5ad49-config-data\") pod \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\" (UID: \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\") "
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.046099 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxtdt\" (UniqueName: \"kubernetes.io/projected/3e151f3d-c603-4f97-92ba-a079b9a5ad49-kube-api-access-kxtdt\") pod \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\" (UID: \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\") "
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.047271 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e151f3d-c603-4f97-92ba-a079b9a5ad49-run-httpd\") pod \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\" (UID: \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\") "
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.047370 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e151f3d-c603-4f97-92ba-a079b9a5ad49-combined-ca-bundle\") pod \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\" (UID: \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\") "
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.047406 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e151f3d-c603-4f97-92ba-a079b9a5ad49-log-httpd\") pod \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\" (UID: \"3e151f3d-c603-4f97-92ba-a079b9a5ad49\") "
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.047568 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e151f3d-c603-4f97-92ba-a079b9a5ad49-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3e151f3d-c603-4f97-92ba-a079b9a5ad49" (UID: "3e151f3d-c603-4f97-92ba-a079b9a5ad49"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.048081 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e151f3d-c603-4f97-92ba-a079b9a5ad49-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3e151f3d-c603-4f97-92ba-a079b9a5ad49" (UID: "3e151f3d-c603-4f97-92ba-a079b9a5ad49"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.048283 4886 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e151f3d-c603-4f97-92ba-a079b9a5ad49-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.048298 4886 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e151f3d-c603-4f97-92ba-a079b9a5ad49-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.050574 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e151f3d-c603-4f97-92ba-a079b9a5ad49-scripts" (OuterVolumeSpecName: "scripts") pod "3e151f3d-c603-4f97-92ba-a079b9a5ad49" (UID: "3e151f3d-c603-4f97-92ba-a079b9a5ad49"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.050674 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e151f3d-c603-4f97-92ba-a079b9a5ad49-kube-api-access-kxtdt" (OuterVolumeSpecName: "kube-api-access-kxtdt") pod "3e151f3d-c603-4f97-92ba-a079b9a5ad49" (UID: "3e151f3d-c603-4f97-92ba-a079b9a5ad49"). InnerVolumeSpecName "kube-api-access-kxtdt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.075430 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e151f3d-c603-4f97-92ba-a079b9a5ad49-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3e151f3d-c603-4f97-92ba-a079b9a5ad49" (UID: "3e151f3d-c603-4f97-92ba-a079b9a5ad49"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.124502 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e151f3d-c603-4f97-92ba-a079b9a5ad49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e151f3d-c603-4f97-92ba-a079b9a5ad49" (UID: "3e151f3d-c603-4f97-92ba-a079b9a5ad49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.148360 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e151f3d-c603-4f97-92ba-a079b9a5ad49-config-data" (OuterVolumeSpecName: "config-data") pod "3e151f3d-c603-4f97-92ba-a079b9a5ad49" (UID: "3e151f3d-c603-4f97-92ba-a079b9a5ad49"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.149749 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e151f3d-c603-4f97-92ba-a079b9a5ad49-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.149776 4886 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e151f3d-c603-4f97-92ba-a079b9a5ad49-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.149790 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e151f3d-c603-4f97-92ba-a079b9a5ad49-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.149804 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxtdt\" (UniqueName: \"kubernetes.io/projected/3e151f3d-c603-4f97-92ba-a079b9a5ad49-kube-api-access-kxtdt\") on node \"crc\" DevicePath \"\""
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.149815 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e151f3d-c603-4f97-92ba-a079b9a5ad49-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.245149 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-768658d555-ttc2f"]
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.664721 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-768658d555-ttc2f" event={"ID":"69b57a2a-532a-4728-89d6-090f17edc7a7","Type":"ContainerStarted","Data":"dea457a9bf72d1d531ab5df47e13b17dfd45be5f3fdd6d7c91af14aeaa5a490b"}
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.665084 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-768658d555-ttc2f" event={"ID":"69b57a2a-532a-4728-89d6-090f17edc7a7","Type":"ContainerStarted","Data":"878f39c36f9606abf0fefb6168c2b483d0330bca9b25709fc8a325f778f18dd4"}
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.665104 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-768658d555-ttc2f" event={"ID":"69b57a2a-532a-4728-89d6-090f17edc7a7","Type":"ContainerStarted","Data":"9c3975cbb3240abde3145f9cc4cb1186b7a9932549f536cd66c5080bb5d8b481"}
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.665533 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-768658d555-ttc2f"
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.665561 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-768658d555-ttc2f"
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.670115 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.670112 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e151f3d-c603-4f97-92ba-a079b9a5ad49","Type":"ContainerDied","Data":"45bd4230080c0567cadfbe0d0d8a34fc639ae001052947c532822a38c13c70bc"}
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.670254 4886 scope.go:117] "RemoveContainer" containerID="d1b23c5bf8ac377569ba717f02febd51f3743e72ecf41af7708420755eb691cd"
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.679698 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"65b2ef9b-5c95-4936-8c8c-2abec26e2595","Type":"ContainerStarted","Data":"9461d8015c765853c8d2e69a6e00f374edc70e2e92894d7e737cdbff23e37db6"}
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.696763 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-768658d555-ttc2f" podStartSLOduration=6.696739407 podStartE2EDuration="6.696739407s" podCreationTimestamp="2026-03-14 08:50:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:51:00.687058952 +0000 UTC m=+1395.935510589" watchObservedRunningTime="2026-03-14 08:51:00.696739407 +0000 UTC m=+1395.945191044"
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.701032 4886 scope.go:117] "RemoveContainer" containerID="0cca71b50bf4f2e2ecb190fb38e893cf978a12c89e0e3f682d31807923123447"
Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.705302 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.414954465 podStartE2EDuration="11.705271039s" podCreationTimestamp="2026-03-14 08:50:49 +0000 UTC" firstStartedPulling="2026-03-14 08:50:50.365323856 +0000 UTC m=+1385.613775503" lastFinishedPulling="2026-03-14 08:50:59.65564044
+0000 UTC m=+1394.904092077" observedRunningTime="2026-03-14 08:51:00.703715625 +0000 UTC m=+1395.952167272" watchObservedRunningTime="2026-03-14 08:51:00.705271039 +0000 UTC m=+1395.953722676" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.732357 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.751485 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.754621 4886 scope.go:117] "RemoveContainer" containerID="c16075b44bb57be8e34f305e8246a2bec54c6a3ff58abe943614ba44ba38929c" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.764501 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:51:00 crc kubenswrapper[4886]: E0314 08:51:00.764968 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e151f3d-c603-4f97-92ba-a079b9a5ad49" containerName="ceilometer-notification-agent" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.764986 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e151f3d-c603-4f97-92ba-a079b9a5ad49" containerName="ceilometer-notification-agent" Mar 14 08:51:00 crc kubenswrapper[4886]: E0314 08:51:00.765019 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e151f3d-c603-4f97-92ba-a079b9a5ad49" containerName="proxy-httpd" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.765028 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e151f3d-c603-4f97-92ba-a079b9a5ad49" containerName="proxy-httpd" Mar 14 08:51:00 crc kubenswrapper[4886]: E0314 08:51:00.765040 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e151f3d-c603-4f97-92ba-a079b9a5ad49" containerName="sg-core" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.765049 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e151f3d-c603-4f97-92ba-a079b9a5ad49" 
containerName="sg-core" Mar 14 08:51:00 crc kubenswrapper[4886]: E0314 08:51:00.765060 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e151f3d-c603-4f97-92ba-a079b9a5ad49" containerName="ceilometer-central-agent" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.765067 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e151f3d-c603-4f97-92ba-a079b9a5ad49" containerName="ceilometer-central-agent" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.765407 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e151f3d-c603-4f97-92ba-a079b9a5ad49" containerName="sg-core" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.765438 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e151f3d-c603-4f97-92ba-a079b9a5ad49" containerName="ceilometer-notification-agent" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.765449 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e151f3d-c603-4f97-92ba-a079b9a5ad49" containerName="ceilometer-central-agent" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.765457 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e151f3d-c603-4f97-92ba-a079b9a5ad49" containerName="proxy-httpd" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.767193 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.771203 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.771640 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.777057 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.787190 4886 scope.go:117] "RemoveContainer" containerID="dd5387f87bdc471975f7312c9c25f8508cbf38d02865eb3750cdce2f2673637d" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.872587 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-run-httpd\") pod \"ceilometer-0\" (UID: \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\") " pod="openstack/ceilometer-0" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.872655 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\") " pod="openstack/ceilometer-0" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.872695 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\") " pod="openstack/ceilometer-0" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.872714 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-p69rh\" (UniqueName: \"kubernetes.io/projected/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-kube-api-access-p69rh\") pod \"ceilometer-0\" (UID: \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\") " pod="openstack/ceilometer-0" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.872851 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-config-data\") pod \"ceilometer-0\" (UID: \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\") " pod="openstack/ceilometer-0" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.872871 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-scripts\") pod \"ceilometer-0\" (UID: \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\") " pod="openstack/ceilometer-0" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.872904 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-log-httpd\") pod \"ceilometer-0\" (UID: \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\") " pod="openstack/ceilometer-0" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.974895 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-config-data\") pod \"ceilometer-0\" (UID: \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\") " pod="openstack/ceilometer-0" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.974940 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-scripts\") pod \"ceilometer-0\" (UID: \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\") " 
pod="openstack/ceilometer-0" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.974980 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-log-httpd\") pod \"ceilometer-0\" (UID: \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\") " pod="openstack/ceilometer-0" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.975017 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-run-httpd\") pod \"ceilometer-0\" (UID: \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\") " pod="openstack/ceilometer-0" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.975047 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\") " pod="openstack/ceilometer-0" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.975606 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-log-httpd\") pod \"ceilometer-0\" (UID: \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\") " pod="openstack/ceilometer-0" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.975698 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-run-httpd\") pod \"ceilometer-0\" (UID: \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\") " pod="openstack/ceilometer-0" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.975080 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\") " pod="openstack/ceilometer-0" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.975775 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p69rh\" (UniqueName: \"kubernetes.io/projected/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-kube-api-access-p69rh\") pod \"ceilometer-0\" (UID: \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\") " pod="openstack/ceilometer-0" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.982014 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\") " pod="openstack/ceilometer-0" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.983058 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-config-data\") pod \"ceilometer-0\" (UID: \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\") " pod="openstack/ceilometer-0" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.983093 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-scripts\") pod \"ceilometer-0\" (UID: \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\") " pod="openstack/ceilometer-0" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.988015 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\") " pod="openstack/ceilometer-0" Mar 14 08:51:00 crc kubenswrapper[4886]: I0314 08:51:00.995165 4886 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p69rh\" (UniqueName: \"kubernetes.io/projected/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-kube-api-access-p69rh\") pod \"ceilometer-0\" (UID: \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\") " pod="openstack/ceilometer-0" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.131518 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.387898 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-xwt5m"] Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.389775 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xwt5m" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.399905 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xwt5m"] Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.470852 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e151f3d-c603-4f97-92ba-a079b9a5ad49" path="/var/lib/kubelet/pods/3e151f3d-c603-4f97-92ba-a079b9a5ad49/volumes" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.477960 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-w8c9q"] Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.479316 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-w8c9q" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.483414 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-8303-account-create-update-w2kk2"] Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.484568 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-8303-account-create-update-w2kk2" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.487692 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.495179 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8303-account-create-update-w2kk2"] Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.499281 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm4wd\" (UniqueName: \"kubernetes.io/projected/c7dae5e3-c486-4f68-bc83-54cda54cd52b-kube-api-access-gm4wd\") pod \"nova-api-db-create-xwt5m\" (UID: \"c7dae5e3-c486-4f68-bc83-54cda54cd52b\") " pod="openstack/nova-api-db-create-xwt5m" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.500371 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7dae5e3-c486-4f68-bc83-54cda54cd52b-operator-scripts\") pod \"nova-api-db-create-xwt5m\" (UID: \"c7dae5e3-c486-4f68-bc83-54cda54cd52b\") " pod="openstack/nova-api-db-create-xwt5m" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.504156 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-w8c9q"] Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.515262 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5694fd5cb9-r8nzz" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.605615 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm4wd\" (UniqueName: \"kubernetes.io/projected/c7dae5e3-c486-4f68-bc83-54cda54cd52b-kube-api-access-gm4wd\") pod \"nova-api-db-create-xwt5m\" (UID: \"c7dae5e3-c486-4f68-bc83-54cda54cd52b\") " pod="openstack/nova-api-db-create-xwt5m" Mar 14 08:51:01 crc 
kubenswrapper[4886]: I0314 08:51:01.605701 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/639641ff-f838-43ec-bc7a-c8313a5dc254-operator-scripts\") pod \"nova-cell0-db-create-w8c9q\" (UID: \"639641ff-f838-43ec-bc7a-c8313a5dc254\") " pod="openstack/nova-cell0-db-create-w8c9q" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.605767 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h9hz\" (UniqueName: \"kubernetes.io/projected/639641ff-f838-43ec-bc7a-c8313a5dc254-kube-api-access-8h9hz\") pod \"nova-cell0-db-create-w8c9q\" (UID: \"639641ff-f838-43ec-bc7a-c8313a5dc254\") " pod="openstack/nova-cell0-db-create-w8c9q" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.605902 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/874c5fb0-7bf3-46a1-9be1-71bc8b49cb38-operator-scripts\") pod \"nova-api-8303-account-create-update-w2kk2\" (UID: \"874c5fb0-7bf3-46a1-9be1-71bc8b49cb38\") " pod="openstack/nova-api-8303-account-create-update-w2kk2" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.605944 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4kfr\" (UniqueName: \"kubernetes.io/projected/874c5fb0-7bf3-46a1-9be1-71bc8b49cb38-kube-api-access-g4kfr\") pod \"nova-api-8303-account-create-update-w2kk2\" (UID: \"874c5fb0-7bf3-46a1-9be1-71bc8b49cb38\") " pod="openstack/nova-api-8303-account-create-update-w2kk2" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.606015 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7dae5e3-c486-4f68-bc83-54cda54cd52b-operator-scripts\") pod \"nova-api-db-create-xwt5m\" (UID: 
\"c7dae5e3-c486-4f68-bc83-54cda54cd52b\") " pod="openstack/nova-api-db-create-xwt5m" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.607792 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7dae5e3-c486-4f68-bc83-54cda54cd52b-operator-scripts\") pod \"nova-api-db-create-xwt5m\" (UID: \"c7dae5e3-c486-4f68-bc83-54cda54cd52b\") " pod="openstack/nova-api-db-create-xwt5m" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.656224 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-64796d9fb-7nw8p"] Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.656480 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-64796d9fb-7nw8p" podUID="4a9ffec0-3aa9-46a6-87b9-aadc1021683c" containerName="neutron-api" containerID="cri-o://73d43bea51b4f69093e0094852fed6e07a712fa8984fafd978f07dad7e19c2f5" gracePeriod=30 Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.656900 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-64796d9fb-7nw8p" podUID="4a9ffec0-3aa9-46a6-87b9-aadc1021683c" containerName="neutron-httpd" containerID="cri-o://3bf2cf0d70f5e796ba25e67afed2c4623142c6adabfe77613df16f974770bf8a" gracePeriod=30 Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.670572 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm4wd\" (UniqueName: \"kubernetes.io/projected/c7dae5e3-c486-4f68-bc83-54cda54cd52b-kube-api-access-gm4wd\") pod \"nova-api-db-create-xwt5m\" (UID: \"c7dae5e3-c486-4f68-bc83-54cda54cd52b\") " pod="openstack/nova-api-db-create-xwt5m" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.714464 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/639641ff-f838-43ec-bc7a-c8313a5dc254-operator-scripts\") pod 
\"nova-cell0-db-create-w8c9q\" (UID: \"639641ff-f838-43ec-bc7a-c8313a5dc254\") " pod="openstack/nova-cell0-db-create-w8c9q" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.714536 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h9hz\" (UniqueName: \"kubernetes.io/projected/639641ff-f838-43ec-bc7a-c8313a5dc254-kube-api-access-8h9hz\") pod \"nova-cell0-db-create-w8c9q\" (UID: \"639641ff-f838-43ec-bc7a-c8313a5dc254\") " pod="openstack/nova-cell0-db-create-w8c9q" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.714630 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/874c5fb0-7bf3-46a1-9be1-71bc8b49cb38-operator-scripts\") pod \"nova-api-8303-account-create-update-w2kk2\" (UID: \"874c5fb0-7bf3-46a1-9be1-71bc8b49cb38\") " pod="openstack/nova-api-8303-account-create-update-w2kk2" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.714661 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4kfr\" (UniqueName: \"kubernetes.io/projected/874c5fb0-7bf3-46a1-9be1-71bc8b49cb38-kube-api-access-g4kfr\") pod \"nova-api-8303-account-create-update-w2kk2\" (UID: \"874c5fb0-7bf3-46a1-9be1-71bc8b49cb38\") " pod="openstack/nova-api-8303-account-create-update-w2kk2" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.715753 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/639641ff-f838-43ec-bc7a-c8313a5dc254-operator-scripts\") pod \"nova-cell0-db-create-w8c9q\" (UID: \"639641ff-f838-43ec-bc7a-c8313a5dc254\") " pod="openstack/nova-cell0-db-create-w8c9q" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.715845 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/874c5fb0-7bf3-46a1-9be1-71bc8b49cb38-operator-scripts\") pod \"nova-api-8303-account-create-update-w2kk2\" (UID: \"874c5fb0-7bf3-46a1-9be1-71bc8b49cb38\") " pod="openstack/nova-api-8303-account-create-update-w2kk2" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.715965 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xwt5m" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.717769 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-dwk2t"] Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.718946 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-dwk2t" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.754253 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-dwk2t"] Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.801371 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.820611 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h9hz\" (UniqueName: \"kubernetes.io/projected/639641ff-f838-43ec-bc7a-c8313a5dc254-kube-api-access-8h9hz\") pod \"nova-cell0-db-create-w8c9q\" (UID: \"639641ff-f838-43ec-bc7a-c8313a5dc254\") " pod="openstack/nova-cell0-db-create-w8c9q" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.821041 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0858eab0-1359-49c7-89eb-fe94498572dc-operator-scripts\") pod \"nova-cell1-db-create-dwk2t\" (UID: \"0858eab0-1359-49c7-89eb-fe94498572dc\") " pod="openstack/nova-cell1-db-create-dwk2t" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.821239 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nklb2\" (UniqueName: \"kubernetes.io/projected/0858eab0-1359-49c7-89eb-fe94498572dc-kube-api-access-nklb2\") pod \"nova-cell1-db-create-dwk2t\" (UID: \"0858eab0-1359-49c7-89eb-fe94498572dc\") " pod="openstack/nova-cell1-db-create-dwk2t" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.855550 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4kfr\" (UniqueName: \"kubernetes.io/projected/874c5fb0-7bf3-46a1-9be1-71bc8b49cb38-kube-api-access-g4kfr\") pod \"nova-api-8303-account-create-update-w2kk2\" (UID: \"874c5fb0-7bf3-46a1-9be1-71bc8b49cb38\") " pod="openstack/nova-api-8303-account-create-update-w2kk2" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.863792 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-113e-account-create-update-mjr4j"] Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.873335 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-113e-account-create-update-mjr4j" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.883810 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.915759 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-113e-account-create-update-mjr4j"] Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.925607 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74zhh\" (UniqueName: \"kubernetes.io/projected/f21173e4-e8f0-46b1-8c84-0453259409aa-kube-api-access-74zhh\") pod \"nova-cell0-113e-account-create-update-mjr4j\" (UID: \"f21173e4-e8f0-46b1-8c84-0453259409aa\") " pod="openstack/nova-cell0-113e-account-create-update-mjr4j" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.925660 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nklb2\" (UniqueName: \"kubernetes.io/projected/0858eab0-1359-49c7-89eb-fe94498572dc-kube-api-access-nklb2\") pod \"nova-cell1-db-create-dwk2t\" (UID: \"0858eab0-1359-49c7-89eb-fe94498572dc\") " pod="openstack/nova-cell1-db-create-dwk2t" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.925699 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f21173e4-e8f0-46b1-8c84-0453259409aa-operator-scripts\") pod \"nova-cell0-113e-account-create-update-mjr4j\" (UID: \"f21173e4-e8f0-46b1-8c84-0453259409aa\") " pod="openstack/nova-cell0-113e-account-create-update-mjr4j" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.925773 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0858eab0-1359-49c7-89eb-fe94498572dc-operator-scripts\") pod 
\"nova-cell1-db-create-dwk2t\" (UID: \"0858eab0-1359-49c7-89eb-fe94498572dc\") " pod="openstack/nova-cell1-db-create-dwk2t" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.926601 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0858eab0-1359-49c7-89eb-fe94498572dc-operator-scripts\") pod \"nova-cell1-db-create-dwk2t\" (UID: \"0858eab0-1359-49c7-89eb-fe94498572dc\") " pod="openstack/nova-cell1-db-create-dwk2t" Mar 14 08:51:01 crc kubenswrapper[4886]: I0314 08:51:01.948940 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nklb2\" (UniqueName: \"kubernetes.io/projected/0858eab0-1359-49c7-89eb-fe94498572dc-kube-api-access-nklb2\") pod \"nova-cell1-db-create-dwk2t\" (UID: \"0858eab0-1359-49c7-89eb-fe94498572dc\") " pod="openstack/nova-cell1-db-create-dwk2t" Mar 14 08:51:02 crc kubenswrapper[4886]: I0314 08:51:02.022950 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-4e72-account-create-update-tpngn"] Mar 14 08:51:02 crc kubenswrapper[4886]: I0314 08:51:02.024511 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-4e72-account-create-update-tpngn" Mar 14 08:51:02 crc kubenswrapper[4886]: I0314 08:51:02.027423 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74zhh\" (UniqueName: \"kubernetes.io/projected/f21173e4-e8f0-46b1-8c84-0453259409aa-kube-api-access-74zhh\") pod \"nova-cell0-113e-account-create-update-mjr4j\" (UID: \"f21173e4-e8f0-46b1-8c84-0453259409aa\") " pod="openstack/nova-cell0-113e-account-create-update-mjr4j" Mar 14 08:51:02 crc kubenswrapper[4886]: I0314 08:51:02.027608 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f21173e4-e8f0-46b1-8c84-0453259409aa-operator-scripts\") pod \"nova-cell0-113e-account-create-update-mjr4j\" (UID: \"f21173e4-e8f0-46b1-8c84-0453259409aa\") " pod="openstack/nova-cell0-113e-account-create-update-mjr4j" Mar 14 08:51:02 crc kubenswrapper[4886]: I0314 08:51:02.028347 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f21173e4-e8f0-46b1-8c84-0453259409aa-operator-scripts\") pod \"nova-cell0-113e-account-create-update-mjr4j\" (UID: \"f21173e4-e8f0-46b1-8c84-0453259409aa\") " pod="openstack/nova-cell0-113e-account-create-update-mjr4j" Mar 14 08:51:02 crc kubenswrapper[4886]: I0314 08:51:02.036588 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 14 08:51:02 crc kubenswrapper[4886]: I0314 08:51:02.051873 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4e72-account-create-update-tpngn"] Mar 14 08:51:02 crc kubenswrapper[4886]: I0314 08:51:02.076609 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74zhh\" (UniqueName: \"kubernetes.io/projected/f21173e4-e8f0-46b1-8c84-0453259409aa-kube-api-access-74zhh\") pod 
\"nova-cell0-113e-account-create-update-mjr4j\" (UID: \"f21173e4-e8f0-46b1-8c84-0453259409aa\") " pod="openstack/nova-cell0-113e-account-create-update-mjr4j" Mar 14 08:51:02 crc kubenswrapper[4886]: I0314 08:51:02.116797 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-w8c9q" Mar 14 08:51:02 crc kubenswrapper[4886]: I0314 08:51:02.129443 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7859\" (UniqueName: \"kubernetes.io/projected/fdaed1d0-b39b-4d77-8b14-4b9cb1de478a-kube-api-access-w7859\") pod \"nova-cell1-4e72-account-create-update-tpngn\" (UID: \"fdaed1d0-b39b-4d77-8b14-4b9cb1de478a\") " pod="openstack/nova-cell1-4e72-account-create-update-tpngn" Mar 14 08:51:02 crc kubenswrapper[4886]: I0314 08:51:02.129532 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdaed1d0-b39b-4d77-8b14-4b9cb1de478a-operator-scripts\") pod \"nova-cell1-4e72-account-create-update-tpngn\" (UID: \"fdaed1d0-b39b-4d77-8b14-4b9cb1de478a\") " pod="openstack/nova-cell1-4e72-account-create-update-tpngn" Mar 14 08:51:02 crc kubenswrapper[4886]: I0314 08:51:02.133536 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-8303-account-create-update-w2kk2" Mar 14 08:51:02 crc kubenswrapper[4886]: I0314 08:51:02.231644 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdaed1d0-b39b-4d77-8b14-4b9cb1de478a-operator-scripts\") pod \"nova-cell1-4e72-account-create-update-tpngn\" (UID: \"fdaed1d0-b39b-4d77-8b14-4b9cb1de478a\") " pod="openstack/nova-cell1-4e72-account-create-update-tpngn" Mar 14 08:51:02 crc kubenswrapper[4886]: I0314 08:51:02.231824 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7859\" (UniqueName: \"kubernetes.io/projected/fdaed1d0-b39b-4d77-8b14-4b9cb1de478a-kube-api-access-w7859\") pod \"nova-cell1-4e72-account-create-update-tpngn\" (UID: \"fdaed1d0-b39b-4d77-8b14-4b9cb1de478a\") " pod="openstack/nova-cell1-4e72-account-create-update-tpngn" Mar 14 08:51:02 crc kubenswrapper[4886]: I0314 08:51:02.232529 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdaed1d0-b39b-4d77-8b14-4b9cb1de478a-operator-scripts\") pod \"nova-cell1-4e72-account-create-update-tpngn\" (UID: \"fdaed1d0-b39b-4d77-8b14-4b9cb1de478a\") " pod="openstack/nova-cell1-4e72-account-create-update-tpngn" Mar 14 08:51:02 crc kubenswrapper[4886]: I0314 08:51:02.237824 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-dwk2t" Mar 14 08:51:02 crc kubenswrapper[4886]: I0314 08:51:02.251366 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7859\" (UniqueName: \"kubernetes.io/projected/fdaed1d0-b39b-4d77-8b14-4b9cb1de478a-kube-api-access-w7859\") pod \"nova-cell1-4e72-account-create-update-tpngn\" (UID: \"fdaed1d0-b39b-4d77-8b14-4b9cb1de478a\") " pod="openstack/nova-cell1-4e72-account-create-update-tpngn" Mar 14 08:51:02 crc kubenswrapper[4886]: I0314 08:51:02.292678 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-113e-account-create-update-mjr4j" Mar 14 08:51:02 crc kubenswrapper[4886]: I0314 08:51:02.388683 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4e72-account-create-update-tpngn" Mar 14 08:51:02 crc kubenswrapper[4886]: I0314 08:51:02.508615 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xwt5m"] Mar 14 08:51:02 crc kubenswrapper[4886]: I0314 08:51:02.694653 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-w8c9q"] Mar 14 08:51:02 crc kubenswrapper[4886]: I0314 08:51:02.781334 4886 generic.go:334] "Generic (PLEG): container finished" podID="4a9ffec0-3aa9-46a6-87b9-aadc1021683c" containerID="3bf2cf0d70f5e796ba25e67afed2c4623142c6adabfe77613df16f974770bf8a" exitCode=0 Mar 14 08:51:02 crc kubenswrapper[4886]: I0314 08:51:02.781418 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64796d9fb-7nw8p" event={"ID":"4a9ffec0-3aa9-46a6-87b9-aadc1021683c","Type":"ContainerDied","Data":"3bf2cf0d70f5e796ba25e67afed2c4623142c6adabfe77613df16f974770bf8a"} Mar 14 08:51:02 crc kubenswrapper[4886]: I0314 08:51:02.812334 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xwt5m" 
event={"ID":"c7dae5e3-c486-4f68-bc83-54cda54cd52b","Type":"ContainerStarted","Data":"f7306286d32f35bcba30fd65b121a59ce644913729614db5e169f5044fc4bcdd"} Mar 14 08:51:02 crc kubenswrapper[4886]: I0314 08:51:02.821947 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98","Type":"ContainerStarted","Data":"dbb7d6aeebbaf30259bc6e6be0e95910c7355d8fe7cc034d674c50445ecb5669"} Mar 14 08:51:02 crc kubenswrapper[4886]: I0314 08:51:02.866037 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8303-account-create-update-w2kk2"] Mar 14 08:51:03 crc kubenswrapper[4886]: I0314 08:51:03.204503 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-dwk2t"] Mar 14 08:51:03 crc kubenswrapper[4886]: I0314 08:51:03.311709 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-113e-account-create-update-mjr4j"] Mar 14 08:51:03 crc kubenswrapper[4886]: I0314 08:51:03.350367 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4e72-account-create-update-tpngn"] Mar 14 08:51:03 crc kubenswrapper[4886]: I0314 08:51:03.848555 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:51:03 crc kubenswrapper[4886]: I0314 08:51:03.872381 4886 generic.go:334] "Generic (PLEG): container finished" podID="639641ff-f838-43ec-bc7a-c8313a5dc254" containerID="b55b07edfc3c6db8f494ffa43bc412c5c283cecccaf42e59e5c1dfee3f60f580" exitCode=0 Mar 14 08:51:03 crc kubenswrapper[4886]: I0314 08:51:03.872492 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-w8c9q" event={"ID":"639641ff-f838-43ec-bc7a-c8313a5dc254","Type":"ContainerDied","Data":"b55b07edfc3c6db8f494ffa43bc412c5c283cecccaf42e59e5c1dfee3f60f580"} Mar 14 08:51:03 crc kubenswrapper[4886]: I0314 08:51:03.872522 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-db-create-w8c9q" event={"ID":"639641ff-f838-43ec-bc7a-c8313a5dc254","Type":"ContainerStarted","Data":"a684a52cab07b2b6f6317c2a4f1db8a5e8b72460da9285d8c762ddcccf66c987"} Mar 14 08:51:03 crc kubenswrapper[4886]: I0314 08:51:03.886563 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4e72-account-create-update-tpngn" event={"ID":"fdaed1d0-b39b-4d77-8b14-4b9cb1de478a","Type":"ContainerStarted","Data":"05b71a3caa442f9cde91a7c9fa6129c19bb28eae6c5f2f5f65b9df2b28dbcb00"} Mar 14 08:51:03 crc kubenswrapper[4886]: I0314 08:51:03.886638 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4e72-account-create-update-tpngn" event={"ID":"fdaed1d0-b39b-4d77-8b14-4b9cb1de478a","Type":"ContainerStarted","Data":"473fead87b2dfb316b1d14ef8d1a650ee3577d6ae3b1adec25f3e3aff2d8d3c6"} Mar 14 08:51:03 crc kubenswrapper[4886]: I0314 08:51:03.891886 4886 generic.go:334] "Generic (PLEG): container finished" podID="c7dae5e3-c486-4f68-bc83-54cda54cd52b" containerID="9850ec02f44acd2a8a2e5d333cff22d43a48ee54579e2e2cdfb4a18bb113bd22" exitCode=0 Mar 14 08:51:03 crc kubenswrapper[4886]: I0314 08:51:03.891959 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xwt5m" event={"ID":"c7dae5e3-c486-4f68-bc83-54cda54cd52b","Type":"ContainerDied","Data":"9850ec02f44acd2a8a2e5d333cff22d43a48ee54579e2e2cdfb4a18bb113bd22"} Mar 14 08:51:03 crc kubenswrapper[4886]: I0314 08:51:03.903227 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98","Type":"ContainerStarted","Data":"41e68ca14d6b71ce8564e3e7f5dd30016b98c576a9aee808aa822604150b0e1f"} Mar 14 08:51:03 crc kubenswrapper[4886]: I0314 08:51:03.910841 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-113e-account-create-update-mjr4j" 
event={"ID":"f21173e4-e8f0-46b1-8c84-0453259409aa","Type":"ContainerStarted","Data":"d940b6b18724b28bd807f13bd9359d9b3e231166274fcedb906e68db0afdf259"} Mar 14 08:51:03 crc kubenswrapper[4886]: I0314 08:51:03.910886 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-113e-account-create-update-mjr4j" event={"ID":"f21173e4-e8f0-46b1-8c84-0453259409aa","Type":"ContainerStarted","Data":"a171e30da288f6d0d664a3544b55ea4a60540d28fbbd3eca034c0310b0ef8523"} Mar 14 08:51:03 crc kubenswrapper[4886]: I0314 08:51:03.922286 4886 generic.go:334] "Generic (PLEG): container finished" podID="874c5fb0-7bf3-46a1-9be1-71bc8b49cb38" containerID="b6c76fd658a8b180e27bf1c63d1b42a0ed94a21db69a0aeaf7712c59dc01a4f6" exitCode=0 Mar 14 08:51:03 crc kubenswrapper[4886]: I0314 08:51:03.922366 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8303-account-create-update-w2kk2" event={"ID":"874c5fb0-7bf3-46a1-9be1-71bc8b49cb38","Type":"ContainerDied","Data":"b6c76fd658a8b180e27bf1c63d1b42a0ed94a21db69a0aeaf7712c59dc01a4f6"} Mar 14 08:51:03 crc kubenswrapper[4886]: I0314 08:51:03.922392 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8303-account-create-update-w2kk2" event={"ID":"874c5fb0-7bf3-46a1-9be1-71bc8b49cb38","Type":"ContainerStarted","Data":"6df2bf6c8bbe971842d8114e2b7448a52e709f8603da402fca0e36e5c77456f3"} Mar 14 08:51:03 crc kubenswrapper[4886]: I0314 08:51:03.924603 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-4e72-account-create-update-tpngn" podStartSLOduration=2.924591076 podStartE2EDuration="2.924591076s" podCreationTimestamp="2026-03-14 08:51:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:51:03.909329913 +0000 UTC m=+1399.157781550" watchObservedRunningTime="2026-03-14 08:51:03.924591076 +0000 UTC m=+1399.173042713" Mar 14 08:51:03 crc 
kubenswrapper[4886]: I0314 08:51:03.928638 4886 generic.go:334] "Generic (PLEG): container finished" podID="0858eab0-1359-49c7-89eb-fe94498572dc" containerID="b757828f2baf86f4332302026e8b16e0764dbf5f974bf0e717e1a84d0de7ff1d" exitCode=0 Mar 14 08:51:03 crc kubenswrapper[4886]: I0314 08:51:03.928695 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dwk2t" event={"ID":"0858eab0-1359-49c7-89eb-fe94498572dc","Type":"ContainerDied","Data":"b757828f2baf86f4332302026e8b16e0764dbf5f974bf0e717e1a84d0de7ff1d"} Mar 14 08:51:03 crc kubenswrapper[4886]: I0314 08:51:03.928718 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dwk2t" event={"ID":"0858eab0-1359-49c7-89eb-fe94498572dc","Type":"ContainerStarted","Data":"0219b392f59ac0617755a3d69904171694eb82b1270ee04f4bce909d9b25b170"} Mar 14 08:51:03 crc kubenswrapper[4886]: I0314 08:51:03.956092 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-113e-account-create-update-mjr4j" podStartSLOduration=2.95606933 podStartE2EDuration="2.95606933s" podCreationTimestamp="2026-03-14 08:51:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:51:03.937551674 +0000 UTC m=+1399.186003311" watchObservedRunningTime="2026-03-14 08:51:03.95606933 +0000 UTC m=+1399.204520967" Mar 14 08:51:04 crc kubenswrapper[4886]: I0314 08:51:04.965028 4886 generic.go:334] "Generic (PLEG): container finished" podID="7a298745-dd74-4ed3-b21b-648f2adb47dc" containerID="eb6ed3dbcff207337c0705046c6c84b1904c5d533460cdcfc2aa5b1e1cc12014" exitCode=137 Mar 14 08:51:04 crc kubenswrapper[4886]: I0314 08:51:04.965564 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-d4575995d-lfmv5" 
event={"ID":"7a298745-dd74-4ed3-b21b-648f2adb47dc","Type":"ContainerDied","Data":"eb6ed3dbcff207337c0705046c6c84b1904c5d533460cdcfc2aa5b1e1cc12014"} Mar 14 08:51:04 crc kubenswrapper[4886]: I0314 08:51:04.968807 4886 generic.go:334] "Generic (PLEG): container finished" podID="fdaed1d0-b39b-4d77-8b14-4b9cb1de478a" containerID="05b71a3caa442f9cde91a7c9fa6129c19bb28eae6c5f2f5f65b9df2b28dbcb00" exitCode=0 Mar 14 08:51:04 crc kubenswrapper[4886]: I0314 08:51:04.968970 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4e72-account-create-update-tpngn" event={"ID":"fdaed1d0-b39b-4d77-8b14-4b9cb1de478a","Type":"ContainerDied","Data":"05b71a3caa442f9cde91a7c9fa6129c19bb28eae6c5f2f5f65b9df2b28dbcb00"} Mar 14 08:51:04 crc kubenswrapper[4886]: I0314 08:51:04.975451 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98","Type":"ContainerStarted","Data":"95fb4b296c3f6707ce932813c589ce8a7e283297377abdb5a8ece77bbdbaefd8"} Mar 14 08:51:04 crc kubenswrapper[4886]: I0314 08:51:04.975478 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98","Type":"ContainerStarted","Data":"2bc974c1152dd23a058733114421ca1b35b7799bb8399375201cf04df6621401"} Mar 14 08:51:04 crc kubenswrapper[4886]: I0314 08:51:04.981030 4886 generic.go:334] "Generic (PLEG): container finished" podID="f21173e4-e8f0-46b1-8c84-0453259409aa" containerID="d940b6b18724b28bd807f13bd9359d9b3e231166274fcedb906e68db0afdf259" exitCode=0 Mar 14 08:51:04 crc kubenswrapper[4886]: I0314 08:51:04.981162 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-113e-account-create-update-mjr4j" event={"ID":"f21173e4-e8f0-46b1-8c84-0453259409aa","Type":"ContainerDied","Data":"d940b6b18724b28bd807f13bd9359d9b3e231166274fcedb906e68db0afdf259"} Mar 14 08:51:05 crc kubenswrapper[4886]: E0314 08:51:05.015050 4886 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a298745_dd74_4ed3_b21b_648f2adb47dc.slice/crio-conmon-eb6ed3dbcff207337c0705046c6c84b1904c5d533460cdcfc2aa5b1e1cc12014.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a298745_dd74_4ed3_b21b_648f2adb47dc.slice/crio-eb6ed3dbcff207337c0705046c6c84b1904c5d533460cdcfc2aa5b1e1cc12014.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a9ffec0_3aa9_46a6_87b9_aadc1021683c.slice/crio-3bf2cf0d70f5e796ba25e67afed2c4623142c6adabfe77613df16f974770bf8a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a9ffec0_3aa9_46a6_87b9_aadc1021683c.slice/crio-conmon-3bf2cf0d70f5e796ba25e67afed2c4623142c6adabfe77613df16f974770bf8a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a745438_cb17_4626_96ed_51c7de75a976.slice/crio-34a8e812ae52f0153cc28bb93f8075bc5850e81a6a08c0f6de662f2c45e13fbf\": RecentStats: unable to find data in memory cache]" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.204689 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-d4575995d-lfmv5" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.337784 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a298745-dd74-4ed3-b21b-648f2adb47dc-config-data\") pod \"7a298745-dd74-4ed3-b21b-648f2adb47dc\" (UID: \"7a298745-dd74-4ed3-b21b-648f2adb47dc\") " Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.338308 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a298745-dd74-4ed3-b21b-648f2adb47dc-logs\") pod \"7a298745-dd74-4ed3-b21b-648f2adb47dc\" (UID: \"7a298745-dd74-4ed3-b21b-648f2adb47dc\") " Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.338385 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls5pv\" (UniqueName: \"kubernetes.io/projected/7a298745-dd74-4ed3-b21b-648f2adb47dc-kube-api-access-ls5pv\") pod \"7a298745-dd74-4ed3-b21b-648f2adb47dc\" (UID: \"7a298745-dd74-4ed3-b21b-648f2adb47dc\") " Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.338569 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a298745-dd74-4ed3-b21b-648f2adb47dc-config-data-custom\") pod \"7a298745-dd74-4ed3-b21b-648f2adb47dc\" (UID: \"7a298745-dd74-4ed3-b21b-648f2adb47dc\") " Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.338666 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a298745-dd74-4ed3-b21b-648f2adb47dc-combined-ca-bundle\") pod \"7a298745-dd74-4ed3-b21b-648f2adb47dc\" (UID: \"7a298745-dd74-4ed3-b21b-648f2adb47dc\") " Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.339829 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/7a298745-dd74-4ed3-b21b-648f2adb47dc-logs" (OuterVolumeSpecName: "logs") pod "7a298745-dd74-4ed3-b21b-648f2adb47dc" (UID: "7a298745-dd74-4ed3-b21b-648f2adb47dc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.347973 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a298745-dd74-4ed3-b21b-648f2adb47dc-kube-api-access-ls5pv" (OuterVolumeSpecName: "kube-api-access-ls5pv") pod "7a298745-dd74-4ed3-b21b-648f2adb47dc" (UID: "7a298745-dd74-4ed3-b21b-648f2adb47dc"). InnerVolumeSpecName "kube-api-access-ls5pv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.348372 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a298745-dd74-4ed3-b21b-648f2adb47dc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7a298745-dd74-4ed3-b21b-648f2adb47dc" (UID: "7a298745-dd74-4ed3-b21b-648f2adb47dc"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.448487 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a298745-dd74-4ed3-b21b-648f2adb47dc-logs\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.448524 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls5pv\" (UniqueName: \"kubernetes.io/projected/7a298745-dd74-4ed3-b21b-648f2adb47dc-kube-api-access-ls5pv\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.448538 4886 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a298745-dd74-4ed3-b21b-648f2adb47dc-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.468263 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a298745-dd74-4ed3-b21b-648f2adb47dc-config-data" (OuterVolumeSpecName: "config-data") pod "7a298745-dd74-4ed3-b21b-648f2adb47dc" (UID: "7a298745-dd74-4ed3-b21b-648f2adb47dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.470284 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a298745-dd74-4ed3-b21b-648f2adb47dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a298745-dd74-4ed3-b21b-648f2adb47dc" (UID: "7a298745-dd74-4ed3-b21b-648f2adb47dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.532035 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-w8c9q" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.564152 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a298745-dd74-4ed3-b21b-648f2adb47dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.564357 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a298745-dd74-4ed3-b21b-648f2adb47dc-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.656965 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xwt5m" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.667394 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8h9hz\" (UniqueName: \"kubernetes.io/projected/639641ff-f838-43ec-bc7a-c8313a5dc254-kube-api-access-8h9hz\") pod \"639641ff-f838-43ec-bc7a-c8313a5dc254\" (UID: \"639641ff-f838-43ec-bc7a-c8313a5dc254\") " Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.667633 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/639641ff-f838-43ec-bc7a-c8313a5dc254-operator-scripts\") pod \"639641ff-f838-43ec-bc7a-c8313a5dc254\" (UID: \"639641ff-f838-43ec-bc7a-c8313a5dc254\") " Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.669698 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/639641ff-f838-43ec-bc7a-c8313a5dc254-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "639641ff-f838-43ec-bc7a-c8313a5dc254" (UID: "639641ff-f838-43ec-bc7a-c8313a5dc254"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.674466 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/639641ff-f838-43ec-bc7a-c8313a5dc254-kube-api-access-8h9hz" (OuterVolumeSpecName: "kube-api-access-8h9hz") pod "639641ff-f838-43ec-bc7a-c8313a5dc254" (UID: "639641ff-f838-43ec-bc7a-c8313a5dc254"). InnerVolumeSpecName "kube-api-access-8h9hz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.684566 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-dwk2t" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.760548 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8303-account-create-update-w2kk2" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.769768 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm4wd\" (UniqueName: \"kubernetes.io/projected/c7dae5e3-c486-4f68-bc83-54cda54cd52b-kube-api-access-gm4wd\") pod \"c7dae5e3-c486-4f68-bc83-54cda54cd52b\" (UID: \"c7dae5e3-c486-4f68-bc83-54cda54cd52b\") " Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.769805 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7dae5e3-c486-4f68-bc83-54cda54cd52b-operator-scripts\") pod \"c7dae5e3-c486-4f68-bc83-54cda54cd52b\" (UID: \"c7dae5e3-c486-4f68-bc83-54cda54cd52b\") " Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.770232 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8h9hz\" (UniqueName: \"kubernetes.io/projected/639641ff-f838-43ec-bc7a-c8313a5dc254-kube-api-access-8h9hz\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.770248 4886 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/639641ff-f838-43ec-bc7a-c8313a5dc254-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.770557 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7dae5e3-c486-4f68-bc83-54cda54cd52b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c7dae5e3-c486-4f68-bc83-54cda54cd52b" (UID: "c7dae5e3-c486-4f68-bc83-54cda54cd52b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.783037 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7dae5e3-c486-4f68-bc83-54cda54cd52b-kube-api-access-gm4wd" (OuterVolumeSpecName: "kube-api-access-gm4wd") pod "c7dae5e3-c486-4f68-bc83-54cda54cd52b" (UID: "c7dae5e3-c486-4f68-bc83-54cda54cd52b"). InnerVolumeSpecName "kube-api-access-gm4wd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.871062 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/874c5fb0-7bf3-46a1-9be1-71bc8b49cb38-operator-scripts\") pod \"874c5fb0-7bf3-46a1-9be1-71bc8b49cb38\" (UID: \"874c5fb0-7bf3-46a1-9be1-71bc8b49cb38\") " Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.871189 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4kfr\" (UniqueName: \"kubernetes.io/projected/874c5fb0-7bf3-46a1-9be1-71bc8b49cb38-kube-api-access-g4kfr\") pod \"874c5fb0-7bf3-46a1-9be1-71bc8b49cb38\" (UID: \"874c5fb0-7bf3-46a1-9be1-71bc8b49cb38\") " Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.871240 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0858eab0-1359-49c7-89eb-fe94498572dc-operator-scripts\") pod \"0858eab0-1359-49c7-89eb-fe94498572dc\" (UID: \"0858eab0-1359-49c7-89eb-fe94498572dc\") " Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.871347 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nklb2\" (UniqueName: \"kubernetes.io/projected/0858eab0-1359-49c7-89eb-fe94498572dc-kube-api-access-nklb2\") pod \"0858eab0-1359-49c7-89eb-fe94498572dc\" (UID: \"0858eab0-1359-49c7-89eb-fe94498572dc\") " Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.871950 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm4wd\" (UniqueName: \"kubernetes.io/projected/c7dae5e3-c486-4f68-bc83-54cda54cd52b-kube-api-access-gm4wd\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.871973 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c7dae5e3-c486-4f68-bc83-54cda54cd52b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.872254 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/874c5fb0-7bf3-46a1-9be1-71bc8b49cb38-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "874c5fb0-7bf3-46a1-9be1-71bc8b49cb38" (UID: "874c5fb0-7bf3-46a1-9be1-71bc8b49cb38"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.873223 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0858eab0-1359-49c7-89eb-fe94498572dc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0858eab0-1359-49c7-89eb-fe94498572dc" (UID: "0858eab0-1359-49c7-89eb-fe94498572dc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.877890 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/874c5fb0-7bf3-46a1-9be1-71bc8b49cb38-kube-api-access-g4kfr" (OuterVolumeSpecName: "kube-api-access-g4kfr") pod "874c5fb0-7bf3-46a1-9be1-71bc8b49cb38" (UID: "874c5fb0-7bf3-46a1-9be1-71bc8b49cb38"). InnerVolumeSpecName "kube-api-access-g4kfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.878247 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0858eab0-1359-49c7-89eb-fe94498572dc-kube-api-access-nklb2" (OuterVolumeSpecName: "kube-api-access-nklb2") pod "0858eab0-1359-49c7-89eb-fe94498572dc" (UID: "0858eab0-1359-49c7-89eb-fe94498572dc"). InnerVolumeSpecName "kube-api-access-nklb2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.974813 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4kfr\" (UniqueName: \"kubernetes.io/projected/874c5fb0-7bf3-46a1-9be1-71bc8b49cb38-kube-api-access-g4kfr\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.974847 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0858eab0-1359-49c7-89eb-fe94498572dc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.974858 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nklb2\" (UniqueName: \"kubernetes.io/projected/0858eab0-1359-49c7-89eb-fe94498572dc-kube-api-access-nklb2\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.974867 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/874c5fb0-7bf3-46a1-9be1-71bc8b49cb38-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.998370 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-d4575995d-lfmv5" event={"ID":"7a298745-dd74-4ed3-b21b-648f2adb47dc","Type":"ContainerDied","Data":"02ee9603f36134be041b74b32a2e353de7de8e3225d14a3c06ad1659dc31d5ba"} Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.998420 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-d4575995d-lfmv5" Mar 14 08:51:05 crc kubenswrapper[4886]: I0314 08:51:05.998419 4886 scope.go:117] "RemoveContainer" containerID="eb6ed3dbcff207337c0705046c6c84b1904c5d533460cdcfc2aa5b1e1cc12014" Mar 14 08:51:06 crc kubenswrapper[4886]: I0314 08:51:06.001945 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-dwk2t" event={"ID":"0858eab0-1359-49c7-89eb-fe94498572dc","Type":"ContainerDied","Data":"0219b392f59ac0617755a3d69904171694eb82b1270ee04f4bce909d9b25b170"} Mar 14 08:51:06 crc kubenswrapper[4886]: I0314 08:51:06.001973 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0219b392f59ac0617755a3d69904171694eb82b1270ee04f4bce909d9b25b170" Mar 14 08:51:06 crc kubenswrapper[4886]: I0314 08:51:06.002030 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-dwk2t" Mar 14 08:51:06 crc kubenswrapper[4886]: I0314 08:51:06.009990 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-w8c9q" event={"ID":"639641ff-f838-43ec-bc7a-c8313a5dc254","Type":"ContainerDied","Data":"a684a52cab07b2b6f6317c2a4f1db8a5e8b72460da9285d8c762ddcccf66c987"} Mar 14 08:51:06 crc kubenswrapper[4886]: I0314 08:51:06.010024 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a684a52cab07b2b6f6317c2a4f1db8a5e8b72460da9285d8c762ddcccf66c987" Mar 14 08:51:06 crc kubenswrapper[4886]: I0314 08:51:06.010539 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-w8c9q" Mar 14 08:51:06 crc kubenswrapper[4886]: I0314 08:51:06.014766 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xwt5m" event={"ID":"c7dae5e3-c486-4f68-bc83-54cda54cd52b","Type":"ContainerDied","Data":"f7306286d32f35bcba30fd65b121a59ce644913729614db5e169f5044fc4bcdd"} Mar 14 08:51:06 crc kubenswrapper[4886]: I0314 08:51:06.014792 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7306286d32f35bcba30fd65b121a59ce644913729614db5e169f5044fc4bcdd" Mar 14 08:51:06 crc kubenswrapper[4886]: I0314 08:51:06.014841 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xwt5m" Mar 14 08:51:06 crc kubenswrapper[4886]: I0314 08:51:06.019352 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8303-account-create-update-w2kk2" Mar 14 08:51:06 crc kubenswrapper[4886]: I0314 08:51:06.020012 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8303-account-create-update-w2kk2" event={"ID":"874c5fb0-7bf3-46a1-9be1-71bc8b49cb38","Type":"ContainerDied","Data":"6df2bf6c8bbe971842d8114e2b7448a52e709f8603da402fca0e36e5c77456f3"} Mar 14 08:51:06 crc kubenswrapper[4886]: I0314 08:51:06.020053 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6df2bf6c8bbe971842d8114e2b7448a52e709f8603da402fca0e36e5c77456f3" Mar 14 08:51:06 crc kubenswrapper[4886]: I0314 08:51:06.039228 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-d4575995d-lfmv5"] Mar 14 08:51:06 crc kubenswrapper[4886]: I0314 08:51:06.039654 4886 scope.go:117] "RemoveContainer" containerID="a774d7ecff810007d347300dd5019b4603ff3a9eff33e055d7e88e41b6cbfc9e" Mar 14 08:51:06 crc kubenswrapper[4886]: I0314 08:51:06.048325 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/barbican-keystone-listener-d4575995d-lfmv5"] Mar 14 08:51:06 crc kubenswrapper[4886]: I0314 08:51:06.357991 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-113e-account-create-update-mjr4j" Mar 14 08:51:06 crc kubenswrapper[4886]: I0314 08:51:06.453926 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4e72-account-create-update-tpngn" Mar 14 08:51:06 crc kubenswrapper[4886]: I0314 08:51:06.490778 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74zhh\" (UniqueName: \"kubernetes.io/projected/f21173e4-e8f0-46b1-8c84-0453259409aa-kube-api-access-74zhh\") pod \"f21173e4-e8f0-46b1-8c84-0453259409aa\" (UID: \"f21173e4-e8f0-46b1-8c84-0453259409aa\") " Mar 14 08:51:06 crc kubenswrapper[4886]: I0314 08:51:06.491354 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f21173e4-e8f0-46b1-8c84-0453259409aa-operator-scripts\") pod \"f21173e4-e8f0-46b1-8c84-0453259409aa\" (UID: \"f21173e4-e8f0-46b1-8c84-0453259409aa\") " Mar 14 08:51:06 crc kubenswrapper[4886]: I0314 08:51:06.492583 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f21173e4-e8f0-46b1-8c84-0453259409aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f21173e4-e8f0-46b1-8c84-0453259409aa" (UID: "f21173e4-e8f0-46b1-8c84-0453259409aa"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:51:06 crc kubenswrapper[4886]: I0314 08:51:06.498463 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f21173e4-e8f0-46b1-8c84-0453259409aa-kube-api-access-74zhh" (OuterVolumeSpecName: "kube-api-access-74zhh") pod "f21173e4-e8f0-46b1-8c84-0453259409aa" (UID: "f21173e4-e8f0-46b1-8c84-0453259409aa"). InnerVolumeSpecName "kube-api-access-74zhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:51:06 crc kubenswrapper[4886]: I0314 08:51:06.593670 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7859\" (UniqueName: \"kubernetes.io/projected/fdaed1d0-b39b-4d77-8b14-4b9cb1de478a-kube-api-access-w7859\") pod \"fdaed1d0-b39b-4d77-8b14-4b9cb1de478a\" (UID: \"fdaed1d0-b39b-4d77-8b14-4b9cb1de478a\") " Mar 14 08:51:06 crc kubenswrapper[4886]: I0314 08:51:06.593775 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdaed1d0-b39b-4d77-8b14-4b9cb1de478a-operator-scripts\") pod \"fdaed1d0-b39b-4d77-8b14-4b9cb1de478a\" (UID: \"fdaed1d0-b39b-4d77-8b14-4b9cb1de478a\") " Mar 14 08:51:06 crc kubenswrapper[4886]: I0314 08:51:06.594336 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f21173e4-e8f0-46b1-8c84-0453259409aa-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:06 crc kubenswrapper[4886]: I0314 08:51:06.594359 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74zhh\" (UniqueName: \"kubernetes.io/projected/f21173e4-e8f0-46b1-8c84-0453259409aa-kube-api-access-74zhh\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:06 crc kubenswrapper[4886]: I0314 08:51:06.595364 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/fdaed1d0-b39b-4d77-8b14-4b9cb1de478a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fdaed1d0-b39b-4d77-8b14-4b9cb1de478a" (UID: "fdaed1d0-b39b-4d77-8b14-4b9cb1de478a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:51:06 crc kubenswrapper[4886]: I0314 08:51:06.597245 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdaed1d0-b39b-4d77-8b14-4b9cb1de478a-kube-api-access-w7859" (OuterVolumeSpecName: "kube-api-access-w7859") pod "fdaed1d0-b39b-4d77-8b14-4b9cb1de478a" (UID: "fdaed1d0-b39b-4d77-8b14-4b9cb1de478a"). InnerVolumeSpecName "kube-api-access-w7859". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:51:06 crc kubenswrapper[4886]: I0314 08:51:06.697498 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7859\" (UniqueName: \"kubernetes.io/projected/fdaed1d0-b39b-4d77-8b14-4b9cb1de478a-kube-api-access-w7859\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:06 crc kubenswrapper[4886]: I0314 08:51:06.697530 4886 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdaed1d0-b39b-4d77-8b14-4b9cb1de478a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:07 crc kubenswrapper[4886]: I0314 08:51:07.037035 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98","Type":"ContainerStarted","Data":"34151a332fad6f6a36c8d715367f9797b0568169af1bac1eb52a50a1ae26e7f4"} Mar 14 08:51:07 crc kubenswrapper[4886]: I0314 08:51:07.037254 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5e8bae2d-4b7c-46c3-af57-f65ec5efbc98" containerName="ceilometer-central-agent" containerID="cri-o://41e68ca14d6b71ce8564e3e7f5dd30016b98c576a9aee808aa822604150b0e1f" gracePeriod=30 Mar 14 08:51:07 crc 
kubenswrapper[4886]: I0314 08:51:07.037559 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5e8bae2d-4b7c-46c3-af57-f65ec5efbc98" containerName="sg-core" containerID="cri-o://95fb4b296c3f6707ce932813c589ce8a7e283297377abdb5a8ece77bbdbaefd8" gracePeriod=30 Mar 14 08:51:07 crc kubenswrapper[4886]: I0314 08:51:07.037575 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5e8bae2d-4b7c-46c3-af57-f65ec5efbc98" containerName="ceilometer-notification-agent" containerID="cri-o://2bc974c1152dd23a058733114421ca1b35b7799bb8399375201cf04df6621401" gracePeriod=30 Mar 14 08:51:07 crc kubenswrapper[4886]: I0314 08:51:07.037633 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 14 08:51:07 crc kubenswrapper[4886]: I0314 08:51:07.037641 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5e8bae2d-4b7c-46c3-af57-f65ec5efbc98" containerName="proxy-httpd" containerID="cri-o://34151a332fad6f6a36c8d715367f9797b0568169af1bac1eb52a50a1ae26e7f4" gracePeriod=30 Mar 14 08:51:07 crc kubenswrapper[4886]: I0314 08:51:07.044710 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-113e-account-create-update-mjr4j" event={"ID":"f21173e4-e8f0-46b1-8c84-0453259409aa","Type":"ContainerDied","Data":"a171e30da288f6d0d664a3544b55ea4a60540d28fbbd3eca034c0310b0ef8523"} Mar 14 08:51:07 crc kubenswrapper[4886]: I0314 08:51:07.044761 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a171e30da288f6d0d664a3544b55ea4a60540d28fbbd3eca034c0310b0ef8523" Mar 14 08:51:07 crc kubenswrapper[4886]: I0314 08:51:07.044789 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-113e-account-create-update-mjr4j" Mar 14 08:51:07 crc kubenswrapper[4886]: I0314 08:51:07.057741 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4e72-account-create-update-tpngn" event={"ID":"fdaed1d0-b39b-4d77-8b14-4b9cb1de478a","Type":"ContainerDied","Data":"473fead87b2dfb316b1d14ef8d1a650ee3577d6ae3b1adec25f3e3aff2d8d3c6"} Mar 14 08:51:07 crc kubenswrapper[4886]: I0314 08:51:07.057783 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="473fead87b2dfb316b1d14ef8d1a650ee3577d6ae3b1adec25f3e3aff2d8d3c6" Mar 14 08:51:07 crc kubenswrapper[4886]: I0314 08:51:07.057836 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4e72-account-create-update-tpngn" Mar 14 08:51:07 crc kubenswrapper[4886]: I0314 08:51:07.337776 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.932179801 podStartE2EDuration="7.337757466s" podCreationTimestamp="2026-03-14 08:51:00 +0000 UTC" firstStartedPulling="2026-03-14 08:51:01.811534636 +0000 UTC m=+1397.059986273" lastFinishedPulling="2026-03-14 08:51:06.217112301 +0000 UTC m=+1401.465563938" observedRunningTime="2026-03-14 08:51:07.070607242 +0000 UTC m=+1402.319058899" watchObservedRunningTime="2026-03-14 08:51:07.337757466 +0000 UTC m=+1402.586209103" Mar 14 08:51:07 crc kubenswrapper[4886]: I0314 08:51:07.340008 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 08:51:07 crc kubenswrapper[4886]: I0314 08:51:07.340237 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2e6e3f18-a42d-4d01-8df0-6dfc736974fc" containerName="glance-log" containerID="cri-o://96f8cdebff573a8c3e09bc3e2053ceb6cdb6393a88427bd750d6b1506ae2441f" gracePeriod=30 Mar 14 08:51:07 crc 
kubenswrapper[4886]: I0314 08:51:07.340676 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2e6e3f18-a42d-4d01-8df0-6dfc736974fc" containerName="glance-httpd" containerID="cri-o://47606a45613cf7114a76c400c1843cb813019eb670c001e863dc21927c75e3fe" gracePeriod=30 Mar 14 08:51:07 crc kubenswrapper[4886]: I0314 08:51:07.447799 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a298745-dd74-4ed3-b21b-648f2adb47dc" path="/var/lib/kubelet/pods/7a298745-dd74-4ed3-b21b-648f2adb47dc/volumes" Mar 14 08:51:08 crc kubenswrapper[4886]: I0314 08:51:08.069032 4886 generic.go:334] "Generic (PLEG): container finished" podID="5e8bae2d-4b7c-46c3-af57-f65ec5efbc98" containerID="34151a332fad6f6a36c8d715367f9797b0568169af1bac1eb52a50a1ae26e7f4" exitCode=0 Mar 14 08:51:08 crc kubenswrapper[4886]: I0314 08:51:08.069070 4886 generic.go:334] "Generic (PLEG): container finished" podID="5e8bae2d-4b7c-46c3-af57-f65ec5efbc98" containerID="95fb4b296c3f6707ce932813c589ce8a7e283297377abdb5a8ece77bbdbaefd8" exitCode=2 Mar 14 08:51:08 crc kubenswrapper[4886]: I0314 08:51:08.069080 4886 generic.go:334] "Generic (PLEG): container finished" podID="5e8bae2d-4b7c-46c3-af57-f65ec5efbc98" containerID="2bc974c1152dd23a058733114421ca1b35b7799bb8399375201cf04df6621401" exitCode=0 Mar 14 08:51:08 crc kubenswrapper[4886]: I0314 08:51:08.069106 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98","Type":"ContainerDied","Data":"34151a332fad6f6a36c8d715367f9797b0568169af1bac1eb52a50a1ae26e7f4"} Mar 14 08:51:08 crc kubenswrapper[4886]: I0314 08:51:08.069165 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98","Type":"ContainerDied","Data":"95fb4b296c3f6707ce932813c589ce8a7e283297377abdb5a8ece77bbdbaefd8"} Mar 14 08:51:08 crc kubenswrapper[4886]: I0314 
08:51:08.069176 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98","Type":"ContainerDied","Data":"2bc974c1152dd23a058733114421ca1b35b7799bb8399375201cf04df6621401"} Mar 14 08:51:08 crc kubenswrapper[4886]: I0314 08:51:08.071003 4886 generic.go:334] "Generic (PLEG): container finished" podID="2e6e3f18-a42d-4d01-8df0-6dfc736974fc" containerID="96f8cdebff573a8c3e09bc3e2053ceb6cdb6393a88427bd750d6b1506ae2441f" exitCode=143 Mar 14 08:51:08 crc kubenswrapper[4886]: I0314 08:51:08.071147 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e6e3f18-a42d-4d01-8df0-6dfc736974fc","Type":"ContainerDied","Data":"96f8cdebff573a8c3e09bc3e2053ceb6cdb6393a88427bd750d6b1506ae2441f"} Mar 14 08:51:08 crc kubenswrapper[4886]: I0314 08:51:08.145929 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-66c6bc56b6-25jn4" podUID="3f8100ac-c606-4eb3-afd6-07be9de44f42" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.168:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.168:8443: connect: connection refused" Mar 14 08:51:08 crc kubenswrapper[4886]: I0314 08:51:08.568777 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 08:51:08 crc kubenswrapper[4886]: I0314 08:51:08.569311 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8c271fab-7815-4aab-86c5-3e3919077e2e" containerName="glance-log" containerID="cri-o://631c2e8c5eeadab20304b857dacd13e38e482252eba22abb961e0ca93f650019" gracePeriod=30 Mar 14 08:51:08 crc kubenswrapper[4886]: I0314 08:51:08.569462 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8c271fab-7815-4aab-86c5-3e3919077e2e" containerName="glance-httpd" 
containerID="cri-o://1b4bce9c22bd8d147e8d50bf51f21a60bbb27d8a53b19b69ed2ada0d4426195e" gracePeriod=30 Mar 14 08:51:09 crc kubenswrapper[4886]: I0314 08:51:09.083005 4886 generic.go:334] "Generic (PLEG): container finished" podID="8c271fab-7815-4aab-86c5-3e3919077e2e" containerID="631c2e8c5eeadab20304b857dacd13e38e482252eba22abb961e0ca93f650019" exitCode=143 Mar 14 08:51:09 crc kubenswrapper[4886]: I0314 08:51:09.083078 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8c271fab-7815-4aab-86c5-3e3919077e2e","Type":"ContainerDied","Data":"631c2e8c5eeadab20304b857dacd13e38e482252eba22abb961e0ca93f650019"} Mar 14 08:51:09 crc kubenswrapper[4886]: I0314 08:51:09.085207 4886 generic.go:334] "Generic (PLEG): container finished" podID="4a9ffec0-3aa9-46a6-87b9-aadc1021683c" containerID="73d43bea51b4f69093e0094852fed6e07a712fa8984fafd978f07dad7e19c2f5" exitCode=0 Mar 14 08:51:09 crc kubenswrapper[4886]: I0314 08:51:09.085253 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64796d9fb-7nw8p" event={"ID":"4a9ffec0-3aa9-46a6-87b9-aadc1021683c","Type":"ContainerDied","Data":"73d43bea51b4f69093e0094852fed6e07a712fa8984fafd978f07dad7e19c2f5"} Mar 14 08:51:09 crc kubenswrapper[4886]: I0314 08:51:09.520014 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-768658d555-ttc2f" Mar 14 08:51:09 crc kubenswrapper[4886]: I0314 08:51:09.528641 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-768658d555-ttc2f" Mar 14 08:51:09 crc kubenswrapper[4886]: I0314 08:51:09.840576 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-64796d9fb-7nw8p" Mar 14 08:51:09 crc kubenswrapper[4886]: I0314 08:51:09.860864 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-config\") pod \"4a9ffec0-3aa9-46a6-87b9-aadc1021683c\" (UID: \"4a9ffec0-3aa9-46a6-87b9-aadc1021683c\") " Mar 14 08:51:09 crc kubenswrapper[4886]: I0314 08:51:09.860975 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-combined-ca-bundle\") pod \"4a9ffec0-3aa9-46a6-87b9-aadc1021683c\" (UID: \"4a9ffec0-3aa9-46a6-87b9-aadc1021683c\") " Mar 14 08:51:09 crc kubenswrapper[4886]: I0314 08:51:09.861871 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx2bc\" (UniqueName: \"kubernetes.io/projected/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-kube-api-access-xx2bc\") pod \"4a9ffec0-3aa9-46a6-87b9-aadc1021683c\" (UID: \"4a9ffec0-3aa9-46a6-87b9-aadc1021683c\") " Mar 14 08:51:09 crc kubenswrapper[4886]: I0314 08:51:09.862235 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-ovndb-tls-certs\") pod \"4a9ffec0-3aa9-46a6-87b9-aadc1021683c\" (UID: \"4a9ffec0-3aa9-46a6-87b9-aadc1021683c\") " Mar 14 08:51:09 crc kubenswrapper[4886]: I0314 08:51:09.862301 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-httpd-config\") pod \"4a9ffec0-3aa9-46a6-87b9-aadc1021683c\" (UID: \"4a9ffec0-3aa9-46a6-87b9-aadc1021683c\") " Mar 14 08:51:09 crc kubenswrapper[4886]: I0314 08:51:09.919208 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "4a9ffec0-3aa9-46a6-87b9-aadc1021683c" (UID: "4a9ffec0-3aa9-46a6-87b9-aadc1021683c"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:09 crc kubenswrapper[4886]: I0314 08:51:09.919790 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-kube-api-access-xx2bc" (OuterVolumeSpecName: "kube-api-access-xx2bc") pod "4a9ffec0-3aa9-46a6-87b9-aadc1021683c" (UID: "4a9ffec0-3aa9-46a6-87b9-aadc1021683c"). InnerVolumeSpecName "kube-api-access-xx2bc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:51:09 crc kubenswrapper[4886]: I0314 08:51:09.940831 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a9ffec0-3aa9-46a6-87b9-aadc1021683c" (UID: "4a9ffec0-3aa9-46a6-87b9-aadc1021683c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:09 crc kubenswrapper[4886]: I0314 08:51:09.961149 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-config" (OuterVolumeSpecName: "config") pod "4a9ffec0-3aa9-46a6-87b9-aadc1021683c" (UID: "4a9ffec0-3aa9-46a6-87b9-aadc1021683c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:09 crc kubenswrapper[4886]: I0314 08:51:09.967429 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:09 crc kubenswrapper[4886]: I0314 08:51:09.967466 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:09 crc kubenswrapper[4886]: I0314 08:51:09.967484 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx2bc\" (UniqueName: \"kubernetes.io/projected/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-kube-api-access-xx2bc\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:09 crc kubenswrapper[4886]: I0314 08:51:09.967497 4886 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:09 crc kubenswrapper[4886]: I0314 08:51:09.989535 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "4a9ffec0-3aa9-46a6-87b9-aadc1021683c" (UID: "4a9ffec0-3aa9-46a6-87b9-aadc1021683c"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:10 crc kubenswrapper[4886]: I0314 08:51:10.069705 4886 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a9ffec0-3aa9-46a6-87b9-aadc1021683c-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:10 crc kubenswrapper[4886]: I0314 08:51:10.097444 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64796d9fb-7nw8p" event={"ID":"4a9ffec0-3aa9-46a6-87b9-aadc1021683c","Type":"ContainerDied","Data":"4b32ab325479c6c85640a603a65bf2a4fd21e761a66b0f89c9e174e3f823d63d"} Mar 14 08:51:10 crc kubenswrapper[4886]: I0314 08:51:10.097506 4886 scope.go:117] "RemoveContainer" containerID="3bf2cf0d70f5e796ba25e67afed2c4623142c6adabfe77613df16f974770bf8a" Mar 14 08:51:10 crc kubenswrapper[4886]: I0314 08:51:10.097522 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-64796d9fb-7nw8p" Mar 14 08:51:10 crc kubenswrapper[4886]: I0314 08:51:10.131437 4886 scope.go:117] "RemoveContainer" containerID="73d43bea51b4f69093e0094852fed6e07a712fa8984fafd978f07dad7e19c2f5" Mar 14 08:51:10 crc kubenswrapper[4886]: I0314 08:51:10.133256 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-64796d9fb-7nw8p"] Mar 14 08:51:10 crc kubenswrapper[4886]: I0314 08:51:10.142893 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-64796d9fb-7nw8p"] Mar 14 08:51:10 crc kubenswrapper[4886]: I0314 08:51:10.975020 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 08:51:10 crc kubenswrapper[4886]: I0314 08:51:10.988106 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-config-data\") pod \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " Mar 14 08:51:10 crc kubenswrapper[4886]: I0314 08:51:10.988164 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-httpd-run\") pod \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " Mar 14 08:51:10 crc kubenswrapper[4886]: I0314 08:51:10.988219 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-combined-ca-bundle\") pod \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " Mar 14 08:51:10 crc kubenswrapper[4886]: I0314 08:51:10.988242 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-scripts\") pod \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " Mar 14 08:51:10 crc kubenswrapper[4886]: I0314 08:51:10.988279 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nrv4\" (UniqueName: \"kubernetes.io/projected/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-kube-api-access-6nrv4\") pod \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " Mar 14 08:51:10 crc kubenswrapper[4886]: I0314 08:51:10.988653 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2e6e3f18-a42d-4d01-8df0-6dfc736974fc" (UID: "2e6e3f18-a42d-4d01-8df0-6dfc736974fc"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:51:10 crc kubenswrapper[4886]: I0314 08:51:10.988992 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-logs\") pod \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " Mar 14 08:51:10 crc kubenswrapper[4886]: I0314 08:51:10.989204 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-public-tls-certs\") pod \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " Mar 14 08:51:10 crc kubenswrapper[4886]: I0314 08:51:10.989306 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\" (UID: \"2e6e3f18-a42d-4d01-8df0-6dfc736974fc\") " Mar 14 08:51:10 crc kubenswrapper[4886]: I0314 08:51:10.989725 4886 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:10 crc kubenswrapper[4886]: I0314 08:51:10.992512 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-logs" (OuterVolumeSpecName: "logs") pod "2e6e3f18-a42d-4d01-8df0-6dfc736974fc" (UID: "2e6e3f18-a42d-4d01-8df0-6dfc736974fc"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.001417 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-kube-api-access-6nrv4" (OuterVolumeSpecName: "kube-api-access-6nrv4") pod "2e6e3f18-a42d-4d01-8df0-6dfc736974fc" (UID: "2e6e3f18-a42d-4d01-8df0-6dfc736974fc"). InnerVolumeSpecName "kube-api-access-6nrv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.002335 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "2e6e3f18-a42d-4d01-8df0-6dfc736974fc" (UID: "2e6e3f18-a42d-4d01-8df0-6dfc736974fc"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.019312 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-scripts" (OuterVolumeSpecName: "scripts") pod "2e6e3f18-a42d-4d01-8df0-6dfc736974fc" (UID: "2e6e3f18-a42d-4d01-8df0-6dfc736974fc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.083786 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e6e3f18-a42d-4d01-8df0-6dfc736974fc" (UID: "2e6e3f18-a42d-4d01-8df0-6dfc736974fc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.094108 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-config-data" (OuterVolumeSpecName: "config-data") pod "2e6e3f18-a42d-4d01-8df0-6dfc736974fc" (UID: "2e6e3f18-a42d-4d01-8df0-6dfc736974fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.095311 4886 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.095341 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.095350 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.095360 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.095368 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nrv4\" (UniqueName: \"kubernetes.io/projected/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-kube-api-access-6nrv4\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.095378 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-logs\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.122278 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2e6e3f18-a42d-4d01-8df0-6dfc736974fc" (UID: "2e6e3f18-a42d-4d01-8df0-6dfc736974fc"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.136267 4886 generic.go:334] "Generic (PLEG): container finished" podID="2e6e3f18-a42d-4d01-8df0-6dfc736974fc" containerID="47606a45613cf7114a76c400c1843cb813019eb670c001e863dc21927c75e3fe" exitCode=0 Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.136308 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e6e3f18-a42d-4d01-8df0-6dfc736974fc","Type":"ContainerDied","Data":"47606a45613cf7114a76c400c1843cb813019eb670c001e863dc21927c75e3fe"} Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.136342 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e6e3f18-a42d-4d01-8df0-6dfc736974fc","Type":"ContainerDied","Data":"8724e40188eace52b94c6fae5458724d2da8f36a9f87806d291011b9defc7745"} Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.136361 4886 scope.go:117] "RemoveContainer" containerID="47606a45613cf7114a76c400c1843cb813019eb670c001e863dc21927c75e3fe" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.136362 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.174868 4886 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.201215 4886 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6e3f18-a42d-4d01-8df0-6dfc736974fc-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.201245 4886 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.206600 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.213765 4886 scope.go:117] "RemoveContainer" containerID="96f8cdebff573a8c3e09bc3e2053ceb6cdb6393a88427bd750d6b1506ae2441f" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.231812 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.239357 4886 scope.go:117] "RemoveContainer" containerID="47606a45613cf7114a76c400c1843cb813019eb670c001e863dc21927c75e3fe" Mar 14 08:51:11 crc kubenswrapper[4886]: E0314 08:51:11.244310 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47606a45613cf7114a76c400c1843cb813019eb670c001e863dc21927c75e3fe\": container with ID starting with 47606a45613cf7114a76c400c1843cb813019eb670c001e863dc21927c75e3fe not found: ID does not exist" containerID="47606a45613cf7114a76c400c1843cb813019eb670c001e863dc21927c75e3fe" Mar 14 08:51:11 crc 
kubenswrapper[4886]: I0314 08:51:11.244353 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47606a45613cf7114a76c400c1843cb813019eb670c001e863dc21927c75e3fe"} err="failed to get container status \"47606a45613cf7114a76c400c1843cb813019eb670c001e863dc21927c75e3fe\": rpc error: code = NotFound desc = could not find container \"47606a45613cf7114a76c400c1843cb813019eb670c001e863dc21927c75e3fe\": container with ID starting with 47606a45613cf7114a76c400c1843cb813019eb670c001e863dc21927c75e3fe not found: ID does not exist" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.244377 4886 scope.go:117] "RemoveContainer" containerID="96f8cdebff573a8c3e09bc3e2053ceb6cdb6393a88427bd750d6b1506ae2441f" Mar 14 08:51:11 crc kubenswrapper[4886]: E0314 08:51:11.244714 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96f8cdebff573a8c3e09bc3e2053ceb6cdb6393a88427bd750d6b1506ae2441f\": container with ID starting with 96f8cdebff573a8c3e09bc3e2053ceb6cdb6393a88427bd750d6b1506ae2441f not found: ID does not exist" containerID="96f8cdebff573a8c3e09bc3e2053ceb6cdb6393a88427bd750d6b1506ae2441f" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.245144 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96f8cdebff573a8c3e09bc3e2053ceb6cdb6393a88427bd750d6b1506ae2441f"} err="failed to get container status \"96f8cdebff573a8c3e09bc3e2053ceb6cdb6393a88427bd750d6b1506ae2441f\": rpc error: code = NotFound desc = could not find container \"96f8cdebff573a8c3e09bc3e2053ceb6cdb6393a88427bd750d6b1506ae2441f\": container with ID starting with 96f8cdebff573a8c3e09bc3e2053ceb6cdb6393a88427bd750d6b1506ae2441f not found: ID does not exist" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.262770 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 08:51:11 crc 
kubenswrapper[4886]: E0314 08:51:11.263199 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a298745-dd74-4ed3-b21b-648f2adb47dc" containerName="barbican-keystone-listener" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.263220 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a298745-dd74-4ed3-b21b-648f2adb47dc" containerName="barbican-keystone-listener" Mar 14 08:51:11 crc kubenswrapper[4886]: E0314 08:51:11.263240 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7dae5e3-c486-4f68-bc83-54cda54cd52b" containerName="mariadb-database-create" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.263247 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7dae5e3-c486-4f68-bc83-54cda54cd52b" containerName="mariadb-database-create" Mar 14 08:51:11 crc kubenswrapper[4886]: E0314 08:51:11.263270 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="639641ff-f838-43ec-bc7a-c8313a5dc254" containerName="mariadb-database-create" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.263279 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="639641ff-f838-43ec-bc7a-c8313a5dc254" containerName="mariadb-database-create" Mar 14 08:51:11 crc kubenswrapper[4886]: E0314 08:51:11.263294 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e6e3f18-a42d-4d01-8df0-6dfc736974fc" containerName="glance-httpd" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.263302 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e6e3f18-a42d-4d01-8df0-6dfc736974fc" containerName="glance-httpd" Mar 14 08:51:11 crc kubenswrapper[4886]: E0314 08:51:11.263312 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a9ffec0-3aa9-46a6-87b9-aadc1021683c" containerName="neutron-httpd" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.263320 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a9ffec0-3aa9-46a6-87b9-aadc1021683c" 
containerName="neutron-httpd" Mar 14 08:51:11 crc kubenswrapper[4886]: E0314 08:51:11.263331 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f21173e4-e8f0-46b1-8c84-0453259409aa" containerName="mariadb-account-create-update" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.263336 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f21173e4-e8f0-46b1-8c84-0453259409aa" containerName="mariadb-account-create-update" Mar 14 08:51:11 crc kubenswrapper[4886]: E0314 08:51:11.263344 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a9ffec0-3aa9-46a6-87b9-aadc1021683c" containerName="neutron-api" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.263349 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a9ffec0-3aa9-46a6-87b9-aadc1021683c" containerName="neutron-api" Mar 14 08:51:11 crc kubenswrapper[4886]: E0314 08:51:11.263362 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="874c5fb0-7bf3-46a1-9be1-71bc8b49cb38" containerName="mariadb-account-create-update" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.263369 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="874c5fb0-7bf3-46a1-9be1-71bc8b49cb38" containerName="mariadb-account-create-update" Mar 14 08:51:11 crc kubenswrapper[4886]: E0314 08:51:11.263379 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0858eab0-1359-49c7-89eb-fe94498572dc" containerName="mariadb-database-create" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.263386 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="0858eab0-1359-49c7-89eb-fe94498572dc" containerName="mariadb-database-create" Mar 14 08:51:11 crc kubenswrapper[4886]: E0314 08:51:11.263396 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a298745-dd74-4ed3-b21b-648f2adb47dc" containerName="barbican-keystone-listener-log" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.263403 4886 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="7a298745-dd74-4ed3-b21b-648f2adb47dc" containerName="barbican-keystone-listener-log" Mar 14 08:51:11 crc kubenswrapper[4886]: E0314 08:51:11.263424 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdaed1d0-b39b-4d77-8b14-4b9cb1de478a" containerName="mariadb-account-create-update" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.263434 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdaed1d0-b39b-4d77-8b14-4b9cb1de478a" containerName="mariadb-account-create-update" Mar 14 08:51:11 crc kubenswrapper[4886]: E0314 08:51:11.263444 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e6e3f18-a42d-4d01-8df0-6dfc736974fc" containerName="glance-log" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.263451 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e6e3f18-a42d-4d01-8df0-6dfc736974fc" containerName="glance-log" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.263666 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a298745-dd74-4ed3-b21b-648f2adb47dc" containerName="barbican-keystone-listener" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.263685 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7dae5e3-c486-4f68-bc83-54cda54cd52b" containerName="mariadb-database-create" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.263696 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a298745-dd74-4ed3-b21b-648f2adb47dc" containerName="barbican-keystone-listener-log" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.263706 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="874c5fb0-7bf3-46a1-9be1-71bc8b49cb38" containerName="mariadb-account-create-update" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.263719 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="0858eab0-1359-49c7-89eb-fe94498572dc" containerName="mariadb-database-create" Mar 
14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.263737 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e6e3f18-a42d-4d01-8df0-6dfc736974fc" containerName="glance-log" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.263747 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a9ffec0-3aa9-46a6-87b9-aadc1021683c" containerName="neutron-api" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.263757 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a9ffec0-3aa9-46a6-87b9-aadc1021683c" containerName="neutron-httpd" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.263770 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdaed1d0-b39b-4d77-8b14-4b9cb1de478a" containerName="mariadb-account-create-update" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.263782 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="639641ff-f838-43ec-bc7a-c8313a5dc254" containerName="mariadb-database-create" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.263791 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e6e3f18-a42d-4d01-8df0-6dfc736974fc" containerName="glance-httpd" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.263803 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f21173e4-e8f0-46b1-8c84-0453259409aa" containerName="mariadb-account-create-update" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.264857 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.271849 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.272036 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.273002 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.405003 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq5p2\" (UniqueName: \"kubernetes.io/projected/90c0e7b8-3991-44c6-b013-f55286cc08ff-kube-api-access-zq5p2\") pod \"glance-default-external-api-0\" (UID: \"90c0e7b8-3991-44c6-b013-f55286cc08ff\") " pod="openstack/glance-default-external-api-0" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.405062 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90c0e7b8-3991-44c6-b013-f55286cc08ff-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"90c0e7b8-3991-44c6-b013-f55286cc08ff\") " pod="openstack/glance-default-external-api-0" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.405082 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90c0e7b8-3991-44c6-b013-f55286cc08ff-config-data\") pod \"glance-default-external-api-0\" (UID: \"90c0e7b8-3991-44c6-b013-f55286cc08ff\") " pod="openstack/glance-default-external-api-0" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.405112 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90c0e7b8-3991-44c6-b013-f55286cc08ff-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"90c0e7b8-3991-44c6-b013-f55286cc08ff\") " pod="openstack/glance-default-external-api-0" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.405160 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90c0e7b8-3991-44c6-b013-f55286cc08ff-logs\") pod \"glance-default-external-api-0\" (UID: \"90c0e7b8-3991-44c6-b013-f55286cc08ff\") " pod="openstack/glance-default-external-api-0" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.405192 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90c0e7b8-3991-44c6-b013-f55286cc08ff-scripts\") pod \"glance-default-external-api-0\" (UID: \"90c0e7b8-3991-44c6-b013-f55286cc08ff\") " pod="openstack/glance-default-external-api-0" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.405233 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90c0e7b8-3991-44c6-b013-f55286cc08ff-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"90c0e7b8-3991-44c6-b013-f55286cc08ff\") " pod="openstack/glance-default-external-api-0" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.405276 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"90c0e7b8-3991-44c6-b013-f55286cc08ff\") " pod="openstack/glance-default-external-api-0" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.433004 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e6e3f18-a42d-4d01-8df0-6dfc736974fc" 
path="/var/lib/kubelet/pods/2e6e3f18-a42d-4d01-8df0-6dfc736974fc/volumes" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.433668 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a9ffec0-3aa9-46a6-87b9-aadc1021683c" path="/var/lib/kubelet/pods/4a9ffec0-3aa9-46a6-87b9-aadc1021683c/volumes" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.507163 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"90c0e7b8-3991-44c6-b013-f55286cc08ff\") " pod="openstack/glance-default-external-api-0" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.507249 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq5p2\" (UniqueName: \"kubernetes.io/projected/90c0e7b8-3991-44c6-b013-f55286cc08ff-kube-api-access-zq5p2\") pod \"glance-default-external-api-0\" (UID: \"90c0e7b8-3991-44c6-b013-f55286cc08ff\") " pod="openstack/glance-default-external-api-0" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.507284 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90c0e7b8-3991-44c6-b013-f55286cc08ff-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"90c0e7b8-3991-44c6-b013-f55286cc08ff\") " pod="openstack/glance-default-external-api-0" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.507307 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90c0e7b8-3991-44c6-b013-f55286cc08ff-config-data\") pod \"glance-default-external-api-0\" (UID: \"90c0e7b8-3991-44c6-b013-f55286cc08ff\") " pod="openstack/glance-default-external-api-0" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.507334 4886 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90c0e7b8-3991-44c6-b013-f55286cc08ff-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"90c0e7b8-3991-44c6-b013-f55286cc08ff\") " pod="openstack/glance-default-external-api-0" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.507369 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90c0e7b8-3991-44c6-b013-f55286cc08ff-logs\") pod \"glance-default-external-api-0\" (UID: \"90c0e7b8-3991-44c6-b013-f55286cc08ff\") " pod="openstack/glance-default-external-api-0" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.507399 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90c0e7b8-3991-44c6-b013-f55286cc08ff-scripts\") pod \"glance-default-external-api-0\" (UID: \"90c0e7b8-3991-44c6-b013-f55286cc08ff\") " pod="openstack/glance-default-external-api-0" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.507443 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90c0e7b8-3991-44c6-b013-f55286cc08ff-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"90c0e7b8-3991-44c6-b013-f55286cc08ff\") " pod="openstack/glance-default-external-api-0" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.508002 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90c0e7b8-3991-44c6-b013-f55286cc08ff-logs\") pod \"glance-default-external-api-0\" (UID: \"90c0e7b8-3991-44c6-b013-f55286cc08ff\") " pod="openstack/glance-default-external-api-0" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.508007 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90c0e7b8-3991-44c6-b013-f55286cc08ff-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"90c0e7b8-3991-44c6-b013-f55286cc08ff\") " pod="openstack/glance-default-external-api-0" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.508095 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"90c0e7b8-3991-44c6-b013-f55286cc08ff\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.512087 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90c0e7b8-3991-44c6-b013-f55286cc08ff-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"90c0e7b8-3991-44c6-b013-f55286cc08ff\") " pod="openstack/glance-default-external-api-0" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.512552 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90c0e7b8-3991-44c6-b013-f55286cc08ff-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"90c0e7b8-3991-44c6-b013-f55286cc08ff\") " pod="openstack/glance-default-external-api-0" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.512914 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90c0e7b8-3991-44c6-b013-f55286cc08ff-scripts\") pod \"glance-default-external-api-0\" (UID: \"90c0e7b8-3991-44c6-b013-f55286cc08ff\") " pod="openstack/glance-default-external-api-0" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.518358 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90c0e7b8-3991-44c6-b013-f55286cc08ff-config-data\") pod \"glance-default-external-api-0\" (UID: \"90c0e7b8-3991-44c6-b013-f55286cc08ff\") 
" pod="openstack/glance-default-external-api-0" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.525218 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq5p2\" (UniqueName: \"kubernetes.io/projected/90c0e7b8-3991-44c6-b013-f55286cc08ff-kube-api-access-zq5p2\") pod \"glance-default-external-api-0\" (UID: \"90c0e7b8-3991-44c6-b013-f55286cc08ff\") " pod="openstack/glance-default-external-api-0" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.537535 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"90c0e7b8-3991-44c6-b013-f55286cc08ff\") " pod="openstack/glance-default-external-api-0" Mar 14 08:51:11 crc kubenswrapper[4886]: I0314 08:51:11.587880 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.138873 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.158437 4886 generic.go:334] "Generic (PLEG): container finished" podID="8c271fab-7815-4aab-86c5-3e3919077e2e" containerID="1b4bce9c22bd8d147e8d50bf51f21a60bbb27d8a53b19b69ed2ada0d4426195e" exitCode=0 Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.158498 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8c271fab-7815-4aab-86c5-3e3919077e2e","Type":"ContainerDied","Data":"1b4bce9c22bd8d147e8d50bf51f21a60bbb27d8a53b19b69ed2ada0d4426195e"} Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.185520 4886 generic.go:334] "Generic (PLEG): container finished" podID="5e8bae2d-4b7c-46c3-af57-f65ec5efbc98" containerID="41e68ca14d6b71ce8564e3e7f5dd30016b98c576a9aee808aa822604150b0e1f" exitCode=0 Mar 14 08:51:12 crc 
kubenswrapper[4886]: I0314 08:51:12.185562 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98","Type":"ContainerDied","Data":"41e68ca14d6b71ce8564e3e7f5dd30016b98c576a9aee808aa822604150b0e1f"} Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.293342 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6rrxl"] Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.295307 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6rrxl" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.299236 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.299449 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.299980 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-xd4mm" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.308411 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6rrxl"] Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.345880 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.429584 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18707ca6-9c35-482a-a186-da58e60b7540-config-data\") pod \"nova-cell0-conductor-db-sync-6rrxl\" (UID: \"18707ca6-9c35-482a-a186-da58e60b7540\") " pod="openstack/nova-cell0-conductor-db-sync-6rrxl" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.429661 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kcnq\" (UniqueName: \"kubernetes.io/projected/18707ca6-9c35-482a-a186-da58e60b7540-kube-api-access-6kcnq\") pod \"nova-cell0-conductor-db-sync-6rrxl\" (UID: \"18707ca6-9c35-482a-a186-da58e60b7540\") " pod="openstack/nova-cell0-conductor-db-sync-6rrxl" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.429708 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18707ca6-9c35-482a-a186-da58e60b7540-scripts\") pod \"nova-cell0-conductor-db-sync-6rrxl\" (UID: \"18707ca6-9c35-482a-a186-da58e60b7540\") " pod="openstack/nova-cell0-conductor-db-sync-6rrxl" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.429751 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18707ca6-9c35-482a-a186-da58e60b7540-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6rrxl\" (UID: \"18707ca6-9c35-482a-a186-da58e60b7540\") " pod="openstack/nova-cell0-conductor-db-sync-6rrxl" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.489465 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.530771 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c271fab-7815-4aab-86c5-3e3919077e2e-internal-tls-certs\") pod \"8c271fab-7815-4aab-86c5-3e3919077e2e\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.530815 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c271fab-7815-4aab-86c5-3e3919077e2e-httpd-run\") pod \"8c271fab-7815-4aab-86c5-3e3919077e2e\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.530842 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c271fab-7815-4aab-86c5-3e3919077e2e-combined-ca-bundle\") pod \"8c271fab-7815-4aab-86c5-3e3919077e2e\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.530880 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c271fab-7815-4aab-86c5-3e3919077e2e-config-data\") pod \"8c271fab-7815-4aab-86c5-3e3919077e2e\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.530951 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c271fab-7815-4aab-86c5-3e3919077e2e-logs\") pod \"8c271fab-7815-4aab-86c5-3e3919077e2e\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.530974 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49dn8\" (UniqueName: 
\"kubernetes.io/projected/8c271fab-7815-4aab-86c5-3e3919077e2e-kube-api-access-49dn8\") pod \"8c271fab-7815-4aab-86c5-3e3919077e2e\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.531256 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c271fab-7815-4aab-86c5-3e3919077e2e-scripts\") pod \"8c271fab-7815-4aab-86c5-3e3919077e2e\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.531290 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"8c271fab-7815-4aab-86c5-3e3919077e2e\" (UID: \"8c271fab-7815-4aab-86c5-3e3919077e2e\") " Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.531506 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18707ca6-9c35-482a-a186-da58e60b7540-config-data\") pod \"nova-cell0-conductor-db-sync-6rrxl\" (UID: \"18707ca6-9c35-482a-a186-da58e60b7540\") " pod="openstack/nova-cell0-conductor-db-sync-6rrxl" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.531563 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kcnq\" (UniqueName: \"kubernetes.io/projected/18707ca6-9c35-482a-a186-da58e60b7540-kube-api-access-6kcnq\") pod \"nova-cell0-conductor-db-sync-6rrxl\" (UID: \"18707ca6-9c35-482a-a186-da58e60b7540\") " pod="openstack/nova-cell0-conductor-db-sync-6rrxl" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.531593 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18707ca6-9c35-482a-a186-da58e60b7540-scripts\") pod \"nova-cell0-conductor-db-sync-6rrxl\" (UID: \"18707ca6-9c35-482a-a186-da58e60b7540\") " 
pod="openstack/nova-cell0-conductor-db-sync-6rrxl" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.531631 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18707ca6-9c35-482a-a186-da58e60b7540-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6rrxl\" (UID: \"18707ca6-9c35-482a-a186-da58e60b7540\") " pod="openstack/nova-cell0-conductor-db-sync-6rrxl" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.532876 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c271fab-7815-4aab-86c5-3e3919077e2e-logs" (OuterVolumeSpecName: "logs") pod "8c271fab-7815-4aab-86c5-3e3919077e2e" (UID: "8c271fab-7815-4aab-86c5-3e3919077e2e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.543810 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c271fab-7815-4aab-86c5-3e3919077e2e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8c271fab-7815-4aab-86c5-3e3919077e2e" (UID: "8c271fab-7815-4aab-86c5-3e3919077e2e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.546082 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c271fab-7815-4aab-86c5-3e3919077e2e-kube-api-access-49dn8" (OuterVolumeSpecName: "kube-api-access-49dn8") pod "8c271fab-7815-4aab-86c5-3e3919077e2e" (UID: "8c271fab-7815-4aab-86c5-3e3919077e2e"). InnerVolumeSpecName "kube-api-access-49dn8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.546993 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18707ca6-9c35-482a-a186-da58e60b7540-scripts\") pod \"nova-cell0-conductor-db-sync-6rrxl\" (UID: \"18707ca6-9c35-482a-a186-da58e60b7540\") " pod="openstack/nova-cell0-conductor-db-sync-6rrxl" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.548410 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18707ca6-9c35-482a-a186-da58e60b7540-config-data\") pod \"nova-cell0-conductor-db-sync-6rrxl\" (UID: \"18707ca6-9c35-482a-a186-da58e60b7540\") " pod="openstack/nova-cell0-conductor-db-sync-6rrxl" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.549869 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18707ca6-9c35-482a-a186-da58e60b7540-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6rrxl\" (UID: \"18707ca6-9c35-482a-a186-da58e60b7540\") " pod="openstack/nova-cell0-conductor-db-sync-6rrxl" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.549932 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c271fab-7815-4aab-86c5-3e3919077e2e-scripts" (OuterVolumeSpecName: "scripts") pod "8c271fab-7815-4aab-86c5-3e3919077e2e" (UID: "8c271fab-7815-4aab-86c5-3e3919077e2e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.550428 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "8c271fab-7815-4aab-86c5-3e3919077e2e" (UID: "8c271fab-7815-4aab-86c5-3e3919077e2e"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.555941 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kcnq\" (UniqueName: \"kubernetes.io/projected/18707ca6-9c35-482a-a186-da58e60b7540-kube-api-access-6kcnq\") pod \"nova-cell0-conductor-db-sync-6rrxl\" (UID: \"18707ca6-9c35-482a-a186-da58e60b7540\") " pod="openstack/nova-cell0-conductor-db-sync-6rrxl" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.606856 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c271fab-7815-4aab-86c5-3e3919077e2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c271fab-7815-4aab-86c5-3e3919077e2e" (UID: "8c271fab-7815-4aab-86c5-3e3919077e2e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.632579 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-run-httpd\") pod \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\" (UID: \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\") " Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.632841 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p69rh\" (UniqueName: \"kubernetes.io/projected/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-kube-api-access-p69rh\") pod \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\" (UID: \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\") " Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.632883 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-sg-core-conf-yaml\") pod \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\" (UID: \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\") " Mar 14 08:51:12 
crc kubenswrapper[4886]: I0314 08:51:12.632973 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-scripts\") pod \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\" (UID: \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\") " Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.633007 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-combined-ca-bundle\") pod \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\" (UID: \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\") " Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.633028 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-log-httpd\") pod \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\" (UID: \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\") " Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.633068 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-config-data\") pod \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\" (UID: \"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98\") " Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.633356 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5e8bae2d-4b7c-46c3-af57-f65ec5efbc98" (UID: "5e8bae2d-4b7c-46c3-af57-f65ec5efbc98"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.634264 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5e8bae2d-4b7c-46c3-af57-f65ec5efbc98" (UID: "5e8bae2d-4b7c-46c3-af57-f65ec5efbc98"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.634712 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c271fab-7815-4aab-86c5-3e3919077e2e-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.634731 4886 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.634777 4886 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.634790 4886 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c271fab-7815-4aab-86c5-3e3919077e2e-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.634800 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c271fab-7815-4aab-86c5-3e3919077e2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.634810 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c271fab-7815-4aab-86c5-3e3919077e2e-logs\") on node 
\"crc\" DevicePath \"\"" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.635390 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49dn8\" (UniqueName: \"kubernetes.io/projected/8c271fab-7815-4aab-86c5-3e3919077e2e-kube-api-access-49dn8\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.635411 4886 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.636761 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c271fab-7815-4aab-86c5-3e3919077e2e-config-data" (OuterVolumeSpecName: "config-data") pod "8c271fab-7815-4aab-86c5-3e3919077e2e" (UID: "8c271fab-7815-4aab-86c5-3e3919077e2e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.641325 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-kube-api-access-p69rh" (OuterVolumeSpecName: "kube-api-access-p69rh") pod "5e8bae2d-4b7c-46c3-af57-f65ec5efbc98" (UID: "5e8bae2d-4b7c-46c3-af57-f65ec5efbc98"). InnerVolumeSpecName "kube-api-access-p69rh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.641419 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-scripts" (OuterVolumeSpecName: "scripts") pod "5e8bae2d-4b7c-46c3-af57-f65ec5efbc98" (UID: "5e8bae2d-4b7c-46c3-af57-f65ec5efbc98"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.642889 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6rrxl" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.658081 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c271fab-7815-4aab-86c5-3e3919077e2e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8c271fab-7815-4aab-86c5-3e3919077e2e" (UID: "8c271fab-7815-4aab-86c5-3e3919077e2e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.668343 4886 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.694869 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5e8bae2d-4b7c-46c3-af57-f65ec5efbc98" (UID: "5e8bae2d-4b7c-46c3-af57-f65ec5efbc98"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.728496 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e8bae2d-4b7c-46c3-af57-f65ec5efbc98" (UID: "5e8bae2d-4b7c-46c3-af57-f65ec5efbc98"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.736697 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.736728 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.736741 4886 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.736751 4886 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c271fab-7815-4aab-86c5-3e3919077e2e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.736762 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c271fab-7815-4aab-86c5-3e3919077e2e-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.736771 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p69rh\" (UniqueName: \"kubernetes.io/projected/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-kube-api-access-p69rh\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.736780 4886 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.772024 4886 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-config-data" (OuterVolumeSpecName: "config-data") pod "5e8bae2d-4b7c-46c3-af57-f65ec5efbc98" (UID: "5e8bae2d-4b7c-46c3-af57-f65ec5efbc98"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:12 crc kubenswrapper[4886]: I0314 08:51:12.838439 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.144547 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6rrxl"] Mar 14 08:51:13 crc kubenswrapper[4886]: W0314 08:51:13.150731 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18707ca6_9c35_482a_a186_da58e60b7540.slice/crio-cfc672140b398344c16e415bf28add43198030ba2d3fa3f645b983b5556e3fa6 WatchSource:0}: Error finding container cfc672140b398344c16e415bf28add43198030ba2d3fa3f645b983b5556e3fa6: Status 404 returned error can't find the container with id cfc672140b398344c16e415bf28add43198030ba2d3fa3f645b983b5556e3fa6 Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.216007 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6rrxl" event={"ID":"18707ca6-9c35-482a-a186-da58e60b7540","Type":"ContainerStarted","Data":"cfc672140b398344c16e415bf28add43198030ba2d3fa3f645b983b5556e3fa6"} Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.219884 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e8bae2d-4b7c-46c3-af57-f65ec5efbc98","Type":"ContainerDied","Data":"dbb7d6aeebbaf30259bc6e6be0e95910c7355d8fe7cc034d674c50445ecb5669"} Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 
08:51:13.219941 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.219949 4886 scope.go:117] "RemoveContainer" containerID="34151a332fad6f6a36c8d715367f9797b0568169af1bac1eb52a50a1ae26e7f4" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.224081 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8c271fab-7815-4aab-86c5-3e3919077e2e","Type":"ContainerDied","Data":"68b4492fc3239b01393e297469c729e37087d2813b6527f73b955bab7d082404"} Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.224160 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.232206 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"90c0e7b8-3991-44c6-b013-f55286cc08ff","Type":"ContainerStarted","Data":"0844fc89b5604984b34fd6ab33708cc50d6e98eaa595f70d8ca8cd3878a92f06"} Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.232260 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"90c0e7b8-3991-44c6-b013-f55286cc08ff","Type":"ContainerStarted","Data":"79eb5b7358cf19e623b3426bf093f621873bcbdf4fa55e141303b25c89660bde"} Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.259902 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.260749 4886 scope.go:117] "RemoveContainer" containerID="95fb4b296c3f6707ce932813c589ce8a7e283297377abdb5a8ece77bbdbaefd8" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.320559 4886 scope.go:117] "RemoveContainer" containerID="2bc974c1152dd23a058733114421ca1b35b7799bb8399375201cf04df6621401" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 
08:51:13.333086 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.349380 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:51:13 crc kubenswrapper[4886]: E0314 08:51:13.350374 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c271fab-7815-4aab-86c5-3e3919077e2e" containerName="glance-httpd" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.350394 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c271fab-7815-4aab-86c5-3e3919077e2e" containerName="glance-httpd" Mar 14 08:51:13 crc kubenswrapper[4886]: E0314 08:51:13.350417 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e8bae2d-4b7c-46c3-af57-f65ec5efbc98" containerName="ceilometer-central-agent" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.350430 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e8bae2d-4b7c-46c3-af57-f65ec5efbc98" containerName="ceilometer-central-agent" Mar 14 08:51:13 crc kubenswrapper[4886]: E0314 08:51:13.350454 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e8bae2d-4b7c-46c3-af57-f65ec5efbc98" containerName="sg-core" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.350462 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e8bae2d-4b7c-46c3-af57-f65ec5efbc98" containerName="sg-core" Mar 14 08:51:13 crc kubenswrapper[4886]: E0314 08:51:13.350487 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e8bae2d-4b7c-46c3-af57-f65ec5efbc98" containerName="proxy-httpd" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.350494 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e8bae2d-4b7c-46c3-af57-f65ec5efbc98" containerName="proxy-httpd" Mar 14 08:51:13 crc kubenswrapper[4886]: E0314 08:51:13.350526 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c271fab-7815-4aab-86c5-3e3919077e2e" 
containerName="glance-log" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.350534 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c271fab-7815-4aab-86c5-3e3919077e2e" containerName="glance-log" Mar 14 08:51:13 crc kubenswrapper[4886]: E0314 08:51:13.350567 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e8bae2d-4b7c-46c3-af57-f65ec5efbc98" containerName="ceilometer-notification-agent" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.350576 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e8bae2d-4b7c-46c3-af57-f65ec5efbc98" containerName="ceilometer-notification-agent" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.351016 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c271fab-7815-4aab-86c5-3e3919077e2e" containerName="glance-httpd" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.351042 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e8bae2d-4b7c-46c3-af57-f65ec5efbc98" containerName="sg-core" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.351080 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e8bae2d-4b7c-46c3-af57-f65ec5efbc98" containerName="ceilometer-central-agent" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.351090 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e8bae2d-4b7c-46c3-af57-f65ec5efbc98" containerName="ceilometer-notification-agent" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.351144 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c271fab-7815-4aab-86c5-3e3919077e2e" containerName="glance-log" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.351164 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e8bae2d-4b7c-46c3-af57-f65ec5efbc98" containerName="proxy-httpd" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.370458 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.378504 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.378770 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.381917 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.396653 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.397274 4886 scope.go:117] "RemoveContainer" containerID="41e68ca14d6b71ce8564e3e7f5dd30016b98c576a9aee808aa822604150b0e1f" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.409656 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.420647 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.428317 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.431036 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.431101 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.441744 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e8bae2d-4b7c-46c3-af57-f65ec5efbc98" path="/var/lib/kubelet/pods/5e8bae2d-4b7c-46c3-af57-f65ec5efbc98/volumes" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.441850 4886 scope.go:117] "RemoveContainer" containerID="1b4bce9c22bd8d147e8d50bf51f21a60bbb27d8a53b19b69ed2ada0d4426195e" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.444051 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c271fab-7815-4aab-86c5-3e3919077e2e" path="/var/lib/kubelet/pods/8c271fab-7815-4aab-86c5-3e3919077e2e/volumes" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.445402 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.466606 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73ebf9d0-186c-4236-9af0-821ec1b0e4db-log-httpd\") pod \"ceilometer-0\" (UID: \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\") " pod="openstack/ceilometer-0" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.466721 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73ebf9d0-186c-4236-9af0-821ec1b0e4db-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\") " 
pod="openstack/ceilometer-0" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.466810 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73ebf9d0-186c-4236-9af0-821ec1b0e4db-scripts\") pod \"ceilometer-0\" (UID: \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\") " pod="openstack/ceilometer-0" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.466845 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73ebf9d0-186c-4236-9af0-821ec1b0e4db-run-httpd\") pod \"ceilometer-0\" (UID: \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\") " pod="openstack/ceilometer-0" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.466904 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73ebf9d0-186c-4236-9af0-821ec1b0e4db-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\") " pod="openstack/ceilometer-0" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.466968 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73ebf9d0-186c-4236-9af0-821ec1b0e4db-config-data\") pod \"ceilometer-0\" (UID: \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\") " pod="openstack/ceilometer-0" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.467169 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87cgf\" (UniqueName: \"kubernetes.io/projected/73ebf9d0-186c-4236-9af0-821ec1b0e4db-kube-api-access-87cgf\") pod \"ceilometer-0\" (UID: \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\") " pod="openstack/ceilometer-0" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.476443 4886 scope.go:117] "RemoveContainer" 
containerID="631c2e8c5eeadab20304b857dacd13e38e482252eba22abb961e0ca93f650019" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.568885 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9e64d41-fe1b-453e-807e-b3e94a62a804-logs\") pod \"glance-default-internal-api-0\" (UID: \"d9e64d41-fe1b-453e-807e-b3e94a62a804\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.569110 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73ebf9d0-186c-4236-9af0-821ec1b0e4db-scripts\") pod \"ceilometer-0\" (UID: \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\") " pod="openstack/ceilometer-0" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.569338 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73ebf9d0-186c-4236-9af0-821ec1b0e4db-run-httpd\") pod \"ceilometer-0\" (UID: \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\") " pod="openstack/ceilometer-0" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.569662 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73ebf9d0-186c-4236-9af0-821ec1b0e4db-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\") " pod="openstack/ceilometer-0" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.569843 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhmgc\" (UniqueName: \"kubernetes.io/projected/d9e64d41-fe1b-453e-807e-b3e94a62a804-kube-api-access-xhmgc\") pod \"glance-default-internal-api-0\" (UID: \"d9e64d41-fe1b-453e-807e-b3e94a62a804\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 
08:51:13.570002 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73ebf9d0-186c-4236-9af0-821ec1b0e4db-config-data\") pod \"ceilometer-0\" (UID: \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\") " pod="openstack/ceilometer-0" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.570146 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87cgf\" (UniqueName: \"kubernetes.io/projected/73ebf9d0-186c-4236-9af0-821ec1b0e4db-kube-api-access-87cgf\") pod \"ceilometer-0\" (UID: \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\") " pod="openstack/ceilometer-0" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.572035 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9e64d41-fe1b-453e-807e-b3e94a62a804-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d9e64d41-fe1b-453e-807e-b3e94a62a804\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.572112 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9e64d41-fe1b-453e-807e-b3e94a62a804-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d9e64d41-fe1b-453e-807e-b3e94a62a804\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.572180 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9e64d41-fe1b-453e-807e-b3e94a62a804-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d9e64d41-fe1b-453e-807e-b3e94a62a804\") " pod="openstack/glance-default-internal-api-0" Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.572216 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73ebf9d0-186c-4236-9af0-821ec1b0e4db-log-httpd\") pod \"ceilometer-0\" (UID: \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\") " pod="openstack/ceilometer-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.572227 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73ebf9d0-186c-4236-9af0-821ec1b0e4db-run-httpd\") pod \"ceilometer-0\" (UID: \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\") " pod="openstack/ceilometer-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.572240 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9e64d41-fe1b-453e-807e-b3e94a62a804-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d9e64d41-fe1b-453e-807e-b3e94a62a804\") " pod="openstack/glance-default-internal-api-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.572346 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"d9e64d41-fe1b-453e-807e-b3e94a62a804\") " pod="openstack/glance-default-internal-api-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.572472 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73ebf9d0-186c-4236-9af0-821ec1b0e4db-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\") " pod="openstack/ceilometer-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.572497 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d9e64d41-fe1b-453e-807e-b3e94a62a804-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d9e64d41-fe1b-453e-807e-b3e94a62a804\") " pod="openstack/glance-default-internal-api-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.573156 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73ebf9d0-186c-4236-9af0-821ec1b0e4db-log-httpd\") pod \"ceilometer-0\" (UID: \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\") " pod="openstack/ceilometer-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.580002 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73ebf9d0-186c-4236-9af0-821ec1b0e4db-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\") " pod="openstack/ceilometer-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.580713 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73ebf9d0-186c-4236-9af0-821ec1b0e4db-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\") " pod="openstack/ceilometer-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.581660 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73ebf9d0-186c-4236-9af0-821ec1b0e4db-scripts\") pod \"ceilometer-0\" (UID: \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\") " pod="openstack/ceilometer-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.582764 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73ebf9d0-186c-4236-9af0-821ec1b0e4db-config-data\") pod \"ceilometer-0\" (UID: \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\") " pod="openstack/ceilometer-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.597949 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87cgf\" (UniqueName: \"kubernetes.io/projected/73ebf9d0-186c-4236-9af0-821ec1b0e4db-kube-api-access-87cgf\") pod \"ceilometer-0\" (UID: \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\") " pod="openstack/ceilometer-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.674456 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9e64d41-fe1b-453e-807e-b3e94a62a804-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d9e64d41-fe1b-453e-807e-b3e94a62a804\") " pod="openstack/glance-default-internal-api-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.674504 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9e64d41-fe1b-453e-807e-b3e94a62a804-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d9e64d41-fe1b-453e-807e-b3e94a62a804\") " pod="openstack/glance-default-internal-api-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.674527 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9e64d41-fe1b-453e-807e-b3e94a62a804-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d9e64d41-fe1b-453e-807e-b3e94a62a804\") " pod="openstack/glance-default-internal-api-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.674556 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"d9e64d41-fe1b-453e-807e-b3e94a62a804\") " pod="openstack/glance-default-internal-api-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.674594 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d9e64d41-fe1b-453e-807e-b3e94a62a804-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d9e64d41-fe1b-453e-807e-b3e94a62a804\") " pod="openstack/glance-default-internal-api-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.674614 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9e64d41-fe1b-453e-807e-b3e94a62a804-logs\") pod \"glance-default-internal-api-0\" (UID: \"d9e64d41-fe1b-453e-807e-b3e94a62a804\") " pod="openstack/glance-default-internal-api-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.674692 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhmgc\" (UniqueName: \"kubernetes.io/projected/d9e64d41-fe1b-453e-807e-b3e94a62a804-kube-api-access-xhmgc\") pod \"glance-default-internal-api-0\" (UID: \"d9e64d41-fe1b-453e-807e-b3e94a62a804\") " pod="openstack/glance-default-internal-api-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.674754 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9e64d41-fe1b-453e-807e-b3e94a62a804-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d9e64d41-fe1b-453e-807e-b3e94a62a804\") " pod="openstack/glance-default-internal-api-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.674888 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"d9e64d41-fe1b-453e-807e-b3e94a62a804\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.675710 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d9e64d41-fe1b-453e-807e-b3e94a62a804-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d9e64d41-fe1b-453e-807e-b3e94a62a804\") " pod="openstack/glance-default-internal-api-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.675935 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9e64d41-fe1b-453e-807e-b3e94a62a804-logs\") pod \"glance-default-internal-api-0\" (UID: \"d9e64d41-fe1b-453e-807e-b3e94a62a804\") " pod="openstack/glance-default-internal-api-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.679149 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9e64d41-fe1b-453e-807e-b3e94a62a804-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d9e64d41-fe1b-453e-807e-b3e94a62a804\") " pod="openstack/glance-default-internal-api-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.679167 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9e64d41-fe1b-453e-807e-b3e94a62a804-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d9e64d41-fe1b-453e-807e-b3e94a62a804\") " pod="openstack/glance-default-internal-api-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.691170 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9e64d41-fe1b-453e-807e-b3e94a62a804-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d9e64d41-fe1b-453e-807e-b3e94a62a804\") " pod="openstack/glance-default-internal-api-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.691619 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9e64d41-fe1b-453e-807e-b3e94a62a804-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d9e64d41-fe1b-453e-807e-b3e94a62a804\") " pod="openstack/glance-default-internal-api-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.698559 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhmgc\" (UniqueName: \"kubernetes.io/projected/d9e64d41-fe1b-453e-807e-b3e94a62a804-kube-api-access-xhmgc\") pod \"glance-default-internal-api-0\" (UID: \"d9e64d41-fe1b-453e-807e-b3e94a62a804\") " pod="openstack/glance-default-internal-api-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.713119 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"d9e64d41-fe1b-453e-807e-b3e94a62a804\") " pod="openstack/glance-default-internal-api-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.725101 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 14 08:51:13 crc kubenswrapper[4886]: I0314 08:51:13.762690 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 14 08:51:14 crc kubenswrapper[4886]: I0314 08:51:14.232845 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 14 08:51:14 crc kubenswrapper[4886]: W0314 08:51:14.249328 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73ebf9d0_186c_4236_9af0_821ec1b0e4db.slice/crio-719a10a1ecb003ebf73688c554b74f92ca25938e97a30aaee11401c98301ee95 WatchSource:0}: Error finding container 719a10a1ecb003ebf73688c554b74f92ca25938e97a30aaee11401c98301ee95: Status 404 returned error can't find the container with id 719a10a1ecb003ebf73688c554b74f92ca25938e97a30aaee11401c98301ee95
Mar 14 08:51:14 crc kubenswrapper[4886]: I0314 08:51:14.257381 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"90c0e7b8-3991-44c6-b013-f55286cc08ff","Type":"ContainerStarted","Data":"2b6d47ec5a90ecb8dee4e47e69f9b6dddd458e63e9e128af09499ab9c6e40f09"}
Mar 14 08:51:14 crc kubenswrapper[4886]: I0314 08:51:14.277483 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.277424694 podStartE2EDuration="3.277424694s" podCreationTimestamp="2026-03-14 08:51:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:51:14.276475057 +0000 UTC m=+1409.524926694" watchObservedRunningTime="2026-03-14 08:51:14.277424694 +0000 UTC m=+1409.525876331"
Mar 14 08:51:14 crc kubenswrapper[4886]: W0314 08:51:14.442929 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9e64d41_fe1b_453e_807e_b3e94a62a804.slice/crio-397c5a57ba68328bb165ae45647669a1bed60e567689b3834c8a90725166a3c1 WatchSource:0}: Error finding container 397c5a57ba68328bb165ae45647669a1bed60e567689b3834c8a90725166a3c1: Status 404 returned error can't find the container with id 397c5a57ba68328bb165ae45647669a1bed60e567689b3834c8a90725166a3c1
Mar 14 08:51:14 crc kubenswrapper[4886]: I0314 08:51:14.445681 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 14 08:51:14 crc kubenswrapper[4886]: I0314 08:51:14.649813 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-66c6bc56b6-25jn4"
Mar 14 08:51:14 crc kubenswrapper[4886]: I0314 08:51:14.696399 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f8100ac-c606-4eb3-afd6-07be9de44f42-horizon-tls-certs\") pod \"3f8100ac-c606-4eb3-afd6-07be9de44f42\" (UID: \"3f8100ac-c606-4eb3-afd6-07be9de44f42\") "
Mar 14 08:51:14 crc kubenswrapper[4886]: I0314 08:51:14.696511 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3f8100ac-c606-4eb3-afd6-07be9de44f42-horizon-secret-key\") pod \"3f8100ac-c606-4eb3-afd6-07be9de44f42\" (UID: \"3f8100ac-c606-4eb3-afd6-07be9de44f42\") "
Mar 14 08:51:14 crc kubenswrapper[4886]: I0314 08:51:14.696547 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f8100ac-c606-4eb3-afd6-07be9de44f42-scripts\") pod \"3f8100ac-c606-4eb3-afd6-07be9de44f42\" (UID: \"3f8100ac-c606-4eb3-afd6-07be9de44f42\") "
Mar 14 08:51:14 crc kubenswrapper[4886]: I0314 08:51:14.696622 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8100ac-c606-4eb3-afd6-07be9de44f42-combined-ca-bundle\") pod \"3f8100ac-c606-4eb3-afd6-07be9de44f42\" (UID: \"3f8100ac-c606-4eb3-afd6-07be9de44f42\") "
Mar 14 08:51:14 crc kubenswrapper[4886]: I0314 08:51:14.696711 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3f8100ac-c606-4eb3-afd6-07be9de44f42-config-data\") pod \"3f8100ac-c606-4eb3-afd6-07be9de44f42\" (UID: \"3f8100ac-c606-4eb3-afd6-07be9de44f42\") "
Mar 14 08:51:14 crc kubenswrapper[4886]: I0314 08:51:14.696771 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvwmh\" (UniqueName: \"kubernetes.io/projected/3f8100ac-c606-4eb3-afd6-07be9de44f42-kube-api-access-xvwmh\") pod \"3f8100ac-c606-4eb3-afd6-07be9de44f42\" (UID: \"3f8100ac-c606-4eb3-afd6-07be9de44f42\") "
Mar 14 08:51:14 crc kubenswrapper[4886]: I0314 08:51:14.696825 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f8100ac-c606-4eb3-afd6-07be9de44f42-logs\") pod \"3f8100ac-c606-4eb3-afd6-07be9de44f42\" (UID: \"3f8100ac-c606-4eb3-afd6-07be9de44f42\") "
Mar 14 08:51:14 crc kubenswrapper[4886]: I0314 08:51:14.698080 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f8100ac-c606-4eb3-afd6-07be9de44f42-logs" (OuterVolumeSpecName: "logs") pod "3f8100ac-c606-4eb3-afd6-07be9de44f42" (UID: "3f8100ac-c606-4eb3-afd6-07be9de44f42"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 08:51:14 crc kubenswrapper[4886]: I0314 08:51:14.702810 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f8100ac-c606-4eb3-afd6-07be9de44f42-kube-api-access-xvwmh" (OuterVolumeSpecName: "kube-api-access-xvwmh") pod "3f8100ac-c606-4eb3-afd6-07be9de44f42" (UID: "3f8100ac-c606-4eb3-afd6-07be9de44f42"). InnerVolumeSpecName "kube-api-access-xvwmh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:51:14 crc kubenswrapper[4886]: I0314 08:51:14.703075 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8100ac-c606-4eb3-afd6-07be9de44f42-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "3f8100ac-c606-4eb3-afd6-07be9de44f42" (UID: "3f8100ac-c606-4eb3-afd6-07be9de44f42"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:51:14 crc kubenswrapper[4886]: I0314 08:51:14.725477 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f8100ac-c606-4eb3-afd6-07be9de44f42-scripts" (OuterVolumeSpecName: "scripts") pod "3f8100ac-c606-4eb3-afd6-07be9de44f42" (UID: "3f8100ac-c606-4eb3-afd6-07be9de44f42"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 08:51:14 crc kubenswrapper[4886]: I0314 08:51:14.741046 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f8100ac-c606-4eb3-afd6-07be9de44f42-config-data" (OuterVolumeSpecName: "config-data") pod "3f8100ac-c606-4eb3-afd6-07be9de44f42" (UID: "3f8100ac-c606-4eb3-afd6-07be9de44f42"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 08:51:14 crc kubenswrapper[4886]: I0314 08:51:14.742060 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8100ac-c606-4eb3-afd6-07be9de44f42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f8100ac-c606-4eb3-afd6-07be9de44f42" (UID: "3f8100ac-c606-4eb3-afd6-07be9de44f42"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:51:14 crc kubenswrapper[4886]: I0314 08:51:14.779152 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8100ac-c606-4eb3-afd6-07be9de44f42-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "3f8100ac-c606-4eb3-afd6-07be9de44f42" (UID: "3f8100ac-c606-4eb3-afd6-07be9de44f42"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:51:14 crc kubenswrapper[4886]: I0314 08:51:14.799588 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3f8100ac-c606-4eb3-afd6-07be9de44f42-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 08:51:14 crc kubenswrapper[4886]: I0314 08:51:14.799618 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvwmh\" (UniqueName: \"kubernetes.io/projected/3f8100ac-c606-4eb3-afd6-07be9de44f42-kube-api-access-xvwmh\") on node \"crc\" DevicePath \"\""
Mar 14 08:51:14 crc kubenswrapper[4886]: I0314 08:51:14.799630 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f8100ac-c606-4eb3-afd6-07be9de44f42-logs\") on node \"crc\" DevicePath \"\""
Mar 14 08:51:14 crc kubenswrapper[4886]: I0314 08:51:14.799639 4886 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f8100ac-c606-4eb3-afd6-07be9de44f42-horizon-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 14 08:51:14 crc kubenswrapper[4886]: I0314 08:51:14.799649 4886 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3f8100ac-c606-4eb3-afd6-07be9de44f42-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 14 08:51:14 crc kubenswrapper[4886]: I0314 08:51:14.799657 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f8100ac-c606-4eb3-afd6-07be9de44f42-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 08:51:14 crc kubenswrapper[4886]: I0314 08:51:14.799667 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8100ac-c606-4eb3-afd6-07be9de44f42-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 08:51:15 crc kubenswrapper[4886]: E0314 08:51:15.263832 4886 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a745438_cb17_4626_96ed_51c7de75a976.slice/crio-34a8e812ae52f0153cc28bb93f8075bc5850e81a6a08c0f6de662f2c45e13fbf\": RecentStats: unable to find data in memory cache]"
Mar 14 08:51:15 crc kubenswrapper[4886]: I0314 08:51:15.270060 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73ebf9d0-186c-4236-9af0-821ec1b0e4db","Type":"ContainerStarted","Data":"719a10a1ecb003ebf73688c554b74f92ca25938e97a30aaee11401c98301ee95"}
Mar 14 08:51:15 crc kubenswrapper[4886]: I0314 08:51:15.272039 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d9e64d41-fe1b-453e-807e-b3e94a62a804","Type":"ContainerStarted","Data":"c2a054996eab95c36736d904a78018d3963f8dd1b9964d199701ef2311e26e21"}
Mar 14 08:51:15 crc kubenswrapper[4886]: I0314 08:51:15.272078 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d9e64d41-fe1b-453e-807e-b3e94a62a804","Type":"ContainerStarted","Data":"397c5a57ba68328bb165ae45647669a1bed60e567689b3834c8a90725166a3c1"}
Mar 14 08:51:15 crc kubenswrapper[4886]: I0314 08:51:15.274481 4886 generic.go:334] "Generic (PLEG): container finished" podID="3f8100ac-c606-4eb3-afd6-07be9de44f42" containerID="d5c20c7ccfa0dac83302b656a5fa7355d00d8cae8b6c3a4e27263c26c15716cc" exitCode=137
Mar 14 08:51:15 crc kubenswrapper[4886]: I0314 08:51:15.274526 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66c6bc56b6-25jn4" event={"ID":"3f8100ac-c606-4eb3-afd6-07be9de44f42","Type":"ContainerDied","Data":"d5c20c7ccfa0dac83302b656a5fa7355d00d8cae8b6c3a4e27263c26c15716cc"}
Mar 14 08:51:15 crc kubenswrapper[4886]: I0314 08:51:15.274746 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66c6bc56b6-25jn4" event={"ID":"3f8100ac-c606-4eb3-afd6-07be9de44f42","Type":"ContainerDied","Data":"cdf73c46710d22075f2126e643228c7cb08bbeea14c43229d6d437e527c964a0"}
Mar 14 08:51:15 crc kubenswrapper[4886]: I0314 08:51:15.274538 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-66c6bc56b6-25jn4"
Mar 14 08:51:15 crc kubenswrapper[4886]: I0314 08:51:15.274775 4886 scope.go:117] "RemoveContainer" containerID="844e2a0189883ab734eb2b66c54a57447201e522d5837d8fbe5fc822ee14df57"
Mar 14 08:51:15 crc kubenswrapper[4886]: I0314 08:51:15.337886 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-66c6bc56b6-25jn4"]
Mar 14 08:51:15 crc kubenswrapper[4886]: I0314 08:51:15.348879 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-66c6bc56b6-25jn4"]
Mar 14 08:51:15 crc kubenswrapper[4886]: I0314 08:51:15.439841 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f8100ac-c606-4eb3-afd6-07be9de44f42" path="/var/lib/kubelet/pods/3f8100ac-c606-4eb3-afd6-07be9de44f42/volumes"
Mar 14 08:51:15 crc kubenswrapper[4886]: I0314 08:51:15.553191 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 14 08:51:15 crc kubenswrapper[4886]: I0314 08:51:15.569227 4886 scope.go:117] "RemoveContainer" containerID="d5c20c7ccfa0dac83302b656a5fa7355d00d8cae8b6c3a4e27263c26c15716cc"
Mar 14 08:51:15 crc kubenswrapper[4886]: I0314 08:51:15.621761 4886 scope.go:117] "RemoveContainer" containerID="844e2a0189883ab734eb2b66c54a57447201e522d5837d8fbe5fc822ee14df57"
Mar 14 08:51:15 crc kubenswrapper[4886]: E0314 08:51:15.624722 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"844e2a0189883ab734eb2b66c54a57447201e522d5837d8fbe5fc822ee14df57\": container with ID starting with 844e2a0189883ab734eb2b66c54a57447201e522d5837d8fbe5fc822ee14df57 not found: ID does not exist" containerID="844e2a0189883ab734eb2b66c54a57447201e522d5837d8fbe5fc822ee14df57"
Mar 14 08:51:15 crc kubenswrapper[4886]: I0314 08:51:15.624779 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"844e2a0189883ab734eb2b66c54a57447201e522d5837d8fbe5fc822ee14df57"} err="failed to get container status \"844e2a0189883ab734eb2b66c54a57447201e522d5837d8fbe5fc822ee14df57\": rpc error: code = NotFound desc = could not find container \"844e2a0189883ab734eb2b66c54a57447201e522d5837d8fbe5fc822ee14df57\": container with ID starting with 844e2a0189883ab734eb2b66c54a57447201e522d5837d8fbe5fc822ee14df57 not found: ID does not exist"
Mar 14 08:51:15 crc kubenswrapper[4886]: I0314 08:51:15.624811 4886 scope.go:117] "RemoveContainer" containerID="d5c20c7ccfa0dac83302b656a5fa7355d00d8cae8b6c3a4e27263c26c15716cc"
Mar 14 08:51:15 crc kubenswrapper[4886]: E0314 08:51:15.626493 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5c20c7ccfa0dac83302b656a5fa7355d00d8cae8b6c3a4e27263c26c15716cc\": container with ID starting with d5c20c7ccfa0dac83302b656a5fa7355d00d8cae8b6c3a4e27263c26c15716cc not found: ID does not exist" containerID="d5c20c7ccfa0dac83302b656a5fa7355d00d8cae8b6c3a4e27263c26c15716cc"
Mar 14 08:51:15 crc kubenswrapper[4886]: I0314 08:51:15.626537 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c20c7ccfa0dac83302b656a5fa7355d00d8cae8b6c3a4e27263c26c15716cc"} err="failed to get container status \"d5c20c7ccfa0dac83302b656a5fa7355d00d8cae8b6c3a4e27263c26c15716cc\": rpc error: code = NotFound desc = could not find container \"d5c20c7ccfa0dac83302b656a5fa7355d00d8cae8b6c3a4e27263c26c15716cc\": container with ID starting with d5c20c7ccfa0dac83302b656a5fa7355d00d8cae8b6c3a4e27263c26c15716cc not found: ID does not exist"
Mar 14 08:51:16 crc kubenswrapper[4886]: I0314 08:51:16.293903 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73ebf9d0-186c-4236-9af0-821ec1b0e4db","Type":"ContainerStarted","Data":"392c2eb03365881be657992e4a96125dcbebff3b67dcc23ccd2f69b843290d50"}
Mar 14 08:51:16 crc kubenswrapper[4886]: I0314 08:51:16.294384 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73ebf9d0-186c-4236-9af0-821ec1b0e4db","Type":"ContainerStarted","Data":"0c84c5c78f9fc6834ff674960e4a9e69eca599144056d8d5edf7fce36760a11e"}
Mar 14 08:51:16 crc kubenswrapper[4886]: I0314 08:51:16.298308 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d9e64d41-fe1b-453e-807e-b3e94a62a804","Type":"ContainerStarted","Data":"08db69cefeff4cc16465b8611a38566564dc3cdfe63560e19c693e9637dc30a3"}
Mar 14 08:51:16 crc kubenswrapper[4886]: I0314 08:51:16.332845 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.332824647 podStartE2EDuration="3.332824647s" podCreationTimestamp="2026-03-14 08:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:51:16.322240207 +0000 UTC m=+1411.570691874" watchObservedRunningTime="2026-03-14 08:51:16.332824647 +0000 UTC m=+1411.581276294"
Mar 14 08:51:17 crc kubenswrapper[4886]: I0314 08:51:17.315415 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73ebf9d0-186c-4236-9af0-821ec1b0e4db","Type":"ContainerStarted","Data":"0b271cd3bc2037d7c929d5b5e6bd1566c4ecec0be5a0094010de6895e43e0c7f"}
Mar 14 08:51:21 crc kubenswrapper[4886]: I0314 08:51:21.142492 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"]
Mar 14 08:51:21 crc kubenswrapper[4886]: I0314 08:51:21.143261 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="cf9e5273-6d41-439e-98a2-263c64a3b39b" containerName="watcher-decision-engine" containerID="cri-o://86378c1e127706a7e9da49d37044a06cd0930fa54daf3cab70a3d9e7c6d19316" gracePeriod=30
Mar 14 08:51:21 crc kubenswrapper[4886]: I0314 08:51:21.588766 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 14 08:51:21 crc kubenswrapper[4886]: I0314 08:51:21.588822 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 14 08:51:21 crc kubenswrapper[4886]: I0314 08:51:21.628293 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 14 08:51:21 crc kubenswrapper[4886]: I0314 08:51:21.640369 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 14 08:51:22 crc kubenswrapper[4886]: I0314 08:51:22.387503 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 14 08:51:22 crc kubenswrapper[4886]: I0314 08:51:22.387787 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 14 08:51:23 crc kubenswrapper[4886]: I0314 08:51:23.763069 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 14 08:51:23 crc kubenswrapper[4886]: I0314 08:51:23.763429 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 14 08:51:23 crc kubenswrapper[4886]: I0314 08:51:23.797755 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 14 08:51:23 crc kubenswrapper[4886]: I0314 08:51:23.807334 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 14 08:51:24 crc kubenswrapper[4886]: I0314 08:51:24.407428 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6rrxl" event={"ID":"18707ca6-9c35-482a-a186-da58e60b7540","Type":"ContainerStarted","Data":"ba18efa88eb6c20a44b891f4747f36acc1447f1ab84ac4cf7820a3240379480e"}
Mar 14 08:51:24 crc kubenswrapper[4886]: I0314 08:51:24.410606 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73ebf9d0-186c-4236-9af0-821ec1b0e4db","Type":"ContainerStarted","Data":"51699cabba0b86ef2fdbc6c70cfaf95d0bf6d915e40fa430ce1757a991a1b072"}
Mar 14 08:51:24 crc kubenswrapper[4886]: I0314 08:51:24.410884 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 14 08:51:24 crc kubenswrapper[4886]: I0314 08:51:24.410911 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 14 08:51:24 crc kubenswrapper[4886]: I0314 08:51:24.411209 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73ebf9d0-186c-4236-9af0-821ec1b0e4db" containerName="sg-core" containerID="cri-o://0b271cd3bc2037d7c929d5b5e6bd1566c4ecec0be5a0094010de6895e43e0c7f" gracePeriod=30
Mar 14 08:51:24 crc kubenswrapper[4886]: I0314 08:51:24.411216 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73ebf9d0-186c-4236-9af0-821ec1b0e4db" containerName="ceilometer-notification-agent" containerID="cri-o://392c2eb03365881be657992e4a96125dcbebff3b67dcc23ccd2f69b843290d50" gracePeriod=30
Mar 14 08:51:24 crc kubenswrapper[4886]: I0314 08:51:24.411219 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73ebf9d0-186c-4236-9af0-821ec1b0e4db" containerName="proxy-httpd" containerID="cri-o://51699cabba0b86ef2fdbc6c70cfaf95d0bf6d915e40fa430ce1757a991a1b072" gracePeriod=30
Mar 14 08:51:24 crc kubenswrapper[4886]: I0314 08:51:24.411219 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73ebf9d0-186c-4236-9af0-821ec1b0e4db" containerName="ceilometer-central-agent" containerID="cri-o://0c84c5c78f9fc6834ff674960e4a9e69eca599144056d8d5edf7fce36760a11e" gracePeriod=30
Mar 14 08:51:24 crc kubenswrapper[4886]: I0314 08:51:24.432278 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-6rrxl" podStartSLOduration=1.7400145569999999 podStartE2EDuration="12.432256951s" podCreationTimestamp="2026-03-14 08:51:12 +0000 UTC" firstStartedPulling="2026-03-14 08:51:13.155802221 +0000 UTC m=+1408.404253858" lastFinishedPulling="2026-03-14 08:51:23.848044615 +0000 UTC m=+1419.096496252" observedRunningTime="2026-03-14 08:51:24.423813342 +0000 UTC m=+1419.672264979" watchObservedRunningTime="2026-03-14 08:51:24.432256951 +0000 UTC m=+1419.680708588"
Mar 14 08:51:24 crc kubenswrapper[4886]: I0314 08:51:24.456217 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.878477648 podStartE2EDuration="11.456196241s" podCreationTimestamp="2026-03-14 08:51:13 +0000 UTC" firstStartedPulling="2026-03-14 08:51:14.251361314 +0000 UTC m=+1409.499812951" lastFinishedPulling="2026-03-14 08:51:23.829079907 +0000 UTC m=+1419.077531544" observedRunningTime="2026-03-14 08:51:24.446606989 +0000 UTC m=+1419.695058626" watchObservedRunningTime="2026-03-14 08:51:24.456196241 +0000 UTC m=+1419.704647878"
Mar 14 08:51:24 crc kubenswrapper[4886]: I0314 08:51:24.671949 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 14 08:51:24 crc kubenswrapper[4886]: I0314 08:51:24.672063 4886 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 14 08:51:24 crc kubenswrapper[4886]: I0314 08:51:24.677174 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 14 08:51:25 crc kubenswrapper[4886]: I0314 08:51:25.430307 4886 generic.go:334] "Generic (PLEG): container finished" podID="73ebf9d0-186c-4236-9af0-821ec1b0e4db" containerID="51699cabba0b86ef2fdbc6c70cfaf95d0bf6d915e40fa430ce1757a991a1b072" exitCode=0
Mar 14 08:51:25 crc kubenswrapper[4886]: I0314 08:51:25.430622 4886 generic.go:334] "Generic (PLEG): container finished" podID="73ebf9d0-186c-4236-9af0-821ec1b0e4db" containerID="0b271cd3bc2037d7c929d5b5e6bd1566c4ecec0be5a0094010de6895e43e0c7f" exitCode=2
Mar 14 08:51:25 crc kubenswrapper[4886]: I0314 08:51:25.430631 4886 generic.go:334] "Generic (PLEG): container finished" podID="73ebf9d0-186c-4236-9af0-821ec1b0e4db" containerID="392c2eb03365881be657992e4a96125dcbebff3b67dcc23ccd2f69b843290d50" exitCode=0
Mar 14 08:51:25 crc kubenswrapper[4886]: I0314 08:51:25.456420 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73ebf9d0-186c-4236-9af0-821ec1b0e4db","Type":"ContainerDied","Data":"51699cabba0b86ef2fdbc6c70cfaf95d0bf6d915e40fa430ce1757a991a1b072"}
Mar 14 08:51:25 crc kubenswrapper[4886]: I0314 08:51:25.456472 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73ebf9d0-186c-4236-9af0-821ec1b0e4db","Type":"ContainerDied","Data":"0b271cd3bc2037d7c929d5b5e6bd1566c4ecec0be5a0094010de6895e43e0c7f"}
Mar 14 08:51:25 crc kubenswrapper[4886]: I0314 08:51:25.456486 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73ebf9d0-186c-4236-9af0-821ec1b0e4db","Type":"ContainerDied","Data":"392c2eb03365881be657992e4a96125dcbebff3b67dcc23ccd2f69b843290d50"}
Mar 14 08:51:25 crc kubenswrapper[4886]: E0314 08:51:25.583193 4886 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a745438_cb17_4626_96ed_51c7de75a976.slice/crio-34a8e812ae52f0153cc28bb93f8075bc5850e81a6a08c0f6de662f2c45e13fbf\": RecentStats: unable to find data in memory cache]"
Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.444558 4886 generic.go:334] "Generic (PLEG): container finished" podID="73ebf9d0-186c-4236-9af0-821ec1b0e4db" containerID="0c84c5c78f9fc6834ff674960e4a9e69eca599144056d8d5edf7fce36760a11e" exitCode=0
Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.444615 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73ebf9d0-186c-4236-9af0-821ec1b0e4db","Type":"ContainerDied","Data":"0c84c5c78f9fc6834ff674960e4a9e69eca599144056d8d5edf7fce36760a11e"}
Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.444639 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73ebf9d0-186c-4236-9af0-821ec1b0e4db","Type":"ContainerDied","Data":"719a10a1ecb003ebf73688c554b74f92ca25938e97a30aaee11401c98301ee95"}
Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.444652 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="719a10a1ecb003ebf73688c554b74f92ca25938e97a30aaee11401c98301ee95"
Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.575872 4886 util.go:48]
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.612726 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.612855 4886 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.617208 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.659564 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73ebf9d0-186c-4236-9af0-821ec1b0e4db-combined-ca-bundle\") pod \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\" (UID: \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\") " Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.659627 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73ebf9d0-186c-4236-9af0-821ec1b0e4db-config-data\") pod \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\" (UID: \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\") " Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.659658 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73ebf9d0-186c-4236-9af0-821ec1b0e4db-log-httpd\") pod \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\" (UID: \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\") " Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.659749 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73ebf9d0-186c-4236-9af0-821ec1b0e4db-sg-core-conf-yaml\") pod \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\" (UID: 
\"73ebf9d0-186c-4236-9af0-821ec1b0e4db\") " Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.659802 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73ebf9d0-186c-4236-9af0-821ec1b0e4db-scripts\") pod \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\" (UID: \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\") " Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.659841 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87cgf\" (UniqueName: \"kubernetes.io/projected/73ebf9d0-186c-4236-9af0-821ec1b0e4db-kube-api-access-87cgf\") pod \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\" (UID: \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\") " Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.659885 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73ebf9d0-186c-4236-9af0-821ec1b0e4db-run-httpd\") pod \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\" (UID: \"73ebf9d0-186c-4236-9af0-821ec1b0e4db\") " Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.660927 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73ebf9d0-186c-4236-9af0-821ec1b0e4db-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "73ebf9d0-186c-4236-9af0-821ec1b0e4db" (UID: "73ebf9d0-186c-4236-9af0-821ec1b0e4db"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.661857 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73ebf9d0-186c-4236-9af0-821ec1b0e4db-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "73ebf9d0-186c-4236-9af0-821ec1b0e4db" (UID: "73ebf9d0-186c-4236-9af0-821ec1b0e4db"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.675350 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ebf9d0-186c-4236-9af0-821ec1b0e4db-scripts" (OuterVolumeSpecName: "scripts") pod "73ebf9d0-186c-4236-9af0-821ec1b0e4db" (UID: "73ebf9d0-186c-4236-9af0-821ec1b0e4db"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.680817 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73ebf9d0-186c-4236-9af0-821ec1b0e4db-kube-api-access-87cgf" (OuterVolumeSpecName: "kube-api-access-87cgf") pod "73ebf9d0-186c-4236-9af0-821ec1b0e4db" (UID: "73ebf9d0-186c-4236-9af0-821ec1b0e4db"). InnerVolumeSpecName "kube-api-access-87cgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.755913 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ebf9d0-186c-4236-9af0-821ec1b0e4db-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "73ebf9d0-186c-4236-9af0-821ec1b0e4db" (UID: "73ebf9d0-186c-4236-9af0-821ec1b0e4db"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.763031 4886 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73ebf9d0-186c-4236-9af0-821ec1b0e4db-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.763060 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73ebf9d0-186c-4236-9af0-821ec1b0e4db-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.763069 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87cgf\" (UniqueName: \"kubernetes.io/projected/73ebf9d0-186c-4236-9af0-821ec1b0e4db-kube-api-access-87cgf\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.763080 4886 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73ebf9d0-186c-4236-9af0-821ec1b0e4db-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.763088 4886 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73ebf9d0-186c-4236-9af0-821ec1b0e4db-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.802488 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.823598 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ebf9d0-186c-4236-9af0-821ec1b0e4db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73ebf9d0-186c-4236-9af0-821ec1b0e4db" (UID: "73ebf9d0-186c-4236-9af0-821ec1b0e4db"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.835403 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ebf9d0-186c-4236-9af0-821ec1b0e4db-config-data" (OuterVolumeSpecName: "config-data") pod "73ebf9d0-186c-4236-9af0-821ec1b0e4db" (UID: "73ebf9d0-186c-4236-9af0-821ec1b0e4db"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.866955 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/cf9e5273-6d41-439e-98a2-263c64a3b39b-custom-prometheus-ca\") pod \"cf9e5273-6d41-439e-98a2-263c64a3b39b\" (UID: \"cf9e5273-6d41-439e-98a2-263c64a3b39b\") " Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.867018 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76fgt\" (UniqueName: \"kubernetes.io/projected/cf9e5273-6d41-439e-98a2-263c64a3b39b-kube-api-access-76fgt\") pod \"cf9e5273-6d41-439e-98a2-263c64a3b39b\" (UID: \"cf9e5273-6d41-439e-98a2-263c64a3b39b\") " Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.867163 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf9e5273-6d41-439e-98a2-263c64a3b39b-combined-ca-bundle\") pod \"cf9e5273-6d41-439e-98a2-263c64a3b39b\" (UID: \"cf9e5273-6d41-439e-98a2-263c64a3b39b\") " Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.867199 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf9e5273-6d41-439e-98a2-263c64a3b39b-logs\") pod \"cf9e5273-6d41-439e-98a2-263c64a3b39b\" (UID: \"cf9e5273-6d41-439e-98a2-263c64a3b39b\") " Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.867222 4886 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf9e5273-6d41-439e-98a2-263c64a3b39b-config-data\") pod \"cf9e5273-6d41-439e-98a2-263c64a3b39b\" (UID: \"cf9e5273-6d41-439e-98a2-263c64a3b39b\") " Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.867652 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73ebf9d0-186c-4236-9af0-821ec1b0e4db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.867669 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73ebf9d0-186c-4236-9af0-821ec1b0e4db-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.871015 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf9e5273-6d41-439e-98a2-263c64a3b39b-logs" (OuterVolumeSpecName: "logs") pod "cf9e5273-6d41-439e-98a2-263c64a3b39b" (UID: "cf9e5273-6d41-439e-98a2-263c64a3b39b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.878391 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf9e5273-6d41-439e-98a2-263c64a3b39b-kube-api-access-76fgt" (OuterVolumeSpecName: "kube-api-access-76fgt") pod "cf9e5273-6d41-439e-98a2-263c64a3b39b" (UID: "cf9e5273-6d41-439e-98a2-263c64a3b39b"). InnerVolumeSpecName "kube-api-access-76fgt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.901756 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf9e5273-6d41-439e-98a2-263c64a3b39b-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "cf9e5273-6d41-439e-98a2-263c64a3b39b" (UID: "cf9e5273-6d41-439e-98a2-263c64a3b39b"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.902186 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf9e5273-6d41-439e-98a2-263c64a3b39b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf9e5273-6d41-439e-98a2-263c64a3b39b" (UID: "cf9e5273-6d41-439e-98a2-263c64a3b39b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.931277 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf9e5273-6d41-439e-98a2-263c64a3b39b-config-data" (OuterVolumeSpecName: "config-data") pod "cf9e5273-6d41-439e-98a2-263c64a3b39b" (UID: "cf9e5273-6d41-439e-98a2-263c64a3b39b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.969847 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf9e5273-6d41-439e-98a2-263c64a3b39b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.969884 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf9e5273-6d41-439e-98a2-263c64a3b39b-logs\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.969894 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf9e5273-6d41-439e-98a2-263c64a3b39b-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.969902 4886 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/cf9e5273-6d41-439e-98a2-263c64a3b39b-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:26 crc kubenswrapper[4886]: I0314 08:51:26.969912 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76fgt\" (UniqueName: \"kubernetes.io/projected/cf9e5273-6d41-439e-98a2-263c64a3b39b-kube-api-access-76fgt\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.456024 4886 generic.go:334] "Generic (PLEG): container finished" podID="cf9e5273-6d41-439e-98a2-263c64a3b39b" containerID="86378c1e127706a7e9da49d37044a06cd0930fa54daf3cab70a3d9e7c6d19316" exitCode=0 Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.457137 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.457550 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"cf9e5273-6d41-439e-98a2-263c64a3b39b","Type":"ContainerDied","Data":"86378c1e127706a7e9da49d37044a06cd0930fa54daf3cab70a3d9e7c6d19316"} Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.457577 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"cf9e5273-6d41-439e-98a2-263c64a3b39b","Type":"ContainerDied","Data":"fbcc930d9cc58764687ece8f172f12d2771b34a57db49adb3c11e62f5d2b5723"} Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.457593 4886 scope.go:117] "RemoveContainer" containerID="86378c1e127706a7e9da49d37044a06cd0930fa54daf3cab70a3d9e7c6d19316" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.457695 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.487966 4886 scope.go:117] "RemoveContainer" containerID="86378c1e127706a7e9da49d37044a06cd0930fa54daf3cab70a3d9e7c6d19316" Mar 14 08:51:27 crc kubenswrapper[4886]: E0314 08:51:27.488514 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86378c1e127706a7e9da49d37044a06cd0930fa54daf3cab70a3d9e7c6d19316\": container with ID starting with 86378c1e127706a7e9da49d37044a06cd0930fa54daf3cab70a3d9e7c6d19316 not found: ID does not exist" containerID="86378c1e127706a7e9da49d37044a06cd0930fa54daf3cab70a3d9e7c6d19316" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.488555 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86378c1e127706a7e9da49d37044a06cd0930fa54daf3cab70a3d9e7c6d19316"} err="failed to get container status 
\"86378c1e127706a7e9da49d37044a06cd0930fa54daf3cab70a3d9e7c6d19316\": rpc error: code = NotFound desc = could not find container \"86378c1e127706a7e9da49d37044a06cd0930fa54daf3cab70a3d9e7c6d19316\": container with ID starting with 86378c1e127706a7e9da49d37044a06cd0930fa54daf3cab70a3d9e7c6d19316 not found: ID does not exist" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.503759 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.522363 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.537950 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.553692 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.563016 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:51:27 crc kubenswrapper[4886]: E0314 08:51:27.563453 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73ebf9d0-186c-4236-9af0-821ec1b0e4db" containerName="proxy-httpd" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.563469 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="73ebf9d0-186c-4236-9af0-821ec1b0e4db" containerName="proxy-httpd" Mar 14 08:51:27 crc kubenswrapper[4886]: E0314 08:51:27.563489 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f8100ac-c606-4eb3-afd6-07be9de44f42" containerName="horizon" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.563496 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f8100ac-c606-4eb3-afd6-07be9de44f42" containerName="horizon" Mar 14 08:51:27 crc kubenswrapper[4886]: E0314 08:51:27.563510 4886 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3f8100ac-c606-4eb3-afd6-07be9de44f42" containerName="horizon-log" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.563516 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f8100ac-c606-4eb3-afd6-07be9de44f42" containerName="horizon-log" Mar 14 08:51:27 crc kubenswrapper[4886]: E0314 08:51:27.563534 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73ebf9d0-186c-4236-9af0-821ec1b0e4db" containerName="sg-core" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.563541 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="73ebf9d0-186c-4236-9af0-821ec1b0e4db" containerName="sg-core" Mar 14 08:51:27 crc kubenswrapper[4886]: E0314 08:51:27.563554 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73ebf9d0-186c-4236-9af0-821ec1b0e4db" containerName="ceilometer-central-agent" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.563560 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="73ebf9d0-186c-4236-9af0-821ec1b0e4db" containerName="ceilometer-central-agent" Mar 14 08:51:27 crc kubenswrapper[4886]: E0314 08:51:27.563579 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73ebf9d0-186c-4236-9af0-821ec1b0e4db" containerName="ceilometer-notification-agent" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.563586 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="73ebf9d0-186c-4236-9af0-821ec1b0e4db" containerName="ceilometer-notification-agent" Mar 14 08:51:27 crc kubenswrapper[4886]: E0314 08:51:27.563595 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf9e5273-6d41-439e-98a2-263c64a3b39b" containerName="watcher-decision-engine" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.563601 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf9e5273-6d41-439e-98a2-263c64a3b39b" containerName="watcher-decision-engine" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.563806 4886 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="73ebf9d0-186c-4236-9af0-821ec1b0e4db" containerName="ceilometer-central-agent" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.563826 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="73ebf9d0-186c-4236-9af0-821ec1b0e4db" containerName="ceilometer-notification-agent" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.563835 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f8100ac-c606-4eb3-afd6-07be9de44f42" containerName="horizon" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.563847 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f8100ac-c606-4eb3-afd6-07be9de44f42" containerName="horizon-log" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.563858 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf9e5273-6d41-439e-98a2-263c64a3b39b" containerName="watcher-decision-engine" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.563866 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="73ebf9d0-186c-4236-9af0-821ec1b0e4db" containerName="sg-core" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.563880 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="73ebf9d0-186c-4236-9af0-821ec1b0e4db" containerName="proxy-httpd" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.565696 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.572590 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.573001 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.574195 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.578044 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.579639 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.609920 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.636709 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.683403 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\") " pod="openstack/ceilometer-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.683857 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-run-httpd\") pod \"ceilometer-0\" (UID: \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\") " pod="openstack/ceilometer-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.684134 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-scripts\") pod \"ceilometer-0\" (UID: \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\") " pod="openstack/ceilometer-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.684278 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5-logs\") pod \"watcher-decision-engine-0\" (UID: \"f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.684639 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7598\" (UniqueName: \"kubernetes.io/projected/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-kube-api-access-c7598\") pod \"ceilometer-0\" (UID: \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\") " pod="openstack/ceilometer-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.684814 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-log-httpd\") pod \"ceilometer-0\" (UID: \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\") " pod="openstack/ceilometer-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.684933 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.685061 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.685267 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b64r\" (UniqueName: 
\"kubernetes.io/projected/f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5-kube-api-access-4b64r\") pod \"watcher-decision-engine-0\" (UID: \"f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.685407 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-config-data\") pod \"ceilometer-0\" (UID: \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\") " pod="openstack/ceilometer-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.685500 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5-config-data\") pod \"watcher-decision-engine-0\" (UID: \"f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.685595 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\") " pod="openstack/ceilometer-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.787773 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-config-data\") pod \"ceilometer-0\" (UID: \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\") " pod="openstack/ceilometer-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.789776 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5-config-data\") pod \"watcher-decision-engine-0\" (UID: 
\"f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.789815 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\") " pod="openstack/ceilometer-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.789853 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\") " pod="openstack/ceilometer-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.790315 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-run-httpd\") pod \"ceilometer-0\" (UID: \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\") " pod="openstack/ceilometer-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.790422 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-scripts\") pod \"ceilometer-0\" (UID: \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\") " pod="openstack/ceilometer-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.790483 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5-logs\") pod \"watcher-decision-engine-0\" (UID: \"f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.790968 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-run-httpd\") pod \"ceilometer-0\" (UID: \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\") " pod="openstack/ceilometer-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.791027 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7598\" (UniqueName: \"kubernetes.io/projected/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-kube-api-access-c7598\") pod \"ceilometer-0\" (UID: \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\") " pod="openstack/ceilometer-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.791071 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-log-httpd\") pod \"ceilometer-0\" (UID: \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\") " pod="openstack/ceilometer-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.791101 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.791165 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.791230 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b64r\" (UniqueName: \"kubernetes.io/projected/f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5-kube-api-access-4b64r\") pod 
\"watcher-decision-engine-0\" (UID: \"f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.791484 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-log-httpd\") pod \"ceilometer-0\" (UID: \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\") " pod="openstack/ceilometer-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.791692 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5-logs\") pod \"watcher-decision-engine-0\" (UID: \"f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.798170 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5-config-data\") pod \"watcher-decision-engine-0\" (UID: \"f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.798640 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-config-data\") pod \"ceilometer-0\" (UID: \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\") " pod="openstack/ceilometer-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.800396 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-scripts\") pod \"ceilometer-0\" (UID: \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\") " pod="openstack/ceilometer-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.800969 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.802715 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.808271 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\") " pod="openstack/ceilometer-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.812510 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b64r\" (UniqueName: \"kubernetes.io/projected/f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5-kube-api-access-4b64r\") pod \"watcher-decision-engine-0\" (UID: \"f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5\") " pod="openstack/watcher-decision-engine-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.814787 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7598\" (UniqueName: \"kubernetes.io/projected/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-kube-api-access-c7598\") pod \"ceilometer-0\" (UID: \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\") " pod="openstack/ceilometer-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.820984 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\") " pod="openstack/ceilometer-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.938570 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 08:51:27 crc kubenswrapper[4886]: I0314 08:51:27.970221 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Mar 14 08:51:28 crc kubenswrapper[4886]: I0314 08:51:28.461078 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:51:28 crc kubenswrapper[4886]: W0314 08:51:28.465947 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bddb3e5_33a1_4ebb_8f67_5e12a9b902d5.slice/crio-143b81d7d35eaa424f6b2dee6761e9723e96daf551138bccb0cb0ef82399b9f3 WatchSource:0}: Error finding container 143b81d7d35eaa424f6b2dee6761e9723e96daf551138bccb0cb0ef82399b9f3: Status 404 returned error can't find the container with id 143b81d7d35eaa424f6b2dee6761e9723e96daf551138bccb0cb0ef82399b9f3 Mar 14 08:51:28 crc kubenswrapper[4886]: I0314 08:51:28.565620 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Mar 14 08:51:28 crc kubenswrapper[4886]: W0314 08:51:28.576283 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf77ddcfc_3cbd_49a1_8f1b_d9de60483fc5.slice/crio-abe62ebb83e2a26284c9f6df00da0080b6479de66ed5bed69768ac910850826a WatchSource:0}: Error finding container abe62ebb83e2a26284c9f6df00da0080b6479de66ed5bed69768ac910850826a: Status 404 returned error can't find the container with id abe62ebb83e2a26284c9f6df00da0080b6479de66ed5bed69768ac910850826a Mar 14 08:51:29 crc kubenswrapper[4886]: I0314 08:51:29.431337 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="73ebf9d0-186c-4236-9af0-821ec1b0e4db" path="/var/lib/kubelet/pods/73ebf9d0-186c-4236-9af0-821ec1b0e4db/volumes" Mar 14 08:51:29 crc kubenswrapper[4886]: I0314 08:51:29.432494 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf9e5273-6d41-439e-98a2-263c64a3b39b" path="/var/lib/kubelet/pods/cf9e5273-6d41-439e-98a2-263c64a3b39b/volumes" Mar 14 08:51:29 crc kubenswrapper[4886]: I0314 08:51:29.482056 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5","Type":"ContainerStarted","Data":"eda56d0f3f3dc88fef879f13beb48e1bd84e06b285a34b6000335ceffdad9560"} Mar 14 08:51:29 crc kubenswrapper[4886]: I0314 08:51:29.482108 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5","Type":"ContainerStarted","Data":"abe62ebb83e2a26284c9f6df00da0080b6479de66ed5bed69768ac910850826a"} Mar 14 08:51:29 crc kubenswrapper[4886]: I0314 08:51:29.483732 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5","Type":"ContainerStarted","Data":"3d2a1e7fd98baf42771294384d583ba93c20bfa7b05f28d8a841ed79b28e2b07"} Mar 14 08:51:29 crc kubenswrapper[4886]: I0314 08:51:29.483850 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5","Type":"ContainerStarted","Data":"143b81d7d35eaa424f6b2dee6761e9723e96daf551138bccb0cb0ef82399b9f3"} Mar 14 08:51:29 crc kubenswrapper[4886]: I0314 08:51:29.505754 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.505737229 podStartE2EDuration="2.505737229s" podCreationTimestamp="2026-03-14 08:51:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-14 08:51:29.498556375 +0000 UTC m=+1424.747008032" watchObservedRunningTime="2026-03-14 08:51:29.505737229 +0000 UTC m=+1424.754188866" Mar 14 08:51:30 crc kubenswrapper[4886]: I0314 08:51:30.498096 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5","Type":"ContainerStarted","Data":"d5e852e80afbc200a2389d239a2a0d08d702c59486bffcf499c75a9893843cfb"} Mar 14 08:51:30 crc kubenswrapper[4886]: I0314 08:51:30.498367 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5","Type":"ContainerStarted","Data":"0053512c65c6e255221157e939643f497d45e8db0648f4dd464ffb16ace5bdde"} Mar 14 08:51:32 crc kubenswrapper[4886]: I0314 08:51:32.515057 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5","Type":"ContainerStarted","Data":"2cc4931725a76699f328e5c69c694f1727d6d304426ce9369c6a30215906afa0"} Mar 14 08:51:32 crc kubenswrapper[4886]: I0314 08:51:32.515718 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 14 08:51:32 crc kubenswrapper[4886]: I0314 08:51:32.547969 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.33529429 podStartE2EDuration="5.547951897s" podCreationTimestamp="2026-03-14 08:51:27 +0000 UTC" firstStartedPulling="2026-03-14 08:51:28.46845396 +0000 UTC m=+1423.716905607" lastFinishedPulling="2026-03-14 08:51:31.681111577 +0000 UTC m=+1426.929563214" observedRunningTime="2026-03-14 08:51:32.539352783 +0000 UTC m=+1427.787804420" watchObservedRunningTime="2026-03-14 08:51:32.547951897 +0000 UTC m=+1427.796403534" Mar 14 08:51:35 crc kubenswrapper[4886]: E0314 08:51:35.854748 4886 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial 
failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a745438_cb17_4626_96ed_51c7de75a976.slice/crio-34a8e812ae52f0153cc28bb93f8075bc5850e81a6a08c0f6de662f2c45e13fbf\": RecentStats: unable to find data in memory cache]" Mar 14 08:51:37 crc kubenswrapper[4886]: I0314 08:51:37.971115 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Mar 14 08:51:37 crc kubenswrapper[4886]: I0314 08:51:37.998807 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Mar 14 08:51:38 crc kubenswrapper[4886]: I0314 08:51:38.572404 4886 generic.go:334] "Generic (PLEG): container finished" podID="18707ca6-9c35-482a-a186-da58e60b7540" containerID="ba18efa88eb6c20a44b891f4747f36acc1447f1ab84ac4cf7820a3240379480e" exitCode=0 Mar 14 08:51:38 crc kubenswrapper[4886]: I0314 08:51:38.572454 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6rrxl" event={"ID":"18707ca6-9c35-482a-a186-da58e60b7540","Type":"ContainerDied","Data":"ba18efa88eb6c20a44b891f4747f36acc1447f1ab84ac4cf7820a3240379480e"} Mar 14 08:51:38 crc kubenswrapper[4886]: I0314 08:51:38.572798 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Mar 14 08:51:38 crc kubenswrapper[4886]: I0314 08:51:38.608759 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Mar 14 08:51:39 crc kubenswrapper[4886]: I0314 08:51:39.915994 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6rrxl" Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.058017 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18707ca6-9c35-482a-a186-da58e60b7540-scripts\") pod \"18707ca6-9c35-482a-a186-da58e60b7540\" (UID: \"18707ca6-9c35-482a-a186-da58e60b7540\") " Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.058089 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kcnq\" (UniqueName: \"kubernetes.io/projected/18707ca6-9c35-482a-a186-da58e60b7540-kube-api-access-6kcnq\") pod \"18707ca6-9c35-482a-a186-da58e60b7540\" (UID: \"18707ca6-9c35-482a-a186-da58e60b7540\") " Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.058247 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18707ca6-9c35-482a-a186-da58e60b7540-config-data\") pod \"18707ca6-9c35-482a-a186-da58e60b7540\" (UID: \"18707ca6-9c35-482a-a186-da58e60b7540\") " Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.058481 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18707ca6-9c35-482a-a186-da58e60b7540-combined-ca-bundle\") pod \"18707ca6-9c35-482a-a186-da58e60b7540\" (UID: \"18707ca6-9c35-482a-a186-da58e60b7540\") " Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.063699 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18707ca6-9c35-482a-a186-da58e60b7540-scripts" (OuterVolumeSpecName: "scripts") pod "18707ca6-9c35-482a-a186-da58e60b7540" (UID: "18707ca6-9c35-482a-a186-da58e60b7540"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.064331 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18707ca6-9c35-482a-a186-da58e60b7540-kube-api-access-6kcnq" (OuterVolumeSpecName: "kube-api-access-6kcnq") pod "18707ca6-9c35-482a-a186-da58e60b7540" (UID: "18707ca6-9c35-482a-a186-da58e60b7540"). InnerVolumeSpecName "kube-api-access-6kcnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.084592 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18707ca6-9c35-482a-a186-da58e60b7540-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18707ca6-9c35-482a-a186-da58e60b7540" (UID: "18707ca6-9c35-482a-a186-da58e60b7540"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.097980 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18707ca6-9c35-482a-a186-da58e60b7540-config-data" (OuterVolumeSpecName: "config-data") pod "18707ca6-9c35-482a-a186-da58e60b7540" (UID: "18707ca6-9c35-482a-a186-da58e60b7540"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.160970 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18707ca6-9c35-482a-a186-da58e60b7540-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.161008 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18707ca6-9c35-482a-a186-da58e60b7540-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.161018 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kcnq\" (UniqueName: \"kubernetes.io/projected/18707ca6-9c35-482a-a186-da58e60b7540-kube-api-access-6kcnq\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.161028 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18707ca6-9c35-482a-a186-da58e60b7540-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.593268 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6rrxl" event={"ID":"18707ca6-9c35-482a-a186-da58e60b7540","Type":"ContainerDied","Data":"cfc672140b398344c16e415bf28add43198030ba2d3fa3f645b983b5556e3fa6"} Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.593315 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfc672140b398344c16e415bf28add43198030ba2d3fa3f645b983b5556e3fa6" Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.593320 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6rrxl" Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.744407 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 14 08:51:40 crc kubenswrapper[4886]: E0314 08:51:40.744992 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18707ca6-9c35-482a-a186-da58e60b7540" containerName="nova-cell0-conductor-db-sync" Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.745012 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="18707ca6-9c35-482a-a186-da58e60b7540" containerName="nova-cell0-conductor-db-sync" Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.745266 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="18707ca6-9c35-482a-a186-da58e60b7540" containerName="nova-cell0-conductor-db-sync" Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.746009 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.748638 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-xd4mm" Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.748989 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.757963 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.876328 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8d09abe-fc85-42fe-ac82-95af478b8985-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c8d09abe-fc85-42fe-ac82-95af478b8985\") " pod="openstack/nova-cell0-conductor-0" Mar 14 08:51:40 crc kubenswrapper[4886]: 
I0314 08:51:40.876384 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8d09abe-fc85-42fe-ac82-95af478b8985-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c8d09abe-fc85-42fe-ac82-95af478b8985\") " pod="openstack/nova-cell0-conductor-0" Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.876592 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cfhb\" (UniqueName: \"kubernetes.io/projected/c8d09abe-fc85-42fe-ac82-95af478b8985-kube-api-access-4cfhb\") pod \"nova-cell0-conductor-0\" (UID: \"c8d09abe-fc85-42fe-ac82-95af478b8985\") " pod="openstack/nova-cell0-conductor-0" Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.979325 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8d09abe-fc85-42fe-ac82-95af478b8985-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c8d09abe-fc85-42fe-ac82-95af478b8985\") " pod="openstack/nova-cell0-conductor-0" Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.979377 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8d09abe-fc85-42fe-ac82-95af478b8985-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c8d09abe-fc85-42fe-ac82-95af478b8985\") " pod="openstack/nova-cell0-conductor-0" Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.979435 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cfhb\" (UniqueName: \"kubernetes.io/projected/c8d09abe-fc85-42fe-ac82-95af478b8985-kube-api-access-4cfhb\") pod \"nova-cell0-conductor-0\" (UID: \"c8d09abe-fc85-42fe-ac82-95af478b8985\") " pod="openstack/nova-cell0-conductor-0" Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.990889 4886 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8d09abe-fc85-42fe-ac82-95af478b8985-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c8d09abe-fc85-42fe-ac82-95af478b8985\") " pod="openstack/nova-cell0-conductor-0" Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.991977 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8d09abe-fc85-42fe-ac82-95af478b8985-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c8d09abe-fc85-42fe-ac82-95af478b8985\") " pod="openstack/nova-cell0-conductor-0" Mar 14 08:51:40 crc kubenswrapper[4886]: I0314 08:51:40.999654 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cfhb\" (UniqueName: \"kubernetes.io/projected/c8d09abe-fc85-42fe-ac82-95af478b8985-kube-api-access-4cfhb\") pod \"nova-cell0-conductor-0\" (UID: \"c8d09abe-fc85-42fe-ac82-95af478b8985\") " pod="openstack/nova-cell0-conductor-0" Mar 14 08:51:41 crc kubenswrapper[4886]: I0314 08:51:41.067640 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 14 08:51:41 crc kubenswrapper[4886]: I0314 08:51:41.505082 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 14 08:51:41 crc kubenswrapper[4886]: W0314 08:51:41.516838 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8d09abe_fc85_42fe_ac82_95af478b8985.slice/crio-75dcb9857fa999daaf7f80220ba24d8122768b249b33d58118b6dfdcd4648274 WatchSource:0}: Error finding container 75dcb9857fa999daaf7f80220ba24d8122768b249b33d58118b6dfdcd4648274: Status 404 returned error can't find the container with id 75dcb9857fa999daaf7f80220ba24d8122768b249b33d58118b6dfdcd4648274 Mar 14 08:51:41 crc kubenswrapper[4886]: I0314 08:51:41.606156 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c8d09abe-fc85-42fe-ac82-95af478b8985","Type":"ContainerStarted","Data":"75dcb9857fa999daaf7f80220ba24d8122768b249b33d58118b6dfdcd4648274"} Mar 14 08:51:42 crc kubenswrapper[4886]: I0314 08:51:42.620557 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c8d09abe-fc85-42fe-ac82-95af478b8985","Type":"ContainerStarted","Data":"339ea5850a412877a43519f9e83e6a22bd6d41f3bf01396746215ff502ced962"} Mar 14 08:51:42 crc kubenswrapper[4886]: I0314 08:51:42.621135 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 14 08:51:42 crc kubenswrapper[4886]: I0314 08:51:42.644107 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.644091608 podStartE2EDuration="2.644091608s" podCreationTimestamp="2026-03-14 08:51:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 
08:51:42.636215315 +0000 UTC m=+1437.884666952" watchObservedRunningTime="2026-03-14 08:51:42.644091608 +0000 UTC m=+1437.892543235" Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.109982 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.588633 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-pt4tz"] Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.590184 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pt4tz" Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.593392 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.602330 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pt4tz"] Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.605217 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.705905 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60e62d5f-645e-4ef7-adb0-bedd550ade7e-scripts\") pod \"nova-cell0-cell-mapping-pt4tz\" (UID: \"60e62d5f-645e-4ef7-adb0-bedd550ade7e\") " pod="openstack/nova-cell0-cell-mapping-pt4tz" Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.706512 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e62d5f-645e-4ef7-adb0-bedd550ade7e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pt4tz\" (UID: \"60e62d5f-645e-4ef7-adb0-bedd550ade7e\") " pod="openstack/nova-cell0-cell-mapping-pt4tz" Mar 14 08:51:46 crc 
kubenswrapper[4886]: I0314 08:51:46.706618 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60e62d5f-645e-4ef7-adb0-bedd550ade7e-config-data\") pod \"nova-cell0-cell-mapping-pt4tz\" (UID: \"60e62d5f-645e-4ef7-adb0-bedd550ade7e\") " pod="openstack/nova-cell0-cell-mapping-pt4tz" Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.706883 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbf55\" (UniqueName: \"kubernetes.io/projected/60e62d5f-645e-4ef7-adb0-bedd550ade7e-kube-api-access-lbf55\") pod \"nova-cell0-cell-mapping-pt4tz\" (UID: \"60e62d5f-645e-4ef7-adb0-bedd550ade7e\") " pod="openstack/nova-cell0-cell-mapping-pt4tz" Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.795035 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.796840 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.799724 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.811922 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbf55\" (UniqueName: \"kubernetes.io/projected/60e62d5f-645e-4ef7-adb0-bedd550ade7e-kube-api-access-lbf55\") pod \"nova-cell0-cell-mapping-pt4tz\" (UID: \"60e62d5f-645e-4ef7-adb0-bedd550ade7e\") " pod="openstack/nova-cell0-cell-mapping-pt4tz" Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.811994 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60e62d5f-645e-4ef7-adb0-bedd550ade7e-scripts\") pod \"nova-cell0-cell-mapping-pt4tz\" (UID: \"60e62d5f-645e-4ef7-adb0-bedd550ade7e\") " pod="openstack/nova-cell0-cell-mapping-pt4tz" Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.812053 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e62d5f-645e-4ef7-adb0-bedd550ade7e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pt4tz\" (UID: \"60e62d5f-645e-4ef7-adb0-bedd550ade7e\") " pod="openstack/nova-cell0-cell-mapping-pt4tz" Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.812163 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60e62d5f-645e-4ef7-adb0-bedd550ade7e-config-data\") pod \"nova-cell0-cell-mapping-pt4tz\" (UID: \"60e62d5f-645e-4ef7-adb0-bedd550ade7e\") " pod="openstack/nova-cell0-cell-mapping-pt4tz" Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.818869 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/60e62d5f-645e-4ef7-adb0-bedd550ade7e-config-data\") pod \"nova-cell0-cell-mapping-pt4tz\" (UID: \"60e62d5f-645e-4ef7-adb0-bedd550ade7e\") " pod="openstack/nova-cell0-cell-mapping-pt4tz" Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.820344 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60e62d5f-645e-4ef7-adb0-bedd550ade7e-scripts\") pod \"nova-cell0-cell-mapping-pt4tz\" (UID: \"60e62d5f-645e-4ef7-adb0-bedd550ade7e\") " pod="openstack/nova-cell0-cell-mapping-pt4tz" Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.842636 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.842875 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e62d5f-645e-4ef7-adb0-bedd550ade7e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pt4tz\" (UID: \"60e62d5f-645e-4ef7-adb0-bedd550ade7e\") " pod="openstack/nova-cell0-cell-mapping-pt4tz" Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.877788 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbf55\" (UniqueName: \"kubernetes.io/projected/60e62d5f-645e-4ef7-adb0-bedd550ade7e-kube-api-access-lbf55\") pod \"nova-cell0-cell-mapping-pt4tz\" (UID: \"60e62d5f-645e-4ef7-adb0-bedd550ade7e\") " pod="openstack/nova-cell0-cell-mapping-pt4tz" Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.881014 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.882953 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.899427 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.916242 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeedb7c-42c2-4739-a5df-01d4e7be5499-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7aeedb7c-42c2-4739-a5df-01d4e7be5499\") " pod="openstack/nova-api-0" Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.916289 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7aeedb7c-42c2-4739-a5df-01d4e7be5499-logs\") pod \"nova-api-0\" (UID: \"7aeedb7c-42c2-4739-a5df-01d4e7be5499\") " pod="openstack/nova-api-0" Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.916321 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sd5d\" (UniqueName: \"kubernetes.io/projected/7aeedb7c-42c2-4739-a5df-01d4e7be5499-kube-api-access-6sd5d\") pod \"nova-api-0\" (UID: \"7aeedb7c-42c2-4739-a5df-01d4e7be5499\") " pod="openstack/nova-api-0" Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.916384 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aeedb7c-42c2-4739-a5df-01d4e7be5499-config-data\") pod \"nova-api-0\" (UID: \"7aeedb7c-42c2-4739-a5df-01d4e7be5499\") " pod="openstack/nova-api-0" Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.927756 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pt4tz" Mar 14 08:51:46 crc kubenswrapper[4886]: I0314 08:51:46.967086 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.019792 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aeedb7c-42c2-4739-a5df-01d4e7be5499-config-data\") pod \"nova-api-0\" (UID: \"7aeedb7c-42c2-4739-a5df-01d4e7be5499\") " pod="openstack/nova-api-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.019928 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07-config-data\") pod \"nova-scheduler-0\" (UID: \"cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07\") " pod="openstack/nova-scheduler-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.019973 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07\") " pod="openstack/nova-scheduler-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.020004 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeedb7c-42c2-4739-a5df-01d4e7be5499-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7aeedb7c-42c2-4739-a5df-01d4e7be5499\") " pod="openstack/nova-api-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.020021 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7aeedb7c-42c2-4739-a5df-01d4e7be5499-logs\") pod \"nova-api-0\" (UID: \"7aeedb7c-42c2-4739-a5df-01d4e7be5499\") " 
pod="openstack/nova-api-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.020046 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxkqb\" (UniqueName: \"kubernetes.io/projected/cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07-kube-api-access-lxkqb\") pod \"nova-scheduler-0\" (UID: \"cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07\") " pod="openstack/nova-scheduler-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.020066 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sd5d\" (UniqueName: \"kubernetes.io/projected/7aeedb7c-42c2-4739-a5df-01d4e7be5499-kube-api-access-6sd5d\") pod \"nova-api-0\" (UID: \"7aeedb7c-42c2-4739-a5df-01d4e7be5499\") " pod="openstack/nova-api-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.023558 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7aeedb7c-42c2-4739-a5df-01d4e7be5499-logs\") pod \"nova-api-0\" (UID: \"7aeedb7c-42c2-4739-a5df-01d4e7be5499\") " pod="openstack/nova-api-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.033303 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeedb7c-42c2-4739-a5df-01d4e7be5499-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7aeedb7c-42c2-4739-a5df-01d4e7be5499\") " pod="openstack/nova-api-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.033426 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.033844 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aeedb7c-42c2-4739-a5df-01d4e7be5499-config-data\") pod \"nova-api-0\" (UID: \"7aeedb7c-42c2-4739-a5df-01d4e7be5499\") " pod="openstack/nova-api-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 
08:51:47.036351 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.062459 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.130245 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/629d9381-9a9a-4320-9175-c3bcfb2e509b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"629d9381-9a9a-4320-9175-c3bcfb2e509b\") " pod="openstack/nova-metadata-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.130621 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07-config-data\") pod \"nova-scheduler-0\" (UID: \"cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07\") " pod="openstack/nova-scheduler-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.130685 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhzt9\" (UniqueName: \"kubernetes.io/projected/629d9381-9a9a-4320-9175-c3bcfb2e509b-kube-api-access-rhzt9\") pod \"nova-metadata-0\" (UID: \"629d9381-9a9a-4320-9175-c3bcfb2e509b\") " pod="openstack/nova-metadata-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.130720 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07\") " pod="openstack/nova-scheduler-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.130798 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxkqb\" (UniqueName: 
\"kubernetes.io/projected/cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07-kube-api-access-lxkqb\") pod \"nova-scheduler-0\" (UID: \"cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07\") " pod="openstack/nova-scheduler-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.130957 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/629d9381-9a9a-4320-9175-c3bcfb2e509b-config-data\") pod \"nova-metadata-0\" (UID: \"629d9381-9a9a-4320-9175-c3bcfb2e509b\") " pod="openstack/nova-metadata-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.130983 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/629d9381-9a9a-4320-9175-c3bcfb2e509b-logs\") pod \"nova-metadata-0\" (UID: \"629d9381-9a9a-4320-9175-c3bcfb2e509b\") " pod="openstack/nova-metadata-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.183530 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.235398 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/629d9381-9a9a-4320-9175-c3bcfb2e509b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"629d9381-9a9a-4320-9175-c3bcfb2e509b\") " pod="openstack/nova-metadata-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.235510 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhzt9\" (UniqueName: \"kubernetes.io/projected/629d9381-9a9a-4320-9175-c3bcfb2e509b-kube-api-access-rhzt9\") pod \"nova-metadata-0\" (UID: \"629d9381-9a9a-4320-9175-c3bcfb2e509b\") " pod="openstack/nova-metadata-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.235642 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/629d9381-9a9a-4320-9175-c3bcfb2e509b-config-data\") pod \"nova-metadata-0\" (UID: \"629d9381-9a9a-4320-9175-c3bcfb2e509b\") " pod="openstack/nova-metadata-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.235665 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/629d9381-9a9a-4320-9175-c3bcfb2e509b-logs\") pod \"nova-metadata-0\" (UID: \"629d9381-9a9a-4320-9175-c3bcfb2e509b\") " pod="openstack/nova-metadata-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.236448 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/629d9381-9a9a-4320-9175-c3bcfb2e509b-logs\") pod \"nova-metadata-0\" (UID: \"629d9381-9a9a-4320-9175-c3bcfb2e509b\") " pod="openstack/nova-metadata-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.271160 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.272354 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.286458 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.299412 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.384271 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-vd5jl"] Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.387845 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.459621 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtj5m\" (UniqueName: \"kubernetes.io/projected/b76b6262-2fc6-46a4-abeb-5a380338b6f6-kube-api-access-vtj5m\") pod \"nova-cell1-novncproxy-0\" (UID: \"b76b6262-2fc6-46a4-abeb-5a380338b6f6\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.459728 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b76b6262-2fc6-46a4-abeb-5a380338b6f6-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b76b6262-2fc6-46a4-abeb-5a380338b6f6\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.459834 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b76b6262-2fc6-46a4-abeb-5a380338b6f6-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b76b6262-2fc6-46a4-abeb-5a380338b6f6\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.498586 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07\") " pod="openstack/nova-scheduler-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.498752 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sd5d\" (UniqueName: \"kubernetes.io/projected/7aeedb7c-42c2-4739-a5df-01d4e7be5499-kube-api-access-6sd5d\") pod \"nova-api-0\" (UID: \"7aeedb7c-42c2-4739-a5df-01d4e7be5499\") " pod="openstack/nova-api-0" Mar 
14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.502513 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhzt9\" (UniqueName: \"kubernetes.io/projected/629d9381-9a9a-4320-9175-c3bcfb2e509b-kube-api-access-rhzt9\") pod \"nova-metadata-0\" (UID: \"629d9381-9a9a-4320-9175-c3bcfb2e509b\") " pod="openstack/nova-metadata-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.503085 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/629d9381-9a9a-4320-9175-c3bcfb2e509b-config-data\") pod \"nova-metadata-0\" (UID: \"629d9381-9a9a-4320-9175-c3bcfb2e509b\") " pod="openstack/nova-metadata-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.503637 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/629d9381-9a9a-4320-9175-c3bcfb2e509b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"629d9381-9a9a-4320-9175-c3bcfb2e509b\") " pod="openstack/nova-metadata-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.504202 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxkqb\" (UniqueName: \"kubernetes.io/projected/cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07-kube-api-access-lxkqb\") pod \"nova-scheduler-0\" (UID: \"cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07\") " pod="openstack/nova-scheduler-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.519502 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07-config-data\") pod \"nova-scheduler-0\" (UID: \"cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07\") " pod="openstack/nova-scheduler-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.544990 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.553818 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.568027 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-vd5jl\" (UID: \"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\") " pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.568104 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-vd5jl\" (UID: \"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\") " pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.568143 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-vd5jl\" (UID: \"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\") " pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.568196 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b76b6262-2fc6-46a4-abeb-5a380338b6f6-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b76b6262-2fc6-46a4-abeb-5a380338b6f6\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.568227 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-dns-svc\") pod \"dnsmasq-dns-865f5d856f-vd5jl\" (UID: \"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\") " pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.568304 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtj5m\" (UniqueName: \"kubernetes.io/projected/b76b6262-2fc6-46a4-abeb-5a380338b6f6-kube-api-access-vtj5m\") pod \"nova-cell1-novncproxy-0\" (UID: \"b76b6262-2fc6-46a4-abeb-5a380338b6f6\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.568342 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-config\") pod \"dnsmasq-dns-865f5d856f-vd5jl\" (UID: \"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\") " pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.568444 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b76b6262-2fc6-46a4-abeb-5a380338b6f6-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b76b6262-2fc6-46a4-abeb-5a380338b6f6\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.568558 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rkx6\" (UniqueName: \"kubernetes.io/projected/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-kube-api-access-4rkx6\") pod \"dnsmasq-dns-865f5d856f-vd5jl\" (UID: \"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\") " pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.572837 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b76b6262-2fc6-46a4-abeb-5a380338b6f6-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b76b6262-2fc6-46a4-abeb-5a380338b6f6\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.576405 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b76b6262-2fc6-46a4-abeb-5a380338b6f6-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b76b6262-2fc6-46a4-abeb-5a380338b6f6\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.582690 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-vd5jl"] Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.620581 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtj5m\" (UniqueName: \"kubernetes.io/projected/b76b6262-2fc6-46a4-abeb-5a380338b6f6-kube-api-access-vtj5m\") pod \"nova-cell1-novncproxy-0\" (UID: \"b76b6262-2fc6-46a4-abeb-5a380338b6f6\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.627461 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.670321 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-vd5jl\" (UID: \"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\") " pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.670496 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-vd5jl\" (UID: \"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\") " pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.670532 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-vd5jl\" (UID: \"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\") " pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.670575 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-dns-svc\") pod \"dnsmasq-dns-865f5d856f-vd5jl\" (UID: \"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\") " pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.670663 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-config\") pod \"dnsmasq-dns-865f5d856f-vd5jl\" (UID: \"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\") " pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" Mar 14 
08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.670786 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rkx6\" (UniqueName: \"kubernetes.io/projected/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-kube-api-access-4rkx6\") pod \"dnsmasq-dns-865f5d856f-vd5jl\" (UID: \"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\") " pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.672481 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-vd5jl\" (UID: \"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\") " pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.673168 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-vd5jl\" (UID: \"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\") " pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.681779 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-vd5jl\" (UID: \"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\") " pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.682430 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-dns-svc\") pod \"dnsmasq-dns-865f5d856f-vd5jl\" (UID: \"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\") " pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.683081 4886 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-config\") pod \"dnsmasq-dns-865f5d856f-vd5jl\" (UID: \"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\") " pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.702522 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rkx6\" (UniqueName: \"kubernetes.io/projected/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-kube-api-access-4rkx6\") pod \"dnsmasq-dns-865f5d856f-vd5jl\" (UID: \"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\") " pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.724507 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.813065 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pt4tz"] Mar 14 08:51:47 crc kubenswrapper[4886]: I0314 08:51:47.947620 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.307201 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.396034 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.444461 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.575921 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 08:51:48 crc kubenswrapper[4886]: W0314 08:51:48.592525 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7aeedb7c_42c2_4739_a5df_01d4e7be5499.slice/crio-9d6ff2e2c3b1840ac5a31accc5ea2299bc042f4cb95fffdefcc5a201bc93a695 WatchSource:0}: Error finding container 9d6ff2e2c3b1840ac5a31accc5ea2299bc042f4cb95fffdefcc5a201bc93a695: Status 404 returned error can't find the container with id 9d6ff2e2c3b1840ac5a31accc5ea2299bc042f4cb95fffdefcc5a201bc93a695 Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.660553 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-vd5jl"] Mar 14 08:51:48 crc kubenswrapper[4886]: W0314 08:51:48.665654 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fa7a09a_b8ef_4cdc_a4ce_93287d730311.slice/crio-d46d00f96eb1be0391afb0324f1796b9d4b658dd46723bd128b3e4ed843344b6 WatchSource:0}: Error finding container d46d00f96eb1be0391afb0324f1796b9d4b658dd46723bd128b3e4ed843344b6: Status 404 returned error can't find the container with id d46d00f96eb1be0391afb0324f1796b9d4b658dd46723bd128b3e4ed843344b6 Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.690173 4886 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ggzbn"] Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.691672 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ggzbn" Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.695557 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.695945 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.698761 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ggzbn"] Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.721484 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7aeedb7c-42c2-4739-a5df-01d4e7be5499","Type":"ContainerStarted","Data":"9d6ff2e2c3b1840ac5a31accc5ea2299bc042f4cb95fffdefcc5a201bc93a695"} Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.725803 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b76b6262-2fc6-46a4-abeb-5a380338b6f6","Type":"ContainerStarted","Data":"37c6c60ac1376cab6802c9f3c611e2987a7b984747018aa9b3bc114540d33e94"} Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.727541 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"629d9381-9a9a-4320-9175-c3bcfb2e509b","Type":"ContainerStarted","Data":"a1672a7660a5b7f9847f3919c7e7a1fcf1bfa7fdec9d4628a5336233d0336d86"} Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.728803 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07","Type":"ContainerStarted","Data":"b82582ca96288593e337a004c597d3281346ceab1c6bb1a173413fdca79c1b1b"} Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.735960 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pt4tz" event={"ID":"60e62d5f-645e-4ef7-adb0-bedd550ade7e","Type":"ContainerStarted","Data":"c209e61eecdb4137304724149ddf492fa6c6d038b5cb91f572d1b065a6fde594"} Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.736009 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pt4tz" event={"ID":"60e62d5f-645e-4ef7-adb0-bedd550ade7e","Type":"ContainerStarted","Data":"f51ebb453ff5def349e24be7aff52e06d01670b6e3d660e7adef87d7dca5599e"} Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.738861 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" event={"ID":"9fa7a09a-b8ef-4cdc-a4ce-93287d730311","Type":"ContainerStarted","Data":"d46d00f96eb1be0391afb0324f1796b9d4b658dd46723bd128b3e4ed843344b6"} Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.758421 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-pt4tz" podStartSLOduration=2.758403654 podStartE2EDuration="2.758403654s" podCreationTimestamp="2026-03-14 08:51:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:51:48.750674055 +0000 UTC m=+1443.999125692" watchObservedRunningTime="2026-03-14 08:51:48.758403654 +0000 UTC m=+1444.006855281" Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.807349 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0453f23e-955b-4cb7-8f57-285144677bc7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ggzbn\" (UID: 
\"0453f23e-955b-4cb7-8f57-285144677bc7\") " pod="openstack/nova-cell1-conductor-db-sync-ggzbn" Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.807423 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0453f23e-955b-4cb7-8f57-285144677bc7-scripts\") pod \"nova-cell1-conductor-db-sync-ggzbn\" (UID: \"0453f23e-955b-4cb7-8f57-285144677bc7\") " pod="openstack/nova-cell1-conductor-db-sync-ggzbn" Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.807462 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0453f23e-955b-4cb7-8f57-285144677bc7-config-data\") pod \"nova-cell1-conductor-db-sync-ggzbn\" (UID: \"0453f23e-955b-4cb7-8f57-285144677bc7\") " pod="openstack/nova-cell1-conductor-db-sync-ggzbn" Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.807503 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q22g\" (UniqueName: \"kubernetes.io/projected/0453f23e-955b-4cb7-8f57-285144677bc7-kube-api-access-2q22g\") pod \"nova-cell1-conductor-db-sync-ggzbn\" (UID: \"0453f23e-955b-4cb7-8f57-285144677bc7\") " pod="openstack/nova-cell1-conductor-db-sync-ggzbn" Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.908915 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0453f23e-955b-4cb7-8f57-285144677bc7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ggzbn\" (UID: \"0453f23e-955b-4cb7-8f57-285144677bc7\") " pod="openstack/nova-cell1-conductor-db-sync-ggzbn" Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.908997 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0453f23e-955b-4cb7-8f57-285144677bc7-scripts\") pod 
\"nova-cell1-conductor-db-sync-ggzbn\" (UID: \"0453f23e-955b-4cb7-8f57-285144677bc7\") " pod="openstack/nova-cell1-conductor-db-sync-ggzbn" Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.909036 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0453f23e-955b-4cb7-8f57-285144677bc7-config-data\") pod \"nova-cell1-conductor-db-sync-ggzbn\" (UID: \"0453f23e-955b-4cb7-8f57-285144677bc7\") " pod="openstack/nova-cell1-conductor-db-sync-ggzbn" Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.909109 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q22g\" (UniqueName: \"kubernetes.io/projected/0453f23e-955b-4cb7-8f57-285144677bc7-kube-api-access-2q22g\") pod \"nova-cell1-conductor-db-sync-ggzbn\" (UID: \"0453f23e-955b-4cb7-8f57-285144677bc7\") " pod="openstack/nova-cell1-conductor-db-sync-ggzbn" Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.917642 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0453f23e-955b-4cb7-8f57-285144677bc7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ggzbn\" (UID: \"0453f23e-955b-4cb7-8f57-285144677bc7\") " pod="openstack/nova-cell1-conductor-db-sync-ggzbn" Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.918112 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0453f23e-955b-4cb7-8f57-285144677bc7-config-data\") pod \"nova-cell1-conductor-db-sync-ggzbn\" (UID: \"0453f23e-955b-4cb7-8f57-285144677bc7\") " pod="openstack/nova-cell1-conductor-db-sync-ggzbn" Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.919662 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0453f23e-955b-4cb7-8f57-285144677bc7-scripts\") pod 
\"nova-cell1-conductor-db-sync-ggzbn\" (UID: \"0453f23e-955b-4cb7-8f57-285144677bc7\") " pod="openstack/nova-cell1-conductor-db-sync-ggzbn" Mar 14 08:51:48 crc kubenswrapper[4886]: I0314 08:51:48.949883 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q22g\" (UniqueName: \"kubernetes.io/projected/0453f23e-955b-4cb7-8f57-285144677bc7-kube-api-access-2q22g\") pod \"nova-cell1-conductor-db-sync-ggzbn\" (UID: \"0453f23e-955b-4cb7-8f57-285144677bc7\") " pod="openstack/nova-cell1-conductor-db-sync-ggzbn" Mar 14 08:51:49 crc kubenswrapper[4886]: I0314 08:51:49.018412 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ggzbn" Mar 14 08:51:49 crc kubenswrapper[4886]: I0314 08:51:49.511953 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ggzbn"] Mar 14 08:51:49 crc kubenswrapper[4886]: I0314 08:51:49.751198 4886 generic.go:334] "Generic (PLEG): container finished" podID="9fa7a09a-b8ef-4cdc-a4ce-93287d730311" containerID="dd06c67843b49a59a73609ae74916eb6ceab2503a9c018c6169b21b452de7303" exitCode=0 Mar 14 08:51:49 crc kubenswrapper[4886]: I0314 08:51:49.751527 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" event={"ID":"9fa7a09a-b8ef-4cdc-a4ce-93287d730311","Type":"ContainerDied","Data":"dd06c67843b49a59a73609ae74916eb6ceab2503a9c018c6169b21b452de7303"} Mar 14 08:51:49 crc kubenswrapper[4886]: I0314 08:51:49.760247 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ggzbn" event={"ID":"0453f23e-955b-4cb7-8f57-285144677bc7","Type":"ContainerStarted","Data":"d486b5e38d7bdb2e2511e9e483fbb4cc13f693d0b87a9e43c033c303ef5053d2"} Mar 14 08:51:49 crc kubenswrapper[4886]: I0314 08:51:49.760296 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ggzbn" 
event={"ID":"0453f23e-955b-4cb7-8f57-285144677bc7","Type":"ContainerStarted","Data":"5dedfdf577defec319a0ed67521c1383dfd36b02e381cd721d63ce3cf003a5d2"} Mar 14 08:51:50 crc kubenswrapper[4886]: I0314 08:51:50.793398 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" event={"ID":"9fa7a09a-b8ef-4cdc-a4ce-93287d730311","Type":"ContainerStarted","Data":"3f72fd49cfb9a8051e9104eca68870c12e343f61115936ad9291981418484637"} Mar 14 08:51:50 crc kubenswrapper[4886]: I0314 08:51:50.793732 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" Mar 14 08:51:50 crc kubenswrapper[4886]: I0314 08:51:50.811890 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-ggzbn" podStartSLOduration=2.811874393 podStartE2EDuration="2.811874393s" podCreationTimestamp="2026-03-14 08:51:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:51:49.799573563 +0000 UTC m=+1445.048025200" watchObservedRunningTime="2026-03-14 08:51:50.811874393 +0000 UTC m=+1446.060326030" Mar 14 08:51:50 crc kubenswrapper[4886]: I0314 08:51:50.822708 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" podStartSLOduration=3.8226888199999998 podStartE2EDuration="3.82268882s" podCreationTimestamp="2026-03-14 08:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:51:50.81105148 +0000 UTC m=+1446.059503117" watchObservedRunningTime="2026-03-14 08:51:50.82268882 +0000 UTC m=+1446.071140457" Mar 14 08:51:51 crc kubenswrapper[4886]: I0314 08:51:51.381400 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 08:51:51 crc kubenswrapper[4886]: I0314 
08:51:51.400425 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 08:51:53 crc kubenswrapper[4886]: I0314 08:51:53.821881 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7aeedb7c-42c2-4739-a5df-01d4e7be5499","Type":"ContainerStarted","Data":"b2454516f33e192c03af33d0a299a6c7dcc4dab5f08ce61b17b8fda8b14661b7"} Mar 14 08:51:53 crc kubenswrapper[4886]: I0314 08:51:53.823704 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b76b6262-2fc6-46a4-abeb-5a380338b6f6","Type":"ContainerStarted","Data":"e3f1cf806b929180cf6b4ac7f4a9c15e95623422c3200212ceb931817cca244f"} Mar 14 08:51:53 crc kubenswrapper[4886]: I0314 08:51:53.823841 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b76b6262-2fc6-46a4-abeb-5a380338b6f6" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://e3f1cf806b929180cf6b4ac7f4a9c15e95623422c3200212ceb931817cca244f" gracePeriod=30 Mar 14 08:51:53 crc kubenswrapper[4886]: I0314 08:51:53.829372 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"629d9381-9a9a-4320-9175-c3bcfb2e509b","Type":"ContainerStarted","Data":"3e708d9759650cbe118c44e5918460cf0b303d744c13d363285f31cb4933a686"} Mar 14 08:51:53 crc kubenswrapper[4886]: I0314 08:51:53.831954 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07","Type":"ContainerStarted","Data":"4f6badc78a11864b36c1cf9518d65c8a2c53ffcf8bfda55a44e5321231a687d4"} Mar 14 08:51:53 crc kubenswrapper[4886]: I0314 08:51:53.850752 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.06577934 podStartE2EDuration="6.850732276s" podCreationTimestamp="2026-03-14 08:51:47 +0000 UTC" 
firstStartedPulling="2026-03-14 08:51:48.448929178 +0000 UTC m=+1443.697380815" lastFinishedPulling="2026-03-14 08:51:53.233882114 +0000 UTC m=+1448.482333751" observedRunningTime="2026-03-14 08:51:53.83922386 +0000 UTC m=+1449.087675497" watchObservedRunningTime="2026-03-14 08:51:53.850732276 +0000 UTC m=+1449.099183913" Mar 14 08:51:53 crc kubenswrapper[4886]: I0314 08:51:53.859230 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.997485142 podStartE2EDuration="7.859214617s" podCreationTimestamp="2026-03-14 08:51:46 +0000 UTC" firstStartedPulling="2026-03-14 08:51:48.375170744 +0000 UTC m=+1443.623622381" lastFinishedPulling="2026-03-14 08:51:53.236900219 +0000 UTC m=+1448.485351856" observedRunningTime="2026-03-14 08:51:53.858514697 +0000 UTC m=+1449.106966334" watchObservedRunningTime="2026-03-14 08:51:53.859214617 +0000 UTC m=+1449.107666254" Mar 14 08:51:54 crc kubenswrapper[4886]: I0314 08:51:54.846409 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"629d9381-9a9a-4320-9175-c3bcfb2e509b","Type":"ContainerStarted","Data":"d540f813cd0c388ef63698da2295388b54157ef7c75e2e5e2881706929a954f8"} Mar 14 08:51:54 crc kubenswrapper[4886]: I0314 08:51:54.846906 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="629d9381-9a9a-4320-9175-c3bcfb2e509b" containerName="nova-metadata-log" containerID="cri-o://3e708d9759650cbe118c44e5918460cf0b303d744c13d363285f31cb4933a686" gracePeriod=30 Mar 14 08:51:54 crc kubenswrapper[4886]: I0314 08:51:54.847801 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="629d9381-9a9a-4320-9175-c3bcfb2e509b" containerName="nova-metadata-metadata" containerID="cri-o://d540f813cd0c388ef63698da2295388b54157ef7c75e2e5e2881706929a954f8" gracePeriod=30 Mar 14 08:51:54 crc kubenswrapper[4886]: I0314 
08:51:54.855296 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7aeedb7c-42c2-4739-a5df-01d4e7be5499","Type":"ContainerStarted","Data":"7318ad10fcc1a881aaf5fdbfb9b980e95f8e77430a38c82cf5b810ad614ee404"} Mar 14 08:51:54 crc kubenswrapper[4886]: I0314 08:51:54.885926 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.950255041 podStartE2EDuration="8.885904235s" podCreationTimestamp="2026-03-14 08:51:46 +0000 UTC" firstStartedPulling="2026-03-14 08:51:48.295737729 +0000 UTC m=+1443.544189366" lastFinishedPulling="2026-03-14 08:51:53.231386923 +0000 UTC m=+1448.479838560" observedRunningTime="2026-03-14 08:51:54.871701102 +0000 UTC m=+1450.120152739" watchObservedRunningTime="2026-03-14 08:51:54.885904235 +0000 UTC m=+1450.134355872" Mar 14 08:51:54 crc kubenswrapper[4886]: I0314 08:51:54.903186 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.267270581 podStartE2EDuration="8.903162665s" podCreationTimestamp="2026-03-14 08:51:46 +0000 UTC" firstStartedPulling="2026-03-14 08:51:48.595498189 +0000 UTC m=+1443.843949826" lastFinishedPulling="2026-03-14 08:51:53.231390273 +0000 UTC m=+1448.479841910" observedRunningTime="2026-03-14 08:51:54.894782927 +0000 UTC m=+1450.143234565" watchObservedRunningTime="2026-03-14 08:51:54.903162665 +0000 UTC m=+1450.151614302" Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.514523 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.661979 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/629d9381-9a9a-4320-9175-c3bcfb2e509b-logs\") pod \"629d9381-9a9a-4320-9175-c3bcfb2e509b\" (UID: \"629d9381-9a9a-4320-9175-c3bcfb2e509b\") " Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.662277 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/629d9381-9a9a-4320-9175-c3bcfb2e509b-logs" (OuterVolumeSpecName: "logs") pod "629d9381-9a9a-4320-9175-c3bcfb2e509b" (UID: "629d9381-9a9a-4320-9175-c3bcfb2e509b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.662472 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhzt9\" (UniqueName: \"kubernetes.io/projected/629d9381-9a9a-4320-9175-c3bcfb2e509b-kube-api-access-rhzt9\") pod \"629d9381-9a9a-4320-9175-c3bcfb2e509b\" (UID: \"629d9381-9a9a-4320-9175-c3bcfb2e509b\") " Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.662691 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/629d9381-9a9a-4320-9175-c3bcfb2e509b-combined-ca-bundle\") pod \"629d9381-9a9a-4320-9175-c3bcfb2e509b\" (UID: \"629d9381-9a9a-4320-9175-c3bcfb2e509b\") " Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.662792 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/629d9381-9a9a-4320-9175-c3bcfb2e509b-config-data\") pod \"629d9381-9a9a-4320-9175-c3bcfb2e509b\" (UID: \"629d9381-9a9a-4320-9175-c3bcfb2e509b\") " Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.663870 4886 reconciler_common.go:293] "Volume detached for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/629d9381-9a9a-4320-9175-c3bcfb2e509b-logs\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.669017 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/629d9381-9a9a-4320-9175-c3bcfb2e509b-kube-api-access-rhzt9" (OuterVolumeSpecName: "kube-api-access-rhzt9") pod "629d9381-9a9a-4320-9175-c3bcfb2e509b" (UID: "629d9381-9a9a-4320-9175-c3bcfb2e509b"). InnerVolumeSpecName "kube-api-access-rhzt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.691220 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/629d9381-9a9a-4320-9175-c3bcfb2e509b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "629d9381-9a9a-4320-9175-c3bcfb2e509b" (UID: "629d9381-9a9a-4320-9175-c3bcfb2e509b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.698131 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/629d9381-9a9a-4320-9175-c3bcfb2e509b-config-data" (OuterVolumeSpecName: "config-data") pod "629d9381-9a9a-4320-9175-c3bcfb2e509b" (UID: "629d9381-9a9a-4320-9175-c3bcfb2e509b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.765895 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/629d9381-9a9a-4320-9175-c3bcfb2e509b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.765935 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/629d9381-9a9a-4320-9175-c3bcfb2e509b-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.765944 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhzt9\" (UniqueName: \"kubernetes.io/projected/629d9381-9a9a-4320-9175-c3bcfb2e509b-kube-api-access-rhzt9\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.867091 4886 generic.go:334] "Generic (PLEG): container finished" podID="629d9381-9a9a-4320-9175-c3bcfb2e509b" containerID="d540f813cd0c388ef63698da2295388b54157ef7c75e2e5e2881706929a954f8" exitCode=0 Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.867145 4886 generic.go:334] "Generic (PLEG): container finished" podID="629d9381-9a9a-4320-9175-c3bcfb2e509b" containerID="3e708d9759650cbe118c44e5918460cf0b303d744c13d363285f31cb4933a686" exitCode=143 Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.867155 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.867169 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"629d9381-9a9a-4320-9175-c3bcfb2e509b","Type":"ContainerDied","Data":"d540f813cd0c388ef63698da2295388b54157ef7c75e2e5e2881706929a954f8"} Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.867234 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"629d9381-9a9a-4320-9175-c3bcfb2e509b","Type":"ContainerDied","Data":"3e708d9759650cbe118c44e5918460cf0b303d744c13d363285f31cb4933a686"} Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.867251 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"629d9381-9a9a-4320-9175-c3bcfb2e509b","Type":"ContainerDied","Data":"a1672a7660a5b7f9847f3919c7e7a1fcf1bfa7fdec9d4628a5336233d0336d86"} Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.867273 4886 scope.go:117] "RemoveContainer" containerID="d540f813cd0c388ef63698da2295388b54157ef7c75e2e5e2881706929a954f8" Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.904441 4886 scope.go:117] "RemoveContainer" containerID="3e708d9759650cbe118c44e5918460cf0b303d744c13d363285f31cb4933a686" Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.930182 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.936218 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.938621 4886 scope.go:117] "RemoveContainer" containerID="d540f813cd0c388ef63698da2295388b54157ef7c75e2e5e2881706929a954f8" Mar 14 08:51:55 crc kubenswrapper[4886]: E0314 08:51:55.939059 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d540f813cd0c388ef63698da2295388b54157ef7c75e2e5e2881706929a954f8\": container with ID starting with d540f813cd0c388ef63698da2295388b54157ef7c75e2e5e2881706929a954f8 not found: ID does not exist" containerID="d540f813cd0c388ef63698da2295388b54157ef7c75e2e5e2881706929a954f8" Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.939089 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d540f813cd0c388ef63698da2295388b54157ef7c75e2e5e2881706929a954f8"} err="failed to get container status \"d540f813cd0c388ef63698da2295388b54157ef7c75e2e5e2881706929a954f8\": rpc error: code = NotFound desc = could not find container \"d540f813cd0c388ef63698da2295388b54157ef7c75e2e5e2881706929a954f8\": container with ID starting with d540f813cd0c388ef63698da2295388b54157ef7c75e2e5e2881706929a954f8 not found: ID does not exist" Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.939109 4886 scope.go:117] "RemoveContainer" containerID="3e708d9759650cbe118c44e5918460cf0b303d744c13d363285f31cb4933a686" Mar 14 08:51:55 crc kubenswrapper[4886]: E0314 08:51:55.939464 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e708d9759650cbe118c44e5918460cf0b303d744c13d363285f31cb4933a686\": container with ID starting with 3e708d9759650cbe118c44e5918460cf0b303d744c13d363285f31cb4933a686 not found: ID does not exist" containerID="3e708d9759650cbe118c44e5918460cf0b303d744c13d363285f31cb4933a686" Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.939487 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e708d9759650cbe118c44e5918460cf0b303d744c13d363285f31cb4933a686"} err="failed to get container status \"3e708d9759650cbe118c44e5918460cf0b303d744c13d363285f31cb4933a686\": rpc error: code = NotFound desc = could not find container \"3e708d9759650cbe118c44e5918460cf0b303d744c13d363285f31cb4933a686\": container with ID 
starting with 3e708d9759650cbe118c44e5918460cf0b303d744c13d363285f31cb4933a686 not found: ID does not exist" Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.939500 4886 scope.go:117] "RemoveContainer" containerID="d540f813cd0c388ef63698da2295388b54157ef7c75e2e5e2881706929a954f8" Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.939687 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d540f813cd0c388ef63698da2295388b54157ef7c75e2e5e2881706929a954f8"} err="failed to get container status \"d540f813cd0c388ef63698da2295388b54157ef7c75e2e5e2881706929a954f8\": rpc error: code = NotFound desc = could not find container \"d540f813cd0c388ef63698da2295388b54157ef7c75e2e5e2881706929a954f8\": container with ID starting with d540f813cd0c388ef63698da2295388b54157ef7c75e2e5e2881706929a954f8 not found: ID does not exist" Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.939709 4886 scope.go:117] "RemoveContainer" containerID="3e708d9759650cbe118c44e5918460cf0b303d744c13d363285f31cb4933a686" Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.939876 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e708d9759650cbe118c44e5918460cf0b303d744c13d363285f31cb4933a686"} err="failed to get container status \"3e708d9759650cbe118c44e5918460cf0b303d744c13d363285f31cb4933a686\": rpc error: code = NotFound desc = could not find container \"3e708d9759650cbe118c44e5918460cf0b303d744c13d363285f31cb4933a686\": container with ID starting with 3e708d9759650cbe118c44e5918460cf0b303d744c13d363285f31cb4933a686 not found: ID does not exist" Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.959727 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 14 08:51:55 crc kubenswrapper[4886]: E0314 08:51:55.960234 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="629d9381-9a9a-4320-9175-c3bcfb2e509b" 
containerName="nova-metadata-metadata" Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.960251 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="629d9381-9a9a-4320-9175-c3bcfb2e509b" containerName="nova-metadata-metadata" Mar 14 08:51:55 crc kubenswrapper[4886]: E0314 08:51:55.960285 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="629d9381-9a9a-4320-9175-c3bcfb2e509b" containerName="nova-metadata-log" Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.960292 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="629d9381-9a9a-4320-9175-c3bcfb2e509b" containerName="nova-metadata-log" Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.960470 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="629d9381-9a9a-4320-9175-c3bcfb2e509b" containerName="nova-metadata-metadata" Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.960492 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="629d9381-9a9a-4320-9175-c3bcfb2e509b" containerName="nova-metadata-log" Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.961455 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.964203 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 14 08:51:55 crc kubenswrapper[4886]: I0314 08:51:55.965067 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 14 08:51:56 crc kubenswrapper[4886]: I0314 08:51:56.041484 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 08:51:56 crc kubenswrapper[4886]: I0314 08:51:56.072622 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a5e7203-60df-490c-bcac-26070f44f50e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9a5e7203-60df-490c-bcac-26070f44f50e\") " pod="openstack/nova-metadata-0" Mar 14 08:51:56 crc kubenswrapper[4886]: I0314 08:51:56.072760 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a5e7203-60df-490c-bcac-26070f44f50e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9a5e7203-60df-490c-bcac-26070f44f50e\") " pod="openstack/nova-metadata-0" Mar 14 08:51:56 crc kubenswrapper[4886]: I0314 08:51:56.072810 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a5e7203-60df-490c-bcac-26070f44f50e-logs\") pod \"nova-metadata-0\" (UID: \"9a5e7203-60df-490c-bcac-26070f44f50e\") " pod="openstack/nova-metadata-0" Mar 14 08:51:56 crc kubenswrapper[4886]: I0314 08:51:56.072878 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a5e7203-60df-490c-bcac-26070f44f50e-config-data\") pod \"nova-metadata-0\" (UID: 
\"9a5e7203-60df-490c-bcac-26070f44f50e\") " pod="openstack/nova-metadata-0" Mar 14 08:51:56 crc kubenswrapper[4886]: I0314 08:51:56.072900 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6v49\" (UniqueName: \"kubernetes.io/projected/9a5e7203-60df-490c-bcac-26070f44f50e-kube-api-access-z6v49\") pod \"nova-metadata-0\" (UID: \"9a5e7203-60df-490c-bcac-26070f44f50e\") " pod="openstack/nova-metadata-0" Mar 14 08:51:56 crc kubenswrapper[4886]: I0314 08:51:56.174179 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a5e7203-60df-490c-bcac-26070f44f50e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9a5e7203-60df-490c-bcac-26070f44f50e\") " pod="openstack/nova-metadata-0" Mar 14 08:51:56 crc kubenswrapper[4886]: I0314 08:51:56.174243 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a5e7203-60df-490c-bcac-26070f44f50e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9a5e7203-60df-490c-bcac-26070f44f50e\") " pod="openstack/nova-metadata-0" Mar 14 08:51:56 crc kubenswrapper[4886]: I0314 08:51:56.174275 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a5e7203-60df-490c-bcac-26070f44f50e-logs\") pod \"nova-metadata-0\" (UID: \"9a5e7203-60df-490c-bcac-26070f44f50e\") " pod="openstack/nova-metadata-0" Mar 14 08:51:56 crc kubenswrapper[4886]: I0314 08:51:56.174329 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a5e7203-60df-490c-bcac-26070f44f50e-config-data\") pod \"nova-metadata-0\" (UID: \"9a5e7203-60df-490c-bcac-26070f44f50e\") " pod="openstack/nova-metadata-0" Mar 14 08:51:56 crc kubenswrapper[4886]: I0314 08:51:56.174347 4886 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6v49\" (UniqueName: \"kubernetes.io/projected/9a5e7203-60df-490c-bcac-26070f44f50e-kube-api-access-z6v49\") pod \"nova-metadata-0\" (UID: \"9a5e7203-60df-490c-bcac-26070f44f50e\") " pod="openstack/nova-metadata-0" Mar 14 08:51:56 crc kubenswrapper[4886]: I0314 08:51:56.175245 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a5e7203-60df-490c-bcac-26070f44f50e-logs\") pod \"nova-metadata-0\" (UID: \"9a5e7203-60df-490c-bcac-26070f44f50e\") " pod="openstack/nova-metadata-0" Mar 14 08:51:56 crc kubenswrapper[4886]: I0314 08:51:56.180719 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a5e7203-60df-490c-bcac-26070f44f50e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9a5e7203-60df-490c-bcac-26070f44f50e\") " pod="openstack/nova-metadata-0" Mar 14 08:51:56 crc kubenswrapper[4886]: I0314 08:51:56.180836 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a5e7203-60df-490c-bcac-26070f44f50e-config-data\") pod \"nova-metadata-0\" (UID: \"9a5e7203-60df-490c-bcac-26070f44f50e\") " pod="openstack/nova-metadata-0" Mar 14 08:51:56 crc kubenswrapper[4886]: I0314 08:51:56.181003 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a5e7203-60df-490c-bcac-26070f44f50e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9a5e7203-60df-490c-bcac-26070f44f50e\") " pod="openstack/nova-metadata-0" Mar 14 08:51:56 crc kubenswrapper[4886]: I0314 08:51:56.194662 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6v49\" (UniqueName: \"kubernetes.io/projected/9a5e7203-60df-490c-bcac-26070f44f50e-kube-api-access-z6v49\") pod 
\"nova-metadata-0\" (UID: \"9a5e7203-60df-490c-bcac-26070f44f50e\") " pod="openstack/nova-metadata-0" Mar 14 08:51:56 crc kubenswrapper[4886]: I0314 08:51:56.290520 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 08:51:56 crc kubenswrapper[4886]: I0314 08:51:56.776691 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 08:51:56 crc kubenswrapper[4886]: W0314 08:51:56.776813 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a5e7203_60df_490c_bcac_26070f44f50e.slice/crio-b331d61a4cf43c4f2e53f4f4ab128e3783f8831f6285f49c7330588ec4c60446 WatchSource:0}: Error finding container b331d61a4cf43c4f2e53f4f4ab128e3783f8831f6285f49c7330588ec4c60446: Status 404 returned error can't find the container with id b331d61a4cf43c4f2e53f4f4ab128e3783f8831f6285f49c7330588ec4c60446 Mar 14 08:51:56 crc kubenswrapper[4886]: I0314 08:51:56.879846 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9a5e7203-60df-490c-bcac-26070f44f50e","Type":"ContainerStarted","Data":"b331d61a4cf43c4f2e53f4f4ab128e3783f8831f6285f49c7330588ec4c60446"} Mar 14 08:51:56 crc kubenswrapper[4886]: I0314 08:51:56.888375 4886 generic.go:334] "Generic (PLEG): container finished" podID="60e62d5f-645e-4ef7-adb0-bedd550ade7e" containerID="c209e61eecdb4137304724149ddf492fa6c6d038b5cb91f572d1b065a6fde594" exitCode=0 Mar 14 08:51:56 crc kubenswrapper[4886]: I0314 08:51:56.888438 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pt4tz" event={"ID":"60e62d5f-645e-4ef7-adb0-bedd550ade7e","Type":"ContainerDied","Data":"c209e61eecdb4137304724149ddf492fa6c6d038b5cb91f572d1b065a6fde594"} Mar 14 08:51:57 crc kubenswrapper[4886]: I0314 08:51:57.449035 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="629d9381-9a9a-4320-9175-c3bcfb2e509b" path="/var/lib/kubelet/pods/629d9381-9a9a-4320-9175-c3bcfb2e509b/volumes" Mar 14 08:51:57 crc kubenswrapper[4886]: I0314 08:51:57.563850 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 14 08:51:57 crc kubenswrapper[4886]: I0314 08:51:57.563919 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 14 08:51:57 crc kubenswrapper[4886]: I0314 08:51:57.615889 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 14 08:51:57 crc kubenswrapper[4886]: I0314 08:51:57.628665 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:51:57 crc kubenswrapper[4886]: I0314 08:51:57.725305 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 08:51:57 crc kubenswrapper[4886]: I0314 08:51:57.725342 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 08:51:57 crc kubenswrapper[4886]: I0314 08:51:57.900426 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9a5e7203-60df-490c-bcac-26070f44f50e","Type":"ContainerStarted","Data":"a7ccb3cd8f3d0191cbf7da1f3fe27a70a82a8e9099c4544b0bef73574a0a65ac"} Mar 14 08:51:57 crc kubenswrapper[4886]: I0314 08:51:57.900485 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9a5e7203-60df-490c-bcac-26070f44f50e","Type":"ContainerStarted","Data":"77d383923129bd0e02843386543f37c2894b48502381d4b2d99e61c03cc9d740"} Mar 14 08:51:57 crc kubenswrapper[4886]: I0314 08:51:57.927930 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.927909768 podStartE2EDuration="2.927909768s" 
podCreationTimestamp="2026-03-14 08:51:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:51:57.923822572 +0000 UTC m=+1453.172274209" watchObservedRunningTime="2026-03-14 08:51:57.927909768 +0000 UTC m=+1453.176361405" Mar 14 08:51:57 crc kubenswrapper[4886]: I0314 08:51:57.944019 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 14 08:51:57 crc kubenswrapper[4886]: I0314 08:51:57.950234 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" Mar 14 08:51:57 crc kubenswrapper[4886]: I0314 08:51:57.950385 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.062725 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-477j6"] Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.062962 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb4fc677f-477j6" podUID="711130ba-065d-49aa-92cd-f687292f1674" containerName="dnsmasq-dns" containerID="cri-o://5b06985dc1e48ba1f2f263f41b90df4b6f2b22d82ca1bbec83afddbc5e424d2a" gracePeriod=10 Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.378692 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pt4tz" Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.527832 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e62d5f-645e-4ef7-adb0-bedd550ade7e-combined-ca-bundle\") pod \"60e62d5f-645e-4ef7-adb0-bedd550ade7e\" (UID: \"60e62d5f-645e-4ef7-adb0-bedd550ade7e\") " Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.527917 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60e62d5f-645e-4ef7-adb0-bedd550ade7e-config-data\") pod \"60e62d5f-645e-4ef7-adb0-bedd550ade7e\" (UID: \"60e62d5f-645e-4ef7-adb0-bedd550ade7e\") " Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.528067 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60e62d5f-645e-4ef7-adb0-bedd550ade7e-scripts\") pod \"60e62d5f-645e-4ef7-adb0-bedd550ade7e\" (UID: \"60e62d5f-645e-4ef7-adb0-bedd550ade7e\") " Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.528132 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbf55\" (UniqueName: \"kubernetes.io/projected/60e62d5f-645e-4ef7-adb0-bedd550ade7e-kube-api-access-lbf55\") pod \"60e62d5f-645e-4ef7-adb0-bedd550ade7e\" (UID: \"60e62d5f-645e-4ef7-adb0-bedd550ade7e\") " Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.539307 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60e62d5f-645e-4ef7-adb0-bedd550ade7e-scripts" (OuterVolumeSpecName: "scripts") pod "60e62d5f-645e-4ef7-adb0-bedd550ade7e" (UID: "60e62d5f-645e-4ef7-adb0-bedd550ade7e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.545604 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60e62d5f-645e-4ef7-adb0-bedd550ade7e-kube-api-access-lbf55" (OuterVolumeSpecName: "kube-api-access-lbf55") pod "60e62d5f-645e-4ef7-adb0-bedd550ade7e" (UID: "60e62d5f-645e-4ef7-adb0-bedd550ade7e"). InnerVolumeSpecName "kube-api-access-lbf55". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.579737 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60e62d5f-645e-4ef7-adb0-bedd550ade7e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60e62d5f-645e-4ef7-adb0-bedd550ade7e" (UID: "60e62d5f-645e-4ef7-adb0-bedd550ade7e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.590293 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60e62d5f-645e-4ef7-adb0-bedd550ade7e-config-data" (OuterVolumeSpecName: "config-data") pod "60e62d5f-645e-4ef7-adb0-bedd550ade7e" (UID: "60e62d5f-645e-4ef7-adb0-bedd550ade7e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.630789 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60e62d5f-645e-4ef7-adb0-bedd550ade7e-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.630822 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60e62d5f-645e-4ef7-adb0-bedd550ade7e-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.630831 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbf55\" (UniqueName: \"kubernetes.io/projected/60e62d5f-645e-4ef7-adb0-bedd550ade7e-kube-api-access-lbf55\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.630842 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e62d5f-645e-4ef7-adb0-bedd550ade7e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.770555 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7aeedb7c-42c2-4739-a5df-01d4e7be5499" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.216:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.809028 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-477j6" Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.812447 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7aeedb7c-42c2-4739-a5df-01d4e7be5499" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.216:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.931478 4886 generic.go:334] "Generic (PLEG): container finished" podID="711130ba-065d-49aa-92cd-f687292f1674" containerID="5b06985dc1e48ba1f2f263f41b90df4b6f2b22d82ca1bbec83afddbc5e424d2a" exitCode=0 Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.931550 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-477j6" event={"ID":"711130ba-065d-49aa-92cd-f687292f1674","Type":"ContainerDied","Data":"5b06985dc1e48ba1f2f263f41b90df4b6f2b22d82ca1bbec83afddbc5e424d2a"} Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.931583 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-477j6" event={"ID":"711130ba-065d-49aa-92cd-f687292f1674","Type":"ContainerDied","Data":"3f5def5dd1a0b6f20ce330368dddd14621c18100a543e8f7a925fed970b85dee"} Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.931601 4886 scope.go:117] "RemoveContainer" containerID="5b06985dc1e48ba1f2f263f41b90df4b6f2b22d82ca1bbec83afddbc5e424d2a" Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.931746 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-477j6" Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.937194 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-config\") pod \"711130ba-065d-49aa-92cd-f687292f1674\" (UID: \"711130ba-065d-49aa-92cd-f687292f1674\") " Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.937230 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-ovsdbserver-nb\") pod \"711130ba-065d-49aa-92cd-f687292f1674\" (UID: \"711130ba-065d-49aa-92cd-f687292f1674\") " Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.937251 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xz55\" (UniqueName: \"kubernetes.io/projected/711130ba-065d-49aa-92cd-f687292f1674-kube-api-access-9xz55\") pod \"711130ba-065d-49aa-92cd-f687292f1674\" (UID: \"711130ba-065d-49aa-92cd-f687292f1674\") " Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.937298 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-dns-svc\") pod \"711130ba-065d-49aa-92cd-f687292f1674\" (UID: \"711130ba-065d-49aa-92cd-f687292f1674\") " Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.937369 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-dns-swift-storage-0\") pod \"711130ba-065d-49aa-92cd-f687292f1674\" (UID: \"711130ba-065d-49aa-92cd-f687292f1674\") " Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.937530 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-ovsdbserver-sb\") pod \"711130ba-065d-49aa-92cd-f687292f1674\" (UID: \"711130ba-065d-49aa-92cd-f687292f1674\") " Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.954976 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pt4tz" event={"ID":"60e62d5f-645e-4ef7-adb0-bedd550ade7e","Type":"ContainerDied","Data":"f51ebb453ff5def349e24be7aff52e06d01670b6e3d660e7adef87d7dca5599e"} Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.955022 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f51ebb453ff5def349e24be7aff52e06d01670b6e3d660e7adef87d7dca5599e" Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.955090 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pt4tz" Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.967888 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/711130ba-065d-49aa-92cd-f687292f1674-kube-api-access-9xz55" (OuterVolumeSpecName: "kube-api-access-9xz55") pod "711130ba-065d-49aa-92cd-f687292f1674" (UID: "711130ba-065d-49aa-92cd-f687292f1674"). InnerVolumeSpecName "kube-api-access-9xz55". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.967932 4886 generic.go:334] "Generic (PLEG): container finished" podID="0453f23e-955b-4cb7-8f57-285144677bc7" containerID="d486b5e38d7bdb2e2511e9e483fbb4cc13f693d0b87a9e43c033c303ef5053d2" exitCode=0 Mar 14 08:51:58 crc kubenswrapper[4886]: I0314 08:51:58.967976 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ggzbn" event={"ID":"0453f23e-955b-4cb7-8f57-285144677bc7","Type":"ContainerDied","Data":"d486b5e38d7bdb2e2511e9e483fbb4cc13f693d0b87a9e43c033c303ef5053d2"} Mar 14 08:51:59 crc kubenswrapper[4886]: I0314 08:51:59.024701 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "711130ba-065d-49aa-92cd-f687292f1674" (UID: "711130ba-065d-49aa-92cd-f687292f1674"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:51:59 crc kubenswrapper[4886]: I0314 08:51:59.040626 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xz55\" (UniqueName: \"kubernetes.io/projected/711130ba-065d-49aa-92cd-f687292f1674-kube-api-access-9xz55\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:59 crc kubenswrapper[4886]: I0314 08:51:59.040653 4886 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:59 crc kubenswrapper[4886]: I0314 08:51:59.064580 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "711130ba-065d-49aa-92cd-f687292f1674" (UID: "711130ba-065d-49aa-92cd-f687292f1674"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:51:59 crc kubenswrapper[4886]: I0314 08:51:59.072849 4886 scope.go:117] "RemoveContainer" containerID="eff5d350e42bd30e64d20a4da3dc344260b52b9e8f8d8fb3c6c68d8c16fb31e8" Mar 14 08:51:59 crc kubenswrapper[4886]: I0314 08:51:59.092912 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "711130ba-065d-49aa-92cd-f687292f1674" (UID: "711130ba-065d-49aa-92cd-f687292f1674"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:51:59 crc kubenswrapper[4886]: I0314 08:51:59.099461 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 08:51:59 crc kubenswrapper[4886]: I0314 08:51:59.109749 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-config" (OuterVolumeSpecName: "config") pod "711130ba-065d-49aa-92cd-f687292f1674" (UID: "711130ba-065d-49aa-92cd-f687292f1674"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:51:59 crc kubenswrapper[4886]: I0314 08:51:59.126327 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 08:51:59 crc kubenswrapper[4886]: I0314 08:51:59.126743 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "711130ba-065d-49aa-92cd-f687292f1674" (UID: "711130ba-065d-49aa-92cd-f687292f1674"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:51:59 crc kubenswrapper[4886]: I0314 08:51:59.127157 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7aeedb7c-42c2-4739-a5df-01d4e7be5499" containerName="nova-api-log" containerID="cri-o://b2454516f33e192c03af33d0a299a6c7dcc4dab5f08ce61b17b8fda8b14661b7" gracePeriod=30 Mar 14 08:51:59 crc kubenswrapper[4886]: I0314 08:51:59.127473 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7aeedb7c-42c2-4739-a5df-01d4e7be5499" containerName="nova-api-api" containerID="cri-o://7318ad10fcc1a881aaf5fdbfb9b980e95f8e77430a38c82cf5b810ad614ee404" gracePeriod=30 Mar 14 08:51:59 crc kubenswrapper[4886]: I0314 08:51:59.135388 4886 scope.go:117] "RemoveContainer" containerID="5b06985dc1e48ba1f2f263f41b90df4b6f2b22d82ca1bbec83afddbc5e424d2a" Mar 14 08:51:59 crc kubenswrapper[4886]: E0314 08:51:59.135804 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b06985dc1e48ba1f2f263f41b90df4b6f2b22d82ca1bbec83afddbc5e424d2a\": container with ID starting with 5b06985dc1e48ba1f2f263f41b90df4b6f2b22d82ca1bbec83afddbc5e424d2a not found: ID does not exist" containerID="5b06985dc1e48ba1f2f263f41b90df4b6f2b22d82ca1bbec83afddbc5e424d2a" Mar 14 08:51:59 crc kubenswrapper[4886]: I0314 08:51:59.135834 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b06985dc1e48ba1f2f263f41b90df4b6f2b22d82ca1bbec83afddbc5e424d2a"} err="failed to get container status \"5b06985dc1e48ba1f2f263f41b90df4b6f2b22d82ca1bbec83afddbc5e424d2a\": rpc error: code = NotFound desc = could not find container \"5b06985dc1e48ba1f2f263f41b90df4b6f2b22d82ca1bbec83afddbc5e424d2a\": container with ID starting with 5b06985dc1e48ba1f2f263f41b90df4b6f2b22d82ca1bbec83afddbc5e424d2a not found: ID does not exist" Mar 14 08:51:59 crc 
kubenswrapper[4886]: I0314 08:51:59.135854 4886 scope.go:117] "RemoveContainer" containerID="eff5d350e42bd30e64d20a4da3dc344260b52b9e8f8d8fb3c6c68d8c16fb31e8" Mar 14 08:51:59 crc kubenswrapper[4886]: E0314 08:51:59.136216 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eff5d350e42bd30e64d20a4da3dc344260b52b9e8f8d8fb3c6c68d8c16fb31e8\": container with ID starting with eff5d350e42bd30e64d20a4da3dc344260b52b9e8f8d8fb3c6c68d8c16fb31e8 not found: ID does not exist" containerID="eff5d350e42bd30e64d20a4da3dc344260b52b9e8f8d8fb3c6c68d8c16fb31e8" Mar 14 08:51:59 crc kubenswrapper[4886]: I0314 08:51:59.136251 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eff5d350e42bd30e64d20a4da3dc344260b52b9e8f8d8fb3c6c68d8c16fb31e8"} err="failed to get container status \"eff5d350e42bd30e64d20a4da3dc344260b52b9e8f8d8fb3c6c68d8c16fb31e8\": rpc error: code = NotFound desc = could not find container \"eff5d350e42bd30e64d20a4da3dc344260b52b9e8f8d8fb3c6c68d8c16fb31e8\": container with ID starting with eff5d350e42bd30e64d20a4da3dc344260b52b9e8f8d8fb3c6c68d8c16fb31e8 not found: ID does not exist" Mar 14 08:51:59 crc kubenswrapper[4886]: I0314 08:51:59.139527 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 08:51:59 crc kubenswrapper[4886]: I0314 08:51:59.142917 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:59 crc kubenswrapper[4886]: I0314 08:51:59.142940 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:59 crc kubenswrapper[4886]: I0314 08:51:59.142950 4886 reconciler_common.go:293] "Volume detached 
for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:59 crc kubenswrapper[4886]: I0314 08:51:59.142960 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/711130ba-065d-49aa-92cd-f687292f1674-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 08:51:59 crc kubenswrapper[4886]: I0314 08:51:59.272094 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-477j6"] Mar 14 08:51:59 crc kubenswrapper[4886]: I0314 08:51:59.280805 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-477j6"] Mar 14 08:51:59 crc kubenswrapper[4886]: I0314 08:51:59.434373 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="711130ba-065d-49aa-92cd-f687292f1674" path="/var/lib/kubelet/pods/711130ba-065d-49aa-92cd-f687292f1674/volumes" Mar 14 08:51:59 crc kubenswrapper[4886]: I0314 08:51:59.978839 4886 generic.go:334] "Generic (PLEG): container finished" podID="7aeedb7c-42c2-4739-a5df-01d4e7be5499" containerID="b2454516f33e192c03af33d0a299a6c7dcc4dab5f08ce61b17b8fda8b14661b7" exitCode=143 Mar 14 08:51:59 crc kubenswrapper[4886]: I0314 08:51:59.978936 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7aeedb7c-42c2-4739-a5df-01d4e7be5499","Type":"ContainerDied","Data":"b2454516f33e192c03af33d0a299a6c7dcc4dab5f08ce61b17b8fda8b14661b7"} Mar 14 08:51:59 crc kubenswrapper[4886]: I0314 08:51:59.980982 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07" containerName="nova-scheduler-scheduler" containerID="cri-o://4f6badc78a11864b36c1cf9518d65c8a2c53ffcf8bfda55a44e5321231a687d4" gracePeriod=30 Mar 14 08:51:59 crc kubenswrapper[4886]: I0314 08:51:59.981079 4886 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9a5e7203-60df-490c-bcac-26070f44f50e" containerName="nova-metadata-log" containerID="cri-o://77d383923129bd0e02843386543f37c2894b48502381d4b2d99e61c03cc9d740" gracePeriod=30 Mar 14 08:51:59 crc kubenswrapper[4886]: I0314 08:51:59.981138 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9a5e7203-60df-490c-bcac-26070f44f50e" containerName="nova-metadata-metadata" containerID="cri-o://a7ccb3cd8f3d0191cbf7da1f3fe27a70a82a8e9099c4544b0bef73574a0a65ac" gracePeriod=30 Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.157214 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557972-zfqj5"] Mar 14 08:52:00 crc kubenswrapper[4886]: E0314 08:52:00.157635 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60e62d5f-645e-4ef7-adb0-bedd550ade7e" containerName="nova-manage" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.157648 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="60e62d5f-645e-4ef7-adb0-bedd550ade7e" containerName="nova-manage" Mar 14 08:52:00 crc kubenswrapper[4886]: E0314 08:52:00.157664 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="711130ba-065d-49aa-92cd-f687292f1674" containerName="dnsmasq-dns" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.157670 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="711130ba-065d-49aa-92cd-f687292f1674" containerName="dnsmasq-dns" Mar 14 08:52:00 crc kubenswrapper[4886]: E0314 08:52:00.157698 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="711130ba-065d-49aa-92cd-f687292f1674" containerName="init" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.157705 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="711130ba-065d-49aa-92cd-f687292f1674" containerName="init" Mar 14 08:52:00 crc 
kubenswrapper[4886]: I0314 08:52:00.157896 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="711130ba-065d-49aa-92cd-f687292f1674" containerName="dnsmasq-dns" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.157909 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="60e62d5f-645e-4ef7-adb0-bedd550ade7e" containerName="nova-manage" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.158581 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557972-zfqj5" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.162430 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.163605 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.163738 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.173202 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557972-zfqj5"] Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.263501 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx969\" (UniqueName: \"kubernetes.io/projected/ed23514f-ddc6-4359-ada9-147ca9d19bf9-kube-api-access-tx969\") pod \"auto-csr-approver-29557972-zfqj5\" (UID: \"ed23514f-ddc6-4359-ada9-147ca9d19bf9\") " pod="openshift-infra/auto-csr-approver-29557972-zfqj5" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.365975 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx969\" (UniqueName: \"kubernetes.io/projected/ed23514f-ddc6-4359-ada9-147ca9d19bf9-kube-api-access-tx969\") pod 
\"auto-csr-approver-29557972-zfqj5\" (UID: \"ed23514f-ddc6-4359-ada9-147ca9d19bf9\") " pod="openshift-infra/auto-csr-approver-29557972-zfqj5" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.394859 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx969\" (UniqueName: \"kubernetes.io/projected/ed23514f-ddc6-4359-ada9-147ca9d19bf9-kube-api-access-tx969\") pod \"auto-csr-approver-29557972-zfqj5\" (UID: \"ed23514f-ddc6-4359-ada9-147ca9d19bf9\") " pod="openshift-infra/auto-csr-approver-29557972-zfqj5" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.428249 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ggzbn" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.537190 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.570677 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0453f23e-955b-4cb7-8f57-285144677bc7-combined-ca-bundle\") pod \"0453f23e-955b-4cb7-8f57-285144677bc7\" (UID: \"0453f23e-955b-4cb7-8f57-285144677bc7\") " Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.570743 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0453f23e-955b-4cb7-8f57-285144677bc7-config-data\") pod \"0453f23e-955b-4cb7-8f57-285144677bc7\" (UID: \"0453f23e-955b-4cb7-8f57-285144677bc7\") " Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.570808 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0453f23e-955b-4cb7-8f57-285144677bc7-scripts\") pod \"0453f23e-955b-4cb7-8f57-285144677bc7\" (UID: \"0453f23e-955b-4cb7-8f57-285144677bc7\") " Mar 14 08:52:00 crc 
kubenswrapper[4886]: I0314 08:52:00.571006 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q22g\" (UniqueName: \"kubernetes.io/projected/0453f23e-955b-4cb7-8f57-285144677bc7-kube-api-access-2q22g\") pod \"0453f23e-955b-4cb7-8f57-285144677bc7\" (UID: \"0453f23e-955b-4cb7-8f57-285144677bc7\") " Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.576492 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0453f23e-955b-4cb7-8f57-285144677bc7-kube-api-access-2q22g" (OuterVolumeSpecName: "kube-api-access-2q22g") pod "0453f23e-955b-4cb7-8f57-285144677bc7" (UID: "0453f23e-955b-4cb7-8f57-285144677bc7"). InnerVolumeSpecName "kube-api-access-2q22g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.589018 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0453f23e-955b-4cb7-8f57-285144677bc7-scripts" (OuterVolumeSpecName: "scripts") pod "0453f23e-955b-4cb7-8f57-285144677bc7" (UID: "0453f23e-955b-4cb7-8f57-285144677bc7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.602098 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557972-zfqj5" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.620792 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0453f23e-955b-4cb7-8f57-285144677bc7-config-data" (OuterVolumeSpecName: "config-data") pod "0453f23e-955b-4cb7-8f57-285144677bc7" (UID: "0453f23e-955b-4cb7-8f57-285144677bc7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.622472 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0453f23e-955b-4cb7-8f57-285144677bc7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0453f23e-955b-4cb7-8f57-285144677bc7" (UID: "0453f23e-955b-4cb7-8f57-285144677bc7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.675810 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6v49\" (UniqueName: \"kubernetes.io/projected/9a5e7203-60df-490c-bcac-26070f44f50e-kube-api-access-z6v49\") pod \"9a5e7203-60df-490c-bcac-26070f44f50e\" (UID: \"9a5e7203-60df-490c-bcac-26070f44f50e\") " Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.675870 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a5e7203-60df-490c-bcac-26070f44f50e-config-data\") pod \"9a5e7203-60df-490c-bcac-26070f44f50e\" (UID: \"9a5e7203-60df-490c-bcac-26070f44f50e\") " Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.675928 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a5e7203-60df-490c-bcac-26070f44f50e-logs\") pod \"9a5e7203-60df-490c-bcac-26070f44f50e\" (UID: \"9a5e7203-60df-490c-bcac-26070f44f50e\") " Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.675993 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a5e7203-60df-490c-bcac-26070f44f50e-nova-metadata-tls-certs\") pod \"9a5e7203-60df-490c-bcac-26070f44f50e\" (UID: \"9a5e7203-60df-490c-bcac-26070f44f50e\") " Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.676329 4886 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a5e7203-60df-490c-bcac-26070f44f50e-combined-ca-bundle\") pod \"9a5e7203-60df-490c-bcac-26070f44f50e\" (UID: \"9a5e7203-60df-490c-bcac-26070f44f50e\") " Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.676580 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a5e7203-60df-490c-bcac-26070f44f50e-logs" (OuterVolumeSpecName: "logs") pod "9a5e7203-60df-490c-bcac-26070f44f50e" (UID: "9a5e7203-60df-490c-bcac-26070f44f50e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.676859 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a5e7203-60df-490c-bcac-26070f44f50e-logs\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.676879 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q22g\" (UniqueName: \"kubernetes.io/projected/0453f23e-955b-4cb7-8f57-285144677bc7-kube-api-access-2q22g\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.676890 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0453f23e-955b-4cb7-8f57-285144677bc7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.676900 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0453f23e-955b-4cb7-8f57-285144677bc7-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.676908 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0453f23e-955b-4cb7-8f57-285144677bc7-scripts\") on node \"crc\" 
DevicePath \"\"" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.708628 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a5e7203-60df-490c-bcac-26070f44f50e-kube-api-access-z6v49" (OuterVolumeSpecName: "kube-api-access-z6v49") pod "9a5e7203-60df-490c-bcac-26070f44f50e" (UID: "9a5e7203-60df-490c-bcac-26070f44f50e"). InnerVolumeSpecName "kube-api-access-z6v49". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.710428 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a5e7203-60df-490c-bcac-26070f44f50e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a5e7203-60df-490c-bcac-26070f44f50e" (UID: "9a5e7203-60df-490c-bcac-26070f44f50e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.741978 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a5e7203-60df-490c-bcac-26070f44f50e-config-data" (OuterVolumeSpecName: "config-data") pod "9a5e7203-60df-490c-bcac-26070f44f50e" (UID: "9a5e7203-60df-490c-bcac-26070f44f50e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.767302 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a5e7203-60df-490c-bcac-26070f44f50e-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "9a5e7203-60df-490c-bcac-26070f44f50e" (UID: "9a5e7203-60df-490c-bcac-26070f44f50e"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.780159 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a5e7203-60df-490c-bcac-26070f44f50e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.780196 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6v49\" (UniqueName: \"kubernetes.io/projected/9a5e7203-60df-490c-bcac-26070f44f50e-kube-api-access-z6v49\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.780208 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a5e7203-60df-490c-bcac-26070f44f50e-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.780217 4886 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a5e7203-60df-490c-bcac-26070f44f50e-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.990965 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ggzbn" event={"ID":"0453f23e-955b-4cb7-8f57-285144677bc7","Type":"ContainerDied","Data":"5dedfdf577defec319a0ed67521c1383dfd36b02e381cd721d63ce3cf003a5d2"} Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.991010 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5dedfdf577defec319a0ed67521c1383dfd36b02e381cd721d63ce3cf003a5d2" Mar 14 08:52:00 crc kubenswrapper[4886]: I0314 08:52:00.991061 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ggzbn" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.003753 4886 generic.go:334] "Generic (PLEG): container finished" podID="9a5e7203-60df-490c-bcac-26070f44f50e" containerID="a7ccb3cd8f3d0191cbf7da1f3fe27a70a82a8e9099c4544b0bef73574a0a65ac" exitCode=0 Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.003781 4886 generic.go:334] "Generic (PLEG): container finished" podID="9a5e7203-60df-490c-bcac-26070f44f50e" containerID="77d383923129bd0e02843386543f37c2894b48502381d4b2d99e61c03cc9d740" exitCode=143 Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.003802 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9a5e7203-60df-490c-bcac-26070f44f50e","Type":"ContainerDied","Data":"a7ccb3cd8f3d0191cbf7da1f3fe27a70a82a8e9099c4544b0bef73574a0a65ac"} Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.003830 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9a5e7203-60df-490c-bcac-26070f44f50e","Type":"ContainerDied","Data":"77d383923129bd0e02843386543f37c2894b48502381d4b2d99e61c03cc9d740"} Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.003843 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9a5e7203-60df-490c-bcac-26070f44f50e","Type":"ContainerDied","Data":"b331d61a4cf43c4f2e53f4f4ab128e3783f8831f6285f49c7330588ec4c60446"} Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.003859 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.003859 4886 scope.go:117] "RemoveContainer" containerID="a7ccb3cd8f3d0191cbf7da1f3fe27a70a82a8e9099c4544b0bef73574a0a65ac" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.044402 4886 scope.go:117] "RemoveContainer" containerID="77d383923129bd0e02843386543f37c2894b48502381d4b2d99e61c03cc9d740" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.052837 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.070242 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.082267 4886 scope.go:117] "RemoveContainer" containerID="a7ccb3cd8f3d0191cbf7da1f3fe27a70a82a8e9099c4544b0bef73574a0a65ac" Mar 14 08:52:01 crc kubenswrapper[4886]: E0314 08:52:01.087778 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7ccb3cd8f3d0191cbf7da1f3fe27a70a82a8e9099c4544b0bef73574a0a65ac\": container with ID starting with a7ccb3cd8f3d0191cbf7da1f3fe27a70a82a8e9099c4544b0bef73574a0a65ac not found: ID does not exist" containerID="a7ccb3cd8f3d0191cbf7da1f3fe27a70a82a8e9099c4544b0bef73574a0a65ac" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.087827 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7ccb3cd8f3d0191cbf7da1f3fe27a70a82a8e9099c4544b0bef73574a0a65ac"} err="failed to get container status \"a7ccb3cd8f3d0191cbf7da1f3fe27a70a82a8e9099c4544b0bef73574a0a65ac\": rpc error: code = NotFound desc = could not find container \"a7ccb3cd8f3d0191cbf7da1f3fe27a70a82a8e9099c4544b0bef73574a0a65ac\": container with ID starting with a7ccb3cd8f3d0191cbf7da1f3fe27a70a82a8e9099c4544b0bef73574a0a65ac not found: ID does not exist" Mar 14 08:52:01 crc kubenswrapper[4886]: 
I0314 08:52:01.087858 4886 scope.go:117] "RemoveContainer" containerID="77d383923129bd0e02843386543f37c2894b48502381d4b2d99e61c03cc9d740" Mar 14 08:52:01 crc kubenswrapper[4886]: E0314 08:52:01.088354 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77d383923129bd0e02843386543f37c2894b48502381d4b2d99e61c03cc9d740\": container with ID starting with 77d383923129bd0e02843386543f37c2894b48502381d4b2d99e61c03cc9d740 not found: ID does not exist" containerID="77d383923129bd0e02843386543f37c2894b48502381d4b2d99e61c03cc9d740" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.088377 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77d383923129bd0e02843386543f37c2894b48502381d4b2d99e61c03cc9d740"} err="failed to get container status \"77d383923129bd0e02843386543f37c2894b48502381d4b2d99e61c03cc9d740\": rpc error: code = NotFound desc = could not find container \"77d383923129bd0e02843386543f37c2894b48502381d4b2d99e61c03cc9d740\": container with ID starting with 77d383923129bd0e02843386543f37c2894b48502381d4b2d99e61c03cc9d740 not found: ID does not exist" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.088389 4886 scope.go:117] "RemoveContainer" containerID="a7ccb3cd8f3d0191cbf7da1f3fe27a70a82a8e9099c4544b0bef73574a0a65ac" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.088665 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7ccb3cd8f3d0191cbf7da1f3fe27a70a82a8e9099c4544b0bef73574a0a65ac"} err="failed to get container status \"a7ccb3cd8f3d0191cbf7da1f3fe27a70a82a8e9099c4544b0bef73574a0a65ac\": rpc error: code = NotFound desc = could not find container \"a7ccb3cd8f3d0191cbf7da1f3fe27a70a82a8e9099c4544b0bef73574a0a65ac\": container with ID starting with a7ccb3cd8f3d0191cbf7da1f3fe27a70a82a8e9099c4544b0bef73574a0a65ac not found: ID does not exist" Mar 14 08:52:01 crc 
kubenswrapper[4886]: I0314 08:52:01.088682 4886 scope.go:117] "RemoveContainer" containerID="77d383923129bd0e02843386543f37c2894b48502381d4b2d99e61c03cc9d740" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.088882 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77d383923129bd0e02843386543f37c2894b48502381d4b2d99e61c03cc9d740"} err="failed to get container status \"77d383923129bd0e02843386543f37c2894b48502381d4b2d99e61c03cc9d740\": rpc error: code = NotFound desc = could not find container \"77d383923129bd0e02843386543f37c2894b48502381d4b2d99e61c03cc9d740\": container with ID starting with 77d383923129bd0e02843386543f37c2894b48502381d4b2d99e61c03cc9d740 not found: ID does not exist" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.096915 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 14 08:52:01 crc kubenswrapper[4886]: E0314 08:52:01.097471 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a5e7203-60df-490c-bcac-26070f44f50e" containerName="nova-metadata-log" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.097491 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a5e7203-60df-490c-bcac-26070f44f50e" containerName="nova-metadata-log" Mar 14 08:52:01 crc kubenswrapper[4886]: E0314 08:52:01.097520 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a5e7203-60df-490c-bcac-26070f44f50e" containerName="nova-metadata-metadata" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.097527 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a5e7203-60df-490c-bcac-26070f44f50e" containerName="nova-metadata-metadata" Mar 14 08:52:01 crc kubenswrapper[4886]: E0314 08:52:01.097548 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0453f23e-955b-4cb7-8f57-285144677bc7" containerName="nova-cell1-conductor-db-sync" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.097558 4886 
state_mem.go:107] "Deleted CPUSet assignment" podUID="0453f23e-955b-4cb7-8f57-285144677bc7" containerName="nova-cell1-conductor-db-sync" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.097788 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a5e7203-60df-490c-bcac-26070f44f50e" containerName="nova-metadata-metadata" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.097813 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="0453f23e-955b-4cb7-8f57-285144677bc7" containerName="nova-cell1-conductor-db-sync" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.097832 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a5e7203-60df-490c-bcac-26070f44f50e" containerName="nova-metadata-log" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.099055 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.099167 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.106081 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.114589 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.131362 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.134685 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.137479 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.162331 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.171629 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557972-zfqj5"] Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.197953 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a601c8-c71a-41d8-b877-75ef0e2ac892-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"43a601c8-c71a-41d8-b877-75ef0e2ac892\") " pod="openstack/nova-metadata-0" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.198000 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43a601c8-c71a-41d8-b877-75ef0e2ac892-logs\") pod \"nova-metadata-0\" (UID: \"43a601c8-c71a-41d8-b877-75ef0e2ac892\") " pod="openstack/nova-metadata-0" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.198061 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv9zp\" (UniqueName: \"kubernetes.io/projected/28887cc9-3fde-46ad-b166-92d820ad7689-kube-api-access-kv9zp\") pod \"nova-cell1-conductor-0\" (UID: \"28887cc9-3fde-46ad-b166-92d820ad7689\") " pod="openstack/nova-cell1-conductor-0" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.198081 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43a601c8-c71a-41d8-b877-75ef0e2ac892-config-data\") 
pod \"nova-metadata-0\" (UID: \"43a601c8-c71a-41d8-b877-75ef0e2ac892\") " pod="openstack/nova-metadata-0" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.198100 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28887cc9-3fde-46ad-b166-92d820ad7689-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"28887cc9-3fde-46ad-b166-92d820ad7689\") " pod="openstack/nova-cell1-conductor-0" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.198129 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/43a601c8-c71a-41d8-b877-75ef0e2ac892-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"43a601c8-c71a-41d8-b877-75ef0e2ac892\") " pod="openstack/nova-metadata-0" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.198161 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28887cc9-3fde-46ad-b166-92d820ad7689-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"28887cc9-3fde-46ad-b166-92d820ad7689\") " pod="openstack/nova-cell1-conductor-0" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.198206 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl55j\" (UniqueName: \"kubernetes.io/projected/43a601c8-c71a-41d8-b877-75ef0e2ac892-kube-api-access-bl55j\") pod \"nova-metadata-0\" (UID: \"43a601c8-c71a-41d8-b877-75ef0e2ac892\") " pod="openstack/nova-metadata-0" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.299941 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv9zp\" (UniqueName: \"kubernetes.io/projected/28887cc9-3fde-46ad-b166-92d820ad7689-kube-api-access-kv9zp\") pod \"nova-cell1-conductor-0\" (UID: 
\"28887cc9-3fde-46ad-b166-92d820ad7689\") " pod="openstack/nova-cell1-conductor-0" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.299993 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43a601c8-c71a-41d8-b877-75ef0e2ac892-config-data\") pod \"nova-metadata-0\" (UID: \"43a601c8-c71a-41d8-b877-75ef0e2ac892\") " pod="openstack/nova-metadata-0" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.300017 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28887cc9-3fde-46ad-b166-92d820ad7689-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"28887cc9-3fde-46ad-b166-92d820ad7689\") " pod="openstack/nova-cell1-conductor-0" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.300037 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/43a601c8-c71a-41d8-b877-75ef0e2ac892-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"43a601c8-c71a-41d8-b877-75ef0e2ac892\") " pod="openstack/nova-metadata-0" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.300068 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28887cc9-3fde-46ad-b166-92d820ad7689-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"28887cc9-3fde-46ad-b166-92d820ad7689\") " pod="openstack/nova-cell1-conductor-0" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.300128 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl55j\" (UniqueName: \"kubernetes.io/projected/43a601c8-c71a-41d8-b877-75ef0e2ac892-kube-api-access-bl55j\") pod \"nova-metadata-0\" (UID: \"43a601c8-c71a-41d8-b877-75ef0e2ac892\") " pod="openstack/nova-metadata-0" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 
08:52:01.300181 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a601c8-c71a-41d8-b877-75ef0e2ac892-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"43a601c8-c71a-41d8-b877-75ef0e2ac892\") " pod="openstack/nova-metadata-0" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.300212 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43a601c8-c71a-41d8-b877-75ef0e2ac892-logs\") pod \"nova-metadata-0\" (UID: \"43a601c8-c71a-41d8-b877-75ef0e2ac892\") " pod="openstack/nova-metadata-0" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.300561 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43a601c8-c71a-41d8-b877-75ef0e2ac892-logs\") pod \"nova-metadata-0\" (UID: \"43a601c8-c71a-41d8-b877-75ef0e2ac892\") " pod="openstack/nova-metadata-0" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.306690 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/43a601c8-c71a-41d8-b877-75ef0e2ac892-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"43a601c8-c71a-41d8-b877-75ef0e2ac892\") " pod="openstack/nova-metadata-0" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.306689 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43a601c8-c71a-41d8-b877-75ef0e2ac892-config-data\") pod \"nova-metadata-0\" (UID: \"43a601c8-c71a-41d8-b877-75ef0e2ac892\") " pod="openstack/nova-metadata-0" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.306879 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28887cc9-3fde-46ad-b166-92d820ad7689-config-data\") pod \"nova-cell1-conductor-0\" (UID: 
\"28887cc9-3fde-46ad-b166-92d820ad7689\") " pod="openstack/nova-cell1-conductor-0" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.307709 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a601c8-c71a-41d8-b877-75ef0e2ac892-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"43a601c8-c71a-41d8-b877-75ef0e2ac892\") " pod="openstack/nova-metadata-0" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.311099 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28887cc9-3fde-46ad-b166-92d820ad7689-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"28887cc9-3fde-46ad-b166-92d820ad7689\") " pod="openstack/nova-cell1-conductor-0" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.316045 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv9zp\" (UniqueName: \"kubernetes.io/projected/28887cc9-3fde-46ad-b166-92d820ad7689-kube-api-access-kv9zp\") pod \"nova-cell1-conductor-0\" (UID: \"28887cc9-3fde-46ad-b166-92d820ad7689\") " pod="openstack/nova-cell1-conductor-0" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.318183 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl55j\" (UniqueName: \"kubernetes.io/projected/43a601c8-c71a-41d8-b877-75ef0e2ac892-kube-api-access-bl55j\") pod \"nova-metadata-0\" (UID: \"43a601c8-c71a-41d8-b877-75ef0e2ac892\") " pod="openstack/nova-metadata-0" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.431389 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a5e7203-60df-490c-bcac-26070f44f50e" path="/var/lib/kubelet/pods/9a5e7203-60df-490c-bcac-26070f44f50e/volumes" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.432880 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 08:52:01 crc kubenswrapper[4886]: I0314 08:52:01.456846 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 14 08:52:02 crc kubenswrapper[4886]: I0314 08:52:02.013471 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557972-zfqj5" event={"ID":"ed23514f-ddc6-4359-ada9-147ca9d19bf9","Type":"ContainerStarted","Data":"d3d14a0694b60d2a1ce96d10bb82ad7ec7f65818045145aab7c536006e81674b"} Mar 14 08:52:02 crc kubenswrapper[4886]: I0314 08:52:02.015002 4886 generic.go:334] "Generic (PLEG): container finished" podID="cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07" containerID="4f6badc78a11864b36c1cf9518d65c8a2c53ffcf8bfda55a44e5321231a687d4" exitCode=0 Mar 14 08:52:02 crc kubenswrapper[4886]: I0314 08:52:02.015054 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07","Type":"ContainerDied","Data":"4f6badc78a11864b36c1cf9518d65c8a2c53ffcf8bfda55a44e5321231a687d4"} Mar 14 08:52:02 crc kubenswrapper[4886]: E0314 08:52:02.559148 4886 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4f6badc78a11864b36c1cf9518d65c8a2c53ffcf8bfda55a44e5321231a687d4 is running failed: container process not found" containerID="4f6badc78a11864b36c1cf9518d65c8a2c53ffcf8bfda55a44e5321231a687d4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 08:52:02 crc kubenswrapper[4886]: E0314 08:52:02.561377 4886 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4f6badc78a11864b36c1cf9518d65c8a2c53ffcf8bfda55a44e5321231a687d4 is running failed: container process not found" containerID="4f6badc78a11864b36c1cf9518d65c8a2c53ffcf8bfda55a44e5321231a687d4" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 08:52:02 crc kubenswrapper[4886]: E0314 08:52:02.562858 4886 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4f6badc78a11864b36c1cf9518d65c8a2c53ffcf8bfda55a44e5321231a687d4 is running failed: container process not found" containerID="4f6badc78a11864b36c1cf9518d65c8a2c53ffcf8bfda55a44e5321231a687d4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 14 08:52:02 crc kubenswrapper[4886]: E0314 08:52:02.562898 4886 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4f6badc78a11864b36c1cf9518d65c8a2c53ffcf8bfda55a44e5321231a687d4 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07" containerName="nova-scheduler-scheduler" Mar 14 08:52:02 crc kubenswrapper[4886]: I0314 08:52:02.752150 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 08:52:02 crc kubenswrapper[4886]: W0314 08:52:02.854716 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43a601c8_c71a_41d8_b877_75ef0e2ac892.slice/crio-e49bd7f35982f0381fa4937d05d7d3cb0bdf0ff3040262385767f29074333c24 WatchSource:0}: Error finding container e49bd7f35982f0381fa4937d05d7d3cb0bdf0ff3040262385767f29074333c24: Status 404 returned error can't find the container with id e49bd7f35982f0381fa4937d05d7d3cb0bdf0ff3040262385767f29074333c24 Mar 14 08:52:02 crc kubenswrapper[4886]: I0314 08:52:02.869260 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxkqb\" (UniqueName: \"kubernetes.io/projected/cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07-kube-api-access-lxkqb\") pod \"cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07\" (UID: \"cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07\") " Mar 14 08:52:02 crc kubenswrapper[4886]: I0314 08:52:02.869493 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07-config-data\") pod \"cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07\" (UID: \"cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07\") " Mar 14 08:52:02 crc kubenswrapper[4886]: I0314 08:52:02.869750 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07-combined-ca-bundle\") pod \"cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07\" (UID: \"cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07\") " Mar 14 08:52:02 crc kubenswrapper[4886]: I0314 08:52:02.875966 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07-kube-api-access-lxkqb" (OuterVolumeSpecName: "kube-api-access-lxkqb") pod "cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07" (UID: 
"cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07"). InnerVolumeSpecName "kube-api-access-lxkqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:52:02 crc kubenswrapper[4886]: I0314 08:52:02.877213 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 08:52:02 crc kubenswrapper[4886]: I0314 08:52:02.900585 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 14 08:52:02 crc kubenswrapper[4886]: I0314 08:52:02.906614 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07" (UID: "cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:02 crc kubenswrapper[4886]: I0314 08:52:02.936028 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07-config-data" (OuterVolumeSpecName: "config-data") pod "cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07" (UID: "cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:02 crc kubenswrapper[4886]: I0314 08:52:02.973818 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:02 crc kubenswrapper[4886]: I0314 08:52:02.974049 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxkqb\" (UniqueName: \"kubernetes.io/projected/cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07-kube-api-access-lxkqb\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:02 crc kubenswrapper[4886]: I0314 08:52:02.974139 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:03 crc kubenswrapper[4886]: I0314 08:52:03.029565 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"43a601c8-c71a-41d8-b877-75ef0e2ac892","Type":"ContainerStarted","Data":"e49bd7f35982f0381fa4937d05d7d3cb0bdf0ff3040262385767f29074333c24"} Mar 14 08:52:03 crc kubenswrapper[4886]: I0314 08:52:03.031752 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"28887cc9-3fde-46ad-b166-92d820ad7689","Type":"ContainerStarted","Data":"84e06d0c15eb8f8c68bf11e116e8706253cbd7a106cac105e3e320d381ecfabc"} Mar 14 08:52:03 crc kubenswrapper[4886]: I0314 08:52:03.034834 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557972-zfqj5" event={"ID":"ed23514f-ddc6-4359-ada9-147ca9d19bf9","Type":"ContainerStarted","Data":"e0167f4101121823b1f0f5c81ed6c66a3164b73927b29003799d7f631c6248be"} Mar 14 08:52:03 crc kubenswrapper[4886]: I0314 08:52:03.037573 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07","Type":"ContainerDied","Data":"b82582ca96288593e337a004c597d3281346ceab1c6bb1a173413fdca79c1b1b"} Mar 14 08:52:03 crc kubenswrapper[4886]: I0314 08:52:03.037736 4886 scope.go:117] "RemoveContainer" containerID="4f6badc78a11864b36c1cf9518d65c8a2c53ffcf8bfda55a44e5321231a687d4" Mar 14 08:52:03 crc kubenswrapper[4886]: I0314 08:52:03.037625 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 08:52:03 crc kubenswrapper[4886]: I0314 08:52:03.125537 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 08:52:03 crc kubenswrapper[4886]: I0314 08:52:03.136838 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 08:52:03 crc kubenswrapper[4886]: I0314 08:52:03.149495 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 08:52:03 crc kubenswrapper[4886]: E0314 08:52:03.149941 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07" containerName="nova-scheduler-scheduler" Mar 14 08:52:03 crc kubenswrapper[4886]: I0314 08:52:03.149957 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07" containerName="nova-scheduler-scheduler" Mar 14 08:52:03 crc kubenswrapper[4886]: I0314 08:52:03.152373 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07" containerName="nova-scheduler-scheduler" Mar 14 08:52:03 crc kubenswrapper[4886]: I0314 08:52:03.153073 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 08:52:03 crc kubenswrapper[4886]: I0314 08:52:03.154525 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 14 08:52:03 crc kubenswrapper[4886]: I0314 08:52:03.165546 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 08:52:03 crc kubenswrapper[4886]: I0314 08:52:03.281372 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2brj\" (UniqueName: \"kubernetes.io/projected/0cc17f47-abd0-4f59-a626-d40a5a83f9cb-kube-api-access-q2brj\") pod \"nova-scheduler-0\" (UID: \"0cc17f47-abd0-4f59-a626-d40a5a83f9cb\") " pod="openstack/nova-scheduler-0" Mar 14 08:52:03 crc kubenswrapper[4886]: I0314 08:52:03.281689 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cc17f47-abd0-4f59-a626-d40a5a83f9cb-config-data\") pod \"nova-scheduler-0\" (UID: \"0cc17f47-abd0-4f59-a626-d40a5a83f9cb\") " pod="openstack/nova-scheduler-0" Mar 14 08:52:03 crc kubenswrapper[4886]: I0314 08:52:03.282024 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc17f47-abd0-4f59-a626-d40a5a83f9cb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0cc17f47-abd0-4f59-a626-d40a5a83f9cb\") " pod="openstack/nova-scheduler-0" Mar 14 08:52:03 crc kubenswrapper[4886]: I0314 08:52:03.384497 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2brj\" (UniqueName: \"kubernetes.io/projected/0cc17f47-abd0-4f59-a626-d40a5a83f9cb-kube-api-access-q2brj\") pod \"nova-scheduler-0\" (UID: \"0cc17f47-abd0-4f59-a626-d40a5a83f9cb\") " pod="openstack/nova-scheduler-0" Mar 14 08:52:03 crc kubenswrapper[4886]: I0314 08:52:03.384569 4886 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cc17f47-abd0-4f59-a626-d40a5a83f9cb-config-data\") pod \"nova-scheduler-0\" (UID: \"0cc17f47-abd0-4f59-a626-d40a5a83f9cb\") " pod="openstack/nova-scheduler-0" Mar 14 08:52:03 crc kubenswrapper[4886]: I0314 08:52:03.384657 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc17f47-abd0-4f59-a626-d40a5a83f9cb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0cc17f47-abd0-4f59-a626-d40a5a83f9cb\") " pod="openstack/nova-scheduler-0" Mar 14 08:52:03 crc kubenswrapper[4886]: I0314 08:52:03.389874 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc17f47-abd0-4f59-a626-d40a5a83f9cb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0cc17f47-abd0-4f59-a626-d40a5a83f9cb\") " pod="openstack/nova-scheduler-0" Mar 14 08:52:03 crc kubenswrapper[4886]: I0314 08:52:03.390072 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cc17f47-abd0-4f59-a626-d40a5a83f9cb-config-data\") pod \"nova-scheduler-0\" (UID: \"0cc17f47-abd0-4f59-a626-d40a5a83f9cb\") " pod="openstack/nova-scheduler-0" Mar 14 08:52:03 crc kubenswrapper[4886]: I0314 08:52:03.390831 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 08:52:03 crc kubenswrapper[4886]: I0314 08:52:03.391045 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="9dafb8d0-a7d6-4426-a52d-19408d20b8b3" containerName="kube-state-metrics" containerID="cri-o://e8e37c5e84bb5d7d7cbdf31e1805acd3a8c8d583fd75a6ea79b611d8fc6fcafd" gracePeriod=30 Mar 14 08:52:03 crc kubenswrapper[4886]: I0314 08:52:03.406934 4886 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-q2brj\" (UniqueName: \"kubernetes.io/projected/0cc17f47-abd0-4f59-a626-d40a5a83f9cb-kube-api-access-q2brj\") pod \"nova-scheduler-0\" (UID: \"0cc17f47-abd0-4f59-a626-d40a5a83f9cb\") " pod="openstack/nova-scheduler-0" Mar 14 08:52:03 crc kubenswrapper[4886]: I0314 08:52:03.432046 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07" path="/var/lib/kubelet/pods/cd5d9613-2c7d-4e7a-8e8e-a1da5cb89e07/volumes" Mar 14 08:52:03 crc kubenswrapper[4886]: I0314 08:52:03.477009 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 08:52:03 crc kubenswrapper[4886]: I0314 08:52:03.941943 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.051678 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"43a601c8-c71a-41d8-b877-75ef0e2ac892","Type":"ContainerStarted","Data":"2a691de6262171fb9d7fce2e541695269898b84653e49f72e90e0a6a94fd3e0e"} Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.051729 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"43a601c8-c71a-41d8-b877-75ef0e2ac892","Type":"ContainerStarted","Data":"a50341c7adca3532b22effb46bcacce9aed90f04e23384a5c2ed06b56bf4522f"} Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.056661 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"28887cc9-3fde-46ad-b166-92d820ad7689","Type":"ContainerStarted","Data":"8c7125fd67f59ae35bf922ec4e5ef1f4c910afb7a8052ef4bb0b2353d36ab914"} Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.057618 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 
08:52:04.063083 4886 generic.go:334] "Generic (PLEG): container finished" podID="ed23514f-ddc6-4359-ada9-147ca9d19bf9" containerID="e0167f4101121823b1f0f5c81ed6c66a3164b73927b29003799d7f631c6248be" exitCode=0 Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.063246 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557972-zfqj5" event={"ID":"ed23514f-ddc6-4359-ada9-147ca9d19bf9","Type":"ContainerDied","Data":"e0167f4101121823b1f0f5c81ed6c66a3164b73927b29003799d7f631c6248be"} Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.067565 4886 generic.go:334] "Generic (PLEG): container finished" podID="9dafb8d0-a7d6-4426-a52d-19408d20b8b3" containerID="e8e37c5e84bb5d7d7cbdf31e1805acd3a8c8d583fd75a6ea79b611d8fc6fcafd" exitCode=2 Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.067612 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9dafb8d0-a7d6-4426-a52d-19408d20b8b3","Type":"ContainerDied","Data":"e8e37c5e84bb5d7d7cbdf31e1805acd3a8c8d583fd75a6ea79b611d8fc6fcafd"} Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.067654 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9dafb8d0-a7d6-4426-a52d-19408d20b8b3","Type":"ContainerDied","Data":"ba0eefbc0f806efb0d8b92d434cfea8e1e12f14165379848bb0594a2a72868e6"} Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.067674 4886 scope.go:117] "RemoveContainer" containerID="e8e37c5e84bb5d7d7cbdf31e1805acd3a8c8d583fd75a6ea79b611d8fc6fcafd" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.067849 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.092190 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.092166212 podStartE2EDuration="3.092166212s" podCreationTimestamp="2026-03-14 08:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:52:04.072634948 +0000 UTC m=+1459.321086605" watchObservedRunningTime="2026-03-14 08:52:04.092166212 +0000 UTC m=+1459.340617839" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.102638 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjjqg\" (UniqueName: \"kubernetes.io/projected/9dafb8d0-a7d6-4426-a52d-19408d20b8b3-kube-api-access-kjjqg\") pod \"9dafb8d0-a7d6-4426-a52d-19408d20b8b3\" (UID: \"9dafb8d0-a7d6-4426-a52d-19408d20b8b3\") " Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.113269 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dafb8d0-a7d6-4426-a52d-19408d20b8b3-kube-api-access-kjjqg" (OuterVolumeSpecName: "kube-api-access-kjjqg") pod "9dafb8d0-a7d6-4426-a52d-19408d20b8b3" (UID: "9dafb8d0-a7d6-4426-a52d-19408d20b8b3"). InnerVolumeSpecName "kube-api-access-kjjqg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.113662 4886 scope.go:117] "RemoveContainer" containerID="e8e37c5e84bb5d7d7cbdf31e1805acd3a8c8d583fd75a6ea79b611d8fc6fcafd" Mar 14 08:52:04 crc kubenswrapper[4886]: E0314 08:52:04.114103 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8e37c5e84bb5d7d7cbdf31e1805acd3a8c8d583fd75a6ea79b611d8fc6fcafd\": container with ID starting with e8e37c5e84bb5d7d7cbdf31e1805acd3a8c8d583fd75a6ea79b611d8fc6fcafd not found: ID does not exist" containerID="e8e37c5e84bb5d7d7cbdf31e1805acd3a8c8d583fd75a6ea79b611d8fc6fcafd" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.114169 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8e37c5e84bb5d7d7cbdf31e1805acd3a8c8d583fd75a6ea79b611d8fc6fcafd"} err="failed to get container status \"e8e37c5e84bb5d7d7cbdf31e1805acd3a8c8d583fd75a6ea79b611d8fc6fcafd\": rpc error: code = NotFound desc = could not find container \"e8e37c5e84bb5d7d7cbdf31e1805acd3a8c8d583fd75a6ea79b611d8fc6fcafd\": container with ID starting with e8e37c5e84bb5d7d7cbdf31e1805acd3a8c8d583fd75a6ea79b611d8fc6fcafd not found: ID does not exist" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.123344 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.123322127 podStartE2EDuration="3.123322127s" podCreationTimestamp="2026-03-14 08:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:52:04.103516384 +0000 UTC m=+1459.351968011" watchObservedRunningTime="2026-03-14 08:52:04.123322127 +0000 UTC m=+1459.371773764" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.155497 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-scheduler-0"] Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.212731 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjjqg\" (UniqueName: \"kubernetes.io/projected/9dafb8d0-a7d6-4426-a52d-19408d20b8b3-kube-api-access-kjjqg\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.419175 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.428884 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557972-zfqj5" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.440558 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.466334 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 08:52:04 crc kubenswrapper[4886]: E0314 08:52:04.466851 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed23514f-ddc6-4359-ada9-147ca9d19bf9" containerName="oc" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.466875 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed23514f-ddc6-4359-ada9-147ca9d19bf9" containerName="oc" Mar 14 08:52:04 crc kubenswrapper[4886]: E0314 08:52:04.466905 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dafb8d0-a7d6-4426-a52d-19408d20b8b3" containerName="kube-state-metrics" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.466913 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dafb8d0-a7d6-4426-a52d-19408d20b8b3" containerName="kube-state-metrics" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.467204 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed23514f-ddc6-4359-ada9-147ca9d19bf9" containerName="oc" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 
08:52:04.467225 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dafb8d0-a7d6-4426-a52d-19408d20b8b3" containerName="kube-state-metrics" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.468052 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.471701 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.471962 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.477963 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.519858 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx969\" (UniqueName: \"kubernetes.io/projected/ed23514f-ddc6-4359-ada9-147ca9d19bf9-kube-api-access-tx969\") pod \"ed23514f-ddc6-4359-ada9-147ca9d19bf9\" (UID: \"ed23514f-ddc6-4359-ada9-147ca9d19bf9\") " Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.530033 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed23514f-ddc6-4359-ada9-147ca9d19bf9-kube-api-access-tx969" (OuterVolumeSpecName: "kube-api-access-tx969") pod "ed23514f-ddc6-4359-ada9-147ca9d19bf9" (UID: "ed23514f-ddc6-4359-ada9-147ca9d19bf9"). InnerVolumeSpecName "kube-api-access-tx969". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.622419 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/9edceed1-9561-4562-bdf8-1c2ff655a920-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"9edceed1-9561-4562-bdf8-1c2ff655a920\") " pod="openstack/kube-state-metrics-0" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.622490 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/9edceed1-9561-4562-bdf8-1c2ff655a920-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"9edceed1-9561-4562-bdf8-1c2ff655a920\") " pod="openstack/kube-state-metrics-0" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.622618 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edceed1-9561-4562-bdf8-1c2ff655a920-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"9edceed1-9561-4562-bdf8-1c2ff655a920\") " pod="openstack/kube-state-metrics-0" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.622654 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2hmt\" (UniqueName: \"kubernetes.io/projected/9edceed1-9561-4562-bdf8-1c2ff655a920-kube-api-access-f2hmt\") pod \"kube-state-metrics-0\" (UID: \"9edceed1-9561-4562-bdf8-1c2ff655a920\") " pod="openstack/kube-state-metrics-0" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.622713 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tx969\" (UniqueName: \"kubernetes.io/projected/ed23514f-ddc6-4359-ada9-147ca9d19bf9-kube-api-access-tx969\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:04 crc 
kubenswrapper[4886]: I0314 08:52:04.724441 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2hmt\" (UniqueName: \"kubernetes.io/projected/9edceed1-9561-4562-bdf8-1c2ff655a920-kube-api-access-f2hmt\") pod \"kube-state-metrics-0\" (UID: \"9edceed1-9561-4562-bdf8-1c2ff655a920\") " pod="openstack/kube-state-metrics-0" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.724531 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/9edceed1-9561-4562-bdf8-1c2ff655a920-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"9edceed1-9561-4562-bdf8-1c2ff655a920\") " pod="openstack/kube-state-metrics-0" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.724580 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/9edceed1-9561-4562-bdf8-1c2ff655a920-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"9edceed1-9561-4562-bdf8-1c2ff655a920\") " pod="openstack/kube-state-metrics-0" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.724674 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edceed1-9561-4562-bdf8-1c2ff655a920-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"9edceed1-9561-4562-bdf8-1c2ff655a920\") " pod="openstack/kube-state-metrics-0" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.730050 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/9edceed1-9561-4562-bdf8-1c2ff655a920-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"9edceed1-9561-4562-bdf8-1c2ff655a920\") " pod="openstack/kube-state-metrics-0" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.732869 
4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edceed1-9561-4562-bdf8-1c2ff655a920-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"9edceed1-9561-4562-bdf8-1c2ff655a920\") " pod="openstack/kube-state-metrics-0" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.739745 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/9edceed1-9561-4562-bdf8-1c2ff655a920-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"9edceed1-9561-4562-bdf8-1c2ff655a920\") " pod="openstack/kube-state-metrics-0" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.748375 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2hmt\" (UniqueName: \"kubernetes.io/projected/9edceed1-9561-4562-bdf8-1c2ff655a920-kube-api-access-f2hmt\") pod \"kube-state-metrics-0\" (UID: \"9edceed1-9561-4562-bdf8-1c2ff655a920\") " pod="openstack/kube-state-metrics-0" Mar 14 08:52:04 crc kubenswrapper[4886]: I0314 08:52:04.789182 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.135796 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.138405 4886 generic.go:334] "Generic (PLEG): container finished" podID="7aeedb7c-42c2-4739-a5df-01d4e7be5499" containerID="7318ad10fcc1a881aaf5fdbfb9b980e95f8e77430a38c82cf5b810ad614ee404" exitCode=0 Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.138498 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7aeedb7c-42c2-4739-a5df-01d4e7be5499","Type":"ContainerDied","Data":"7318ad10fcc1a881aaf5fdbfb9b980e95f8e77430a38c82cf5b810ad614ee404"} Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.138527 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7aeedb7c-42c2-4739-a5df-01d4e7be5499","Type":"ContainerDied","Data":"9d6ff2e2c3b1840ac5a31accc5ea2299bc042f4cb95fffdefcc5a201bc93a695"} Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.138546 4886 scope.go:117] "RemoveContainer" containerID="7318ad10fcc1a881aaf5fdbfb9b980e95f8e77430a38c82cf5b810ad614ee404" Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.180104 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557972-zfqj5" event={"ID":"ed23514f-ddc6-4359-ada9-147ca9d19bf9","Type":"ContainerDied","Data":"d3d14a0694b60d2a1ce96d10bb82ad7ec7f65818045145aab7c536006e81674b"} Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.180170 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3d14a0694b60d2a1ce96d10bb82ad7ec7f65818045145aab7c536006e81674b" Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.180275 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557972-zfqj5" Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.236303 4886 scope.go:117] "RemoveContainer" containerID="b2454516f33e192c03af33d0a299a6c7dcc4dab5f08ce61b17b8fda8b14661b7" Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.242335 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0cc17f47-abd0-4f59-a626-d40a5a83f9cb","Type":"ContainerStarted","Data":"32661803f2b7965cbfdae0e522cc7c8bdbc9353b1148aba3b6f1082eb3d21856"} Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.242381 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0cc17f47-abd0-4f59-a626-d40a5a83f9cb","Type":"ContainerStarted","Data":"342a598a112428f3a97b824cc1d76935741b30536283a75a907514acdb6175ce"} Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.251011 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aeedb7c-42c2-4739-a5df-01d4e7be5499-config-data\") pod \"7aeedb7c-42c2-4739-a5df-01d4e7be5499\" (UID: \"7aeedb7c-42c2-4739-a5df-01d4e7be5499\") " Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.251080 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sd5d\" (UniqueName: \"kubernetes.io/projected/7aeedb7c-42c2-4739-a5df-01d4e7be5499-kube-api-access-6sd5d\") pod \"7aeedb7c-42c2-4739-a5df-01d4e7be5499\" (UID: \"7aeedb7c-42c2-4739-a5df-01d4e7be5499\") " Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.251184 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7aeedb7c-42c2-4739-a5df-01d4e7be5499-logs\") pod \"7aeedb7c-42c2-4739-a5df-01d4e7be5499\" (UID: \"7aeedb7c-42c2-4739-a5df-01d4e7be5499\") " Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.251398 4886 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeedb7c-42c2-4739-a5df-01d4e7be5499-combined-ca-bundle\") pod \"7aeedb7c-42c2-4739-a5df-01d4e7be5499\" (UID: \"7aeedb7c-42c2-4739-a5df-01d4e7be5499\") " Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.259368 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aeedb7c-42c2-4739-a5df-01d4e7be5499-kube-api-access-6sd5d" (OuterVolumeSpecName: "kube-api-access-6sd5d") pod "7aeedb7c-42c2-4739-a5df-01d4e7be5499" (UID: "7aeedb7c-42c2-4739-a5df-01d4e7be5499"). InnerVolumeSpecName "kube-api-access-6sd5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.262816 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7aeedb7c-42c2-4739-a5df-01d4e7be5499-logs" (OuterVolumeSpecName: "logs") pod "7aeedb7c-42c2-4739-a5df-01d4e7be5499" (UID: "7aeedb7c-42c2-4739-a5df-01d4e7be5499"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.268688 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.268666243 podStartE2EDuration="2.268666243s" podCreationTimestamp="2026-03-14 08:52:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:52:05.264905967 +0000 UTC m=+1460.513357604" watchObservedRunningTime="2026-03-14 08:52:05.268666243 +0000 UTC m=+1460.517117880" Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.301588 4886 scope.go:117] "RemoveContainer" containerID="7318ad10fcc1a881aaf5fdbfb9b980e95f8e77430a38c82cf5b810ad614ee404" Mar 14 08:52:05 crc kubenswrapper[4886]: E0314 08:52:05.312224 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7318ad10fcc1a881aaf5fdbfb9b980e95f8e77430a38c82cf5b810ad614ee404\": container with ID starting with 7318ad10fcc1a881aaf5fdbfb9b980e95f8e77430a38c82cf5b810ad614ee404 not found: ID does not exist" containerID="7318ad10fcc1a881aaf5fdbfb9b980e95f8e77430a38c82cf5b810ad614ee404" Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.312277 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7318ad10fcc1a881aaf5fdbfb9b980e95f8e77430a38c82cf5b810ad614ee404"} err="failed to get container status \"7318ad10fcc1a881aaf5fdbfb9b980e95f8e77430a38c82cf5b810ad614ee404\": rpc error: code = NotFound desc = could not find container \"7318ad10fcc1a881aaf5fdbfb9b980e95f8e77430a38c82cf5b810ad614ee404\": container with ID starting with 7318ad10fcc1a881aaf5fdbfb9b980e95f8e77430a38c82cf5b810ad614ee404 not found: ID does not exist" Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.312312 4886 scope.go:117] "RemoveContainer" 
containerID="b2454516f33e192c03af33d0a299a6c7dcc4dab5f08ce61b17b8fda8b14661b7" Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.312338 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aeedb7c-42c2-4739-a5df-01d4e7be5499-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7aeedb7c-42c2-4739-a5df-01d4e7be5499" (UID: "7aeedb7c-42c2-4739-a5df-01d4e7be5499"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:05 crc kubenswrapper[4886]: E0314 08:52:05.312913 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2454516f33e192c03af33d0a299a6c7dcc4dab5f08ce61b17b8fda8b14661b7\": container with ID starting with b2454516f33e192c03af33d0a299a6c7dcc4dab5f08ce61b17b8fda8b14661b7 not found: ID does not exist" containerID="b2454516f33e192c03af33d0a299a6c7dcc4dab5f08ce61b17b8fda8b14661b7" Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.312956 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2454516f33e192c03af33d0a299a6c7dcc4dab5f08ce61b17b8fda8b14661b7"} err="failed to get container status \"b2454516f33e192c03af33d0a299a6c7dcc4dab5f08ce61b17b8fda8b14661b7\": rpc error: code = NotFound desc = could not find container \"b2454516f33e192c03af33d0a299a6c7dcc4dab5f08ce61b17b8fda8b14661b7\": container with ID starting with b2454516f33e192c03af33d0a299a6c7dcc4dab5f08ce61b17b8fda8b14661b7 not found: ID does not exist" Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.334358 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aeedb7c-42c2-4739-a5df-01d4e7be5499-config-data" (OuterVolumeSpecName: "config-data") pod "7aeedb7c-42c2-4739-a5df-01d4e7be5499" (UID: "7aeedb7c-42c2-4739-a5df-01d4e7be5499"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.355156 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeedb7c-42c2-4739-a5df-01d4e7be5499-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.355198 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sd5d\" (UniqueName: \"kubernetes.io/projected/7aeedb7c-42c2-4739-a5df-01d4e7be5499-kube-api-access-6sd5d\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.355212 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aeedb7c-42c2-4739-a5df-01d4e7be5499-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.355226 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7aeedb7c-42c2-4739-a5df-01d4e7be5499-logs\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.408548 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.431574 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dafb8d0-a7d6-4426-a52d-19408d20b8b3" path="/var/lib/kubelet/pods/9dafb8d0-a7d6-4426-a52d-19408d20b8b3/volumes" Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.501589 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557966-729xs"] Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.513390 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557966-729xs"] Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.784618 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.784881 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5" containerName="ceilometer-central-agent" containerID="cri-o://3d2a1e7fd98baf42771294384d583ba93c20bfa7b05f28d8a841ed79b28e2b07" gracePeriod=30 Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.785334 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5" containerName="proxy-httpd" containerID="cri-o://2cc4931725a76699f328e5c69c694f1727d6d304426ce9369c6a30215906afa0" gracePeriod=30 Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.785389 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5" containerName="sg-core" containerID="cri-o://d5e852e80afbc200a2389d239a2a0d08d702c59486bffcf499c75a9893843cfb" gracePeriod=30 Mar 14 08:52:05 crc kubenswrapper[4886]: I0314 08:52:05.785422 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5" containerName="ceilometer-notification-agent" containerID="cri-o://0053512c65c6e255221157e939643f497d45e8db0648f4dd464ffb16ace5bdde" gracePeriod=30 Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.251248 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.254204 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9edceed1-9561-4562-bdf8-1c2ff655a920","Type":"ContainerStarted","Data":"5d1314ef79a532d4caac06962c09e9970b1704008242ab66c36d0d23c359bc9c"} Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.254239 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9edceed1-9561-4562-bdf8-1c2ff655a920","Type":"ContainerStarted","Data":"21b825c209fcb760ddd232d9f01af78448b840074bcd25c23e9e76e712fb1355"} Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.254295 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.257281 4886 generic.go:334] "Generic (PLEG): container finished" podID="5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5" containerID="2cc4931725a76699f328e5c69c694f1727d6d304426ce9369c6a30215906afa0" exitCode=0 Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.257313 4886 generic.go:334] "Generic (PLEG): container finished" podID="5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5" containerID="d5e852e80afbc200a2389d239a2a0d08d702c59486bffcf499c75a9893843cfb" exitCode=2 Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.257322 4886 generic.go:334] "Generic (PLEG): container finished" podID="5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5" containerID="3d2a1e7fd98baf42771294384d583ba93c20bfa7b05f28d8a841ed79b28e2b07" exitCode=0 Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.257340 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5","Type":"ContainerDied","Data":"2cc4931725a76699f328e5c69c694f1727d6d304426ce9369c6a30215906afa0"} Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.257363 4886 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5","Type":"ContainerDied","Data":"d5e852e80afbc200a2389d239a2a0d08d702c59486bffcf499c75a9893843cfb"} Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.257373 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5","Type":"ContainerDied","Data":"3d2a1e7fd98baf42771294384d583ba93c20bfa7b05f28d8a841ed79b28e2b07"} Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.280506 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.292487 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.302201 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.8750117670000002 podStartE2EDuration="2.302182915s" podCreationTimestamp="2026-03-14 08:52:04 +0000 UTC" firstStartedPulling="2026-03-14 08:52:05.413749822 +0000 UTC m=+1460.662201459" lastFinishedPulling="2026-03-14 08:52:05.84092097 +0000 UTC m=+1461.089372607" observedRunningTime="2026-03-14 08:52:06.29708696 +0000 UTC m=+1461.545538597" watchObservedRunningTime="2026-03-14 08:52:06.302182915 +0000 UTC m=+1461.550634552" Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.333170 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 14 08:52:06 crc kubenswrapper[4886]: E0314 08:52:06.333577 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aeedb7c-42c2-4739-a5df-01d4e7be5499" containerName="nova-api-log" Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.333597 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aeedb7c-42c2-4739-a5df-01d4e7be5499" containerName="nova-api-log" Mar 14 08:52:06 crc kubenswrapper[4886]: E0314 
08:52:06.333630 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aeedb7c-42c2-4739-a5df-01d4e7be5499" containerName="nova-api-api" Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.333637 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aeedb7c-42c2-4739-a5df-01d4e7be5499" containerName="nova-api-api" Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.333855 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aeedb7c-42c2-4739-a5df-01d4e7be5499" containerName="nova-api-log" Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.333884 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aeedb7c-42c2-4739-a5df-01d4e7be5499" containerName="nova-api-api" Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.334953 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.341221 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.373527 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.477033 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/122fa5d6-e40f-49c2-befb-9ae5e8de5671-config-data\") pod \"nova-api-0\" (UID: \"122fa5d6-e40f-49c2-befb-9ae5e8de5671\") " pod="openstack/nova-api-0" Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.477179 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnt5h\" (UniqueName: \"kubernetes.io/projected/122fa5d6-e40f-49c2-befb-9ae5e8de5671-kube-api-access-jnt5h\") pod \"nova-api-0\" (UID: \"122fa5d6-e40f-49c2-befb-9ae5e8de5671\") " pod="openstack/nova-api-0" Mar 14 08:52:06 crc 
kubenswrapper[4886]: I0314 08:52:06.477272 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/122fa5d6-e40f-49c2-befb-9ae5e8de5671-logs\") pod \"nova-api-0\" (UID: \"122fa5d6-e40f-49c2-befb-9ae5e8de5671\") " pod="openstack/nova-api-0" Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.477461 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122fa5d6-e40f-49c2-befb-9ae5e8de5671-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"122fa5d6-e40f-49c2-befb-9ae5e8de5671\") " pod="openstack/nova-api-0" Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.579544 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122fa5d6-e40f-49c2-befb-9ae5e8de5671-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"122fa5d6-e40f-49c2-befb-9ae5e8de5671\") " pod="openstack/nova-api-0" Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.579711 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/122fa5d6-e40f-49c2-befb-9ae5e8de5671-config-data\") pod \"nova-api-0\" (UID: \"122fa5d6-e40f-49c2-befb-9ae5e8de5671\") " pod="openstack/nova-api-0" Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.579796 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnt5h\" (UniqueName: \"kubernetes.io/projected/122fa5d6-e40f-49c2-befb-9ae5e8de5671-kube-api-access-jnt5h\") pod \"nova-api-0\" (UID: \"122fa5d6-e40f-49c2-befb-9ae5e8de5671\") " pod="openstack/nova-api-0" Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.579975 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/122fa5d6-e40f-49c2-befb-9ae5e8de5671-logs\") pod \"nova-api-0\" (UID: \"122fa5d6-e40f-49c2-befb-9ae5e8de5671\") " pod="openstack/nova-api-0" Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.580873 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/122fa5d6-e40f-49c2-befb-9ae5e8de5671-logs\") pod \"nova-api-0\" (UID: \"122fa5d6-e40f-49c2-befb-9ae5e8de5671\") " pod="openstack/nova-api-0" Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.585392 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122fa5d6-e40f-49c2-befb-9ae5e8de5671-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"122fa5d6-e40f-49c2-befb-9ae5e8de5671\") " pod="openstack/nova-api-0" Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.586539 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/122fa5d6-e40f-49c2-befb-9ae5e8de5671-config-data\") pod \"nova-api-0\" (UID: \"122fa5d6-e40f-49c2-befb-9ae5e8de5671\") " pod="openstack/nova-api-0" Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.600247 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnt5h\" (UniqueName: \"kubernetes.io/projected/122fa5d6-e40f-49c2-befb-9ae5e8de5671-kube-api-access-jnt5h\") pod \"nova-api-0\" (UID: \"122fa5d6-e40f-49c2-befb-9ae5e8de5671\") " pod="openstack/nova-api-0" Mar 14 08:52:06 crc kubenswrapper[4886]: I0314 08:52:06.694419 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.092862 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.190316 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7598\" (UniqueName: \"kubernetes.io/projected/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-kube-api-access-c7598\") pod \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\" (UID: \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\") " Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.190457 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-log-httpd\") pod \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\" (UID: \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\") " Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.190505 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-combined-ca-bundle\") pod \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\" (UID: \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\") " Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.190557 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-config-data\") pod \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\" (UID: \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\") " Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.190749 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-scripts\") pod \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\" (UID: \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\") " Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.190787 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-sg-core-conf-yaml\") pod \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\" (UID: \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\") " Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.190851 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-run-httpd\") pod \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\" (UID: \"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5\") " Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.192251 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5" (UID: "5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.193406 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5" (UID: "5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.197289 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-kube-api-access-c7598" (OuterVolumeSpecName: "kube-api-access-c7598") pod "5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5" (UID: "5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5"). InnerVolumeSpecName "kube-api-access-c7598". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.199252 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-scripts" (OuterVolumeSpecName: "scripts") pod "5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5" (UID: "5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.226392 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5" (UID: "5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.255389 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.272863 4886 generic.go:334] "Generic (PLEG): container finished" podID="5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5" containerID="0053512c65c6e255221157e939643f497d45e8db0648f4dd464ffb16ace5bdde" exitCode=0 Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.273791 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5","Type":"ContainerDied","Data":"0053512c65c6e255221157e939643f497d45e8db0648f4dd464ffb16ace5bdde"} Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.273823 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5","Type":"ContainerDied","Data":"143b81d7d35eaa424f6b2dee6761e9723e96daf551138bccb0cb0ef82399b9f3"} Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.273839 4886 scope.go:117] 
"RemoveContainer" containerID="2cc4931725a76699f328e5c69c694f1727d6d304426ce9369c6a30215906afa0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.272984 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.293850 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.293881 4886 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.293890 4886 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.293928 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7598\" (UniqueName: \"kubernetes.io/projected/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-kube-api-access-c7598\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.293940 4886 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.299520 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5" (UID: "5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.311604 4886 scope.go:117] "RemoveContainer" containerID="d5e852e80afbc200a2389d239a2a0d08d702c59486bffcf499c75a9893843cfb" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.332306 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-config-data" (OuterVolumeSpecName: "config-data") pod "5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5" (UID: "5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.343646 4886 scope.go:117] "RemoveContainer" containerID="0053512c65c6e255221157e939643f497d45e8db0648f4dd464ffb16ace5bdde" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.362973 4886 scope.go:117] "RemoveContainer" containerID="3d2a1e7fd98baf42771294384d583ba93c20bfa7b05f28d8a841ed79b28e2b07" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.393951 4886 scope.go:117] "RemoveContainer" containerID="2cc4931725a76699f328e5c69c694f1727d6d304426ce9369c6a30215906afa0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.396262 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.396316 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:07 crc kubenswrapper[4886]: E0314 08:52:07.396796 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cc4931725a76699f328e5c69c694f1727d6d304426ce9369c6a30215906afa0\": container 
with ID starting with 2cc4931725a76699f328e5c69c694f1727d6d304426ce9369c6a30215906afa0 not found: ID does not exist" containerID="2cc4931725a76699f328e5c69c694f1727d6d304426ce9369c6a30215906afa0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.396836 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cc4931725a76699f328e5c69c694f1727d6d304426ce9369c6a30215906afa0"} err="failed to get container status \"2cc4931725a76699f328e5c69c694f1727d6d304426ce9369c6a30215906afa0\": rpc error: code = NotFound desc = could not find container \"2cc4931725a76699f328e5c69c694f1727d6d304426ce9369c6a30215906afa0\": container with ID starting with 2cc4931725a76699f328e5c69c694f1727d6d304426ce9369c6a30215906afa0 not found: ID does not exist" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.396865 4886 scope.go:117] "RemoveContainer" containerID="d5e852e80afbc200a2389d239a2a0d08d702c59486bffcf499c75a9893843cfb" Mar 14 08:52:07 crc kubenswrapper[4886]: E0314 08:52:07.397531 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5e852e80afbc200a2389d239a2a0d08d702c59486bffcf499c75a9893843cfb\": container with ID starting with d5e852e80afbc200a2389d239a2a0d08d702c59486bffcf499c75a9893843cfb not found: ID does not exist" containerID="d5e852e80afbc200a2389d239a2a0d08d702c59486bffcf499c75a9893843cfb" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.397554 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5e852e80afbc200a2389d239a2a0d08d702c59486bffcf499c75a9893843cfb"} err="failed to get container status \"d5e852e80afbc200a2389d239a2a0d08d702c59486bffcf499c75a9893843cfb\": rpc error: code = NotFound desc = could not find container \"d5e852e80afbc200a2389d239a2a0d08d702c59486bffcf499c75a9893843cfb\": container with ID starting with d5e852e80afbc200a2389d239a2a0d08d702c59486bffcf499c75a9893843cfb not 
found: ID does not exist" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.397570 4886 scope.go:117] "RemoveContainer" containerID="0053512c65c6e255221157e939643f497d45e8db0648f4dd464ffb16ace5bdde" Mar 14 08:52:07 crc kubenswrapper[4886]: E0314 08:52:07.397913 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0053512c65c6e255221157e939643f497d45e8db0648f4dd464ffb16ace5bdde\": container with ID starting with 0053512c65c6e255221157e939643f497d45e8db0648f4dd464ffb16ace5bdde not found: ID does not exist" containerID="0053512c65c6e255221157e939643f497d45e8db0648f4dd464ffb16ace5bdde" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.397939 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0053512c65c6e255221157e939643f497d45e8db0648f4dd464ffb16ace5bdde"} err="failed to get container status \"0053512c65c6e255221157e939643f497d45e8db0648f4dd464ffb16ace5bdde\": rpc error: code = NotFound desc = could not find container \"0053512c65c6e255221157e939643f497d45e8db0648f4dd464ffb16ace5bdde\": container with ID starting with 0053512c65c6e255221157e939643f497d45e8db0648f4dd464ffb16ace5bdde not found: ID does not exist" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.397954 4886 scope.go:117] "RemoveContainer" containerID="3d2a1e7fd98baf42771294384d583ba93c20bfa7b05f28d8a841ed79b28e2b07" Mar 14 08:52:07 crc kubenswrapper[4886]: E0314 08:52:07.398401 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d2a1e7fd98baf42771294384d583ba93c20bfa7b05f28d8a841ed79b28e2b07\": container with ID starting with 3d2a1e7fd98baf42771294384d583ba93c20bfa7b05f28d8a841ed79b28e2b07 not found: ID does not exist" containerID="3d2a1e7fd98baf42771294384d583ba93c20bfa7b05f28d8a841ed79b28e2b07" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.398422 4886 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d2a1e7fd98baf42771294384d583ba93c20bfa7b05f28d8a841ed79b28e2b07"} err="failed to get container status \"3d2a1e7fd98baf42771294384d583ba93c20bfa7b05f28d8a841ed79b28e2b07\": rpc error: code = NotFound desc = could not find container \"3d2a1e7fd98baf42771294384d583ba93c20bfa7b05f28d8a841ed79b28e2b07\": container with ID starting with 3d2a1e7fd98baf42771294384d583ba93c20bfa7b05f28d8a841ed79b28e2b07 not found: ID does not exist" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.432941 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12215c1a-53af-4b8b-9b8c-1333f74942ce" path="/var/lib/kubelet/pods/12215c1a-53af-4b8b-9b8c-1333f74942ce/volumes" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.433756 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7aeedb7c-42c2-4739-a5df-01d4e7be5499" path="/var/lib/kubelet/pods/7aeedb7c-42c2-4739-a5df-01d4e7be5499/volumes" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.602171 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.618558 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.628468 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:52:07 crc kubenswrapper[4886]: E0314 08:52:07.628922 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5" containerName="ceilometer-notification-agent" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.628939 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5" containerName="ceilometer-notification-agent" Mar 14 08:52:07 crc kubenswrapper[4886]: E0314 08:52:07.628970 4886 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5" containerName="proxy-httpd" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.628977 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5" containerName="proxy-httpd" Mar 14 08:52:07 crc kubenswrapper[4886]: E0314 08:52:07.628988 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5" containerName="ceilometer-central-agent" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.628994 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5" containerName="ceilometer-central-agent" Mar 14 08:52:07 crc kubenswrapper[4886]: E0314 08:52:07.629009 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5" containerName="sg-core" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.629016 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5" containerName="sg-core" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.629210 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5" containerName="sg-core" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.629230 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5" containerName="ceilometer-central-agent" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.629243 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5" containerName="proxy-httpd" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.629252 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5" containerName="ceilometer-notification-agent" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.630855 4886 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.634509 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.634661 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.634816 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.638532 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.803357 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d46d79e4-6108-4b59-8399-8062975ef0f3-run-httpd\") pod \"ceilometer-0\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " pod="openstack/ceilometer-0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.803409 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-scripts\") pod \"ceilometer-0\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " pod="openstack/ceilometer-0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.803457 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " pod="openstack/ceilometer-0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.803479 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " pod="openstack/ceilometer-0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.803569 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d46d79e4-6108-4b59-8399-8062975ef0f3-log-httpd\") pod \"ceilometer-0\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " pod="openstack/ceilometer-0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.803614 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-config-data\") pod \"ceilometer-0\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " pod="openstack/ceilometer-0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.803646 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " pod="openstack/ceilometer-0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.803682 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghphc\" (UniqueName: \"kubernetes.io/projected/d46d79e4-6108-4b59-8399-8062975ef0f3-kube-api-access-ghphc\") pod \"ceilometer-0\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " pod="openstack/ceilometer-0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.905659 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d46d79e4-6108-4b59-8399-8062975ef0f3-run-httpd\") pod \"ceilometer-0\" (UID: 
\"d46d79e4-6108-4b59-8399-8062975ef0f3\") " pod="openstack/ceilometer-0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.905719 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-scripts\") pod \"ceilometer-0\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " pod="openstack/ceilometer-0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.905766 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " pod="openstack/ceilometer-0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.905790 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " pod="openstack/ceilometer-0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.905870 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d46d79e4-6108-4b59-8399-8062975ef0f3-log-httpd\") pod \"ceilometer-0\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " pod="openstack/ceilometer-0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.905914 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-config-data\") pod \"ceilometer-0\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " pod="openstack/ceilometer-0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.905940 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " pod="openstack/ceilometer-0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.905973 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghphc\" (UniqueName: \"kubernetes.io/projected/d46d79e4-6108-4b59-8399-8062975ef0f3-kube-api-access-ghphc\") pod \"ceilometer-0\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " pod="openstack/ceilometer-0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.906813 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d46d79e4-6108-4b59-8399-8062975ef0f3-run-httpd\") pod \"ceilometer-0\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " pod="openstack/ceilometer-0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.907638 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d46d79e4-6108-4b59-8399-8062975ef0f3-log-httpd\") pod \"ceilometer-0\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " pod="openstack/ceilometer-0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.910585 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-scripts\") pod \"ceilometer-0\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " pod="openstack/ceilometer-0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.910864 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-config-data\") pod \"ceilometer-0\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " pod="openstack/ceilometer-0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.912905 
4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " pod="openstack/ceilometer-0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.913539 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " pod="openstack/ceilometer-0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.921575 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " pod="openstack/ceilometer-0" Mar 14 08:52:07 crc kubenswrapper[4886]: I0314 08:52:07.938305 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghphc\" (UniqueName: \"kubernetes.io/projected/d46d79e4-6108-4b59-8399-8062975ef0f3-kube-api-access-ghphc\") pod \"ceilometer-0\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " pod="openstack/ceilometer-0" Mar 14 08:52:08 crc kubenswrapper[4886]: I0314 08:52:08.022504 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 08:52:08 crc kubenswrapper[4886]: I0314 08:52:08.300041 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"122fa5d6-e40f-49c2-befb-9ae5e8de5671","Type":"ContainerStarted","Data":"fdc87182a5db08a66f518ef9e32b57647666d05999f18275e1ffd755330d7cdb"} Mar 14 08:52:08 crc kubenswrapper[4886]: I0314 08:52:08.300389 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"122fa5d6-e40f-49c2-befb-9ae5e8de5671","Type":"ContainerStarted","Data":"4024d85ca563973bb2be8952317f186f64ff63707adede34ef4fd9af5f178165"} Mar 14 08:52:08 crc kubenswrapper[4886]: I0314 08:52:08.300404 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"122fa5d6-e40f-49c2-befb-9ae5e8de5671","Type":"ContainerStarted","Data":"49a6117bcc9f3a9c3221fde8dc677fb52713de41037d086915715b02f47d736e"} Mar 14 08:52:08 crc kubenswrapper[4886]: I0314 08:52:08.321585 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.321567466 podStartE2EDuration="2.321567466s" podCreationTimestamp="2026-03-14 08:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:52:08.321321579 +0000 UTC m=+1463.569773236" watchObservedRunningTime="2026-03-14 08:52:08.321567466 +0000 UTC m=+1463.570019113" Mar 14 08:52:08 crc kubenswrapper[4886]: I0314 08:52:08.477133 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 14 08:52:08 crc kubenswrapper[4886]: W0314 08:52:08.484441 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd46d79e4_6108_4b59_8399_8062975ef0f3.slice/crio-30bae7c939efcaabea6164b0e5b13e341b440f133a0e21d1ffa226ebf1ad0720 WatchSource:0}: Error 
finding container 30bae7c939efcaabea6164b0e5b13e341b440f133a0e21d1ffa226ebf1ad0720: Status 404 returned error can't find the container with id 30bae7c939efcaabea6164b0e5b13e341b440f133a0e21d1ffa226ebf1ad0720 Mar 14 08:52:08 crc kubenswrapper[4886]: I0314 08:52:08.493100 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:52:09 crc kubenswrapper[4886]: I0314 08:52:09.313512 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d46d79e4-6108-4b59-8399-8062975ef0f3","Type":"ContainerStarted","Data":"d85ab20b27a45ceb89fd9b99734e64271c735796dc986570d9e3b9f1220148b6"} Mar 14 08:52:09 crc kubenswrapper[4886]: I0314 08:52:09.313573 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d46d79e4-6108-4b59-8399-8062975ef0f3","Type":"ContainerStarted","Data":"30bae7c939efcaabea6164b0e5b13e341b440f133a0e21d1ffa226ebf1ad0720"} Mar 14 08:52:09 crc kubenswrapper[4886]: I0314 08:52:09.432078 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5" path="/var/lib/kubelet/pods/5bddb3e5-33a1-4ebb-8f67-5e12a9b902d5/volumes" Mar 14 08:52:10 crc kubenswrapper[4886]: I0314 08:52:10.341812 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d46d79e4-6108-4b59-8399-8062975ef0f3","Type":"ContainerStarted","Data":"67ce3fac6461b122ff29ecee87a5aed813e0a3730ef4c0f066325806982c6562"} Mar 14 08:52:11 crc kubenswrapper[4886]: I0314 08:52:11.351187 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d46d79e4-6108-4b59-8399-8062975ef0f3","Type":"ContainerStarted","Data":"ea665f665b7f8ae18642b7623c380d3edbf9419bfe94515517643c1cbcb74e7c"} Mar 14 08:52:11 crc kubenswrapper[4886]: I0314 08:52:11.439803 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 14 08:52:11 crc 
kubenswrapper[4886]: I0314 08:52:11.439843 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 14 08:52:11 crc kubenswrapper[4886]: I0314 08:52:11.487401 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 14 08:52:12 crc kubenswrapper[4886]: I0314 08:52:12.362996 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d46d79e4-6108-4b59-8399-8062975ef0f3","Type":"ContainerStarted","Data":"795e180c664833711c234291612d01f00620fcdcc6ec43700f98ef320bda4196"} Mar 14 08:52:12 crc kubenswrapper[4886]: I0314 08:52:12.365258 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 14 08:52:12 crc kubenswrapper[4886]: I0314 08:52:12.445237 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="43a601c8-c71a-41d8-b877-75ef0e2ac892" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.224:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 08:52:12 crc kubenswrapper[4886]: I0314 08:52:12.445309 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="43a601c8-c71a-41d8-b877-75ef0e2ac892" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.224:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 08:52:13 crc kubenswrapper[4886]: I0314 08:52:13.478380 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 14 08:52:13 crc kubenswrapper[4886]: I0314 08:52:13.597747 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 14 08:52:13 crc kubenswrapper[4886]: I0314 08:52:13.630523 4886 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.201436485 podStartE2EDuration="6.630502127s" podCreationTimestamp="2026-03-14 08:52:07 +0000 UTC" firstStartedPulling="2026-03-14 08:52:08.488886156 +0000 UTC m=+1463.737337793" lastFinishedPulling="2026-03-14 08:52:11.917951798 +0000 UTC m=+1467.166403435" observedRunningTime="2026-03-14 08:52:12.394200359 +0000 UTC m=+1467.642651996" watchObservedRunningTime="2026-03-14 08:52:13.630502127 +0000 UTC m=+1468.878953764" Mar 14 08:52:14 crc kubenswrapper[4886]: I0314 08:52:14.411616 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 14 08:52:14 crc kubenswrapper[4886]: I0314 08:52:14.803445 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 14 08:52:16 crc kubenswrapper[4886]: I0314 08:52:16.696500 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 08:52:16 crc kubenswrapper[4886]: I0314 08:52:16.696755 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 08:52:17 crc kubenswrapper[4886]: I0314 08:52:17.780389 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="122fa5d6-e40f-49c2-befb-9ae5e8de5671" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.228:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 08:52:17 crc kubenswrapper[4886]: I0314 08:52:17.780421 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="122fa5d6-e40f-49c2-befb-9ae5e8de5671" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.228:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 08:52:19 crc kubenswrapper[4886]: I0314 08:52:19.433146 4886 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 14 08:52:19 crc kubenswrapper[4886]: I0314 08:52:19.433992 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 14 08:52:21 crc kubenswrapper[4886]: I0314 08:52:21.438831 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 14 08:52:21 crc kubenswrapper[4886]: I0314 08:52:21.440640 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 14 08:52:21 crc kubenswrapper[4886]: I0314 08:52:21.446103 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 14 08:52:22 crc kubenswrapper[4886]: I0314 08:52:22.463020 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.237935 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.331975 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtj5m\" (UniqueName: \"kubernetes.io/projected/b76b6262-2fc6-46a4-abeb-5a380338b6f6-kube-api-access-vtj5m\") pod \"b76b6262-2fc6-46a4-abeb-5a380338b6f6\" (UID: \"b76b6262-2fc6-46a4-abeb-5a380338b6f6\") " Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.332155 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b76b6262-2fc6-46a4-abeb-5a380338b6f6-combined-ca-bundle\") pod \"b76b6262-2fc6-46a4-abeb-5a380338b6f6\" (UID: \"b76b6262-2fc6-46a4-abeb-5a380338b6f6\") " Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.332228 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b76b6262-2fc6-46a4-abeb-5a380338b6f6-config-data\") pod \"b76b6262-2fc6-46a4-abeb-5a380338b6f6\" (UID: \"b76b6262-2fc6-46a4-abeb-5a380338b6f6\") " Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.427611 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b76b6262-2fc6-46a4-abeb-5a380338b6f6-kube-api-access-vtj5m" (OuterVolumeSpecName: "kube-api-access-vtj5m") pod "b76b6262-2fc6-46a4-abeb-5a380338b6f6" (UID: "b76b6262-2fc6-46a4-abeb-5a380338b6f6"). InnerVolumeSpecName "kube-api-access-vtj5m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.434301 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtj5m\" (UniqueName: \"kubernetes.io/projected/b76b6262-2fc6-46a4-abeb-5a380338b6f6-kube-api-access-vtj5m\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.462201 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b76b6262-2fc6-46a4-abeb-5a380338b6f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b76b6262-2fc6-46a4-abeb-5a380338b6f6" (UID: "b76b6262-2fc6-46a4-abeb-5a380338b6f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.462787 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b76b6262-2fc6-46a4-abeb-5a380338b6f6-config-data" (OuterVolumeSpecName: "config-data") pod "b76b6262-2fc6-46a4-abeb-5a380338b6f6" (UID: "b76b6262-2fc6-46a4-abeb-5a380338b6f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.477490 4886 generic.go:334] "Generic (PLEG): container finished" podID="b76b6262-2fc6-46a4-abeb-5a380338b6f6" containerID="e3f1cf806b929180cf6b4ac7f4a9c15e95623422c3200212ceb931817cca244f" exitCode=137 Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.478389 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.488235 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b76b6262-2fc6-46a4-abeb-5a380338b6f6","Type":"ContainerDied","Data":"e3f1cf806b929180cf6b4ac7f4a9c15e95623422c3200212ceb931817cca244f"} Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.488338 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b76b6262-2fc6-46a4-abeb-5a380338b6f6","Type":"ContainerDied","Data":"37c6c60ac1376cab6802c9f3c611e2987a7b984747018aa9b3bc114540d33e94"} Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.488363 4886 scope.go:117] "RemoveContainer" containerID="e3f1cf806b929180cf6b4ac7f4a9c15e95623422c3200212ceb931817cca244f" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.530745 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.535753 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b76b6262-2fc6-46a4-abeb-5a380338b6f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.536035 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b76b6262-2fc6-46a4-abeb-5a380338b6f6-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.544427 4886 scope.go:117] "RemoveContainer" containerID="e3f1cf806b929180cf6b4ac7f4a9c15e95623422c3200212ceb931817cca244f" Mar 14 08:52:24 crc kubenswrapper[4886]: E0314 08:52:24.544999 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3f1cf806b929180cf6b4ac7f4a9c15e95623422c3200212ceb931817cca244f\": container 
with ID starting with e3f1cf806b929180cf6b4ac7f4a9c15e95623422c3200212ceb931817cca244f not found: ID does not exist" containerID="e3f1cf806b929180cf6b4ac7f4a9c15e95623422c3200212ceb931817cca244f" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.545056 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3f1cf806b929180cf6b4ac7f4a9c15e95623422c3200212ceb931817cca244f"} err="failed to get container status \"e3f1cf806b929180cf6b4ac7f4a9c15e95623422c3200212ceb931817cca244f\": rpc error: code = NotFound desc = could not find container \"e3f1cf806b929180cf6b4ac7f4a9c15e95623422c3200212ceb931817cca244f\": container with ID starting with e3f1cf806b929180cf6b4ac7f4a9c15e95623422c3200212ceb931817cca244f not found: ID does not exist" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.563237 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.572053 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 08:52:24 crc kubenswrapper[4886]: E0314 08:52:24.572593 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76b6262-2fc6-46a4-abeb-5a380338b6f6" containerName="nova-cell1-novncproxy-novncproxy" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.572614 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76b6262-2fc6-46a4-abeb-5a380338b6f6" containerName="nova-cell1-novncproxy-novncproxy" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.572799 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="b76b6262-2fc6-46a4-abeb-5a380338b6f6" containerName="nova-cell1-novncproxy-novncproxy" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.573596 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.576274 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.576305 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.576340 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.581257 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.696089 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.696179 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.739811 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3ede667-4a5f-4d40-9ad4-ab5e4678bf78-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e3ede667-4a5f-4d40-9ad4-ab5e4678bf78\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.739881 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr8kd\" (UniqueName: \"kubernetes.io/projected/e3ede667-4a5f-4d40-9ad4-ab5e4678bf78-kube-api-access-kr8kd\") pod \"nova-cell1-novncproxy-0\" (UID: \"e3ede667-4a5f-4d40-9ad4-ab5e4678bf78\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.740110 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3ede667-4a5f-4d40-9ad4-ab5e4678bf78-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e3ede667-4a5f-4d40-9ad4-ab5e4678bf78\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.740392 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3ede667-4a5f-4d40-9ad4-ab5e4678bf78-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e3ede667-4a5f-4d40-9ad4-ab5e4678bf78\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.740459 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ede667-4a5f-4d40-9ad4-ab5e4678bf78-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e3ede667-4a5f-4d40-9ad4-ab5e4678bf78\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.842390 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3ede667-4a5f-4d40-9ad4-ab5e4678bf78-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e3ede667-4a5f-4d40-9ad4-ab5e4678bf78\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.842454 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ede667-4a5f-4d40-9ad4-ab5e4678bf78-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e3ede667-4a5f-4d40-9ad4-ab5e4678bf78\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.842561 4886 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3ede667-4a5f-4d40-9ad4-ab5e4678bf78-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e3ede667-4a5f-4d40-9ad4-ab5e4678bf78\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.842623 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr8kd\" (UniqueName: \"kubernetes.io/projected/e3ede667-4a5f-4d40-9ad4-ab5e4678bf78-kube-api-access-kr8kd\") pod \"nova-cell1-novncproxy-0\" (UID: \"e3ede667-4a5f-4d40-9ad4-ab5e4678bf78\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.842687 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3ede667-4a5f-4d40-9ad4-ab5e4678bf78-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e3ede667-4a5f-4d40-9ad4-ab5e4678bf78\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.846379 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3ede667-4a5f-4d40-9ad4-ab5e4678bf78-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e3ede667-4a5f-4d40-9ad4-ab5e4678bf78\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.846365 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ede667-4a5f-4d40-9ad4-ab5e4678bf78-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e3ede667-4a5f-4d40-9ad4-ab5e4678bf78\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.846817 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e3ede667-4a5f-4d40-9ad4-ab5e4678bf78-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e3ede667-4a5f-4d40-9ad4-ab5e4678bf78\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.846966 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3ede667-4a5f-4d40-9ad4-ab5e4678bf78-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e3ede667-4a5f-4d40-9ad4-ab5e4678bf78\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.861155 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr8kd\" (UniqueName: \"kubernetes.io/projected/e3ede667-4a5f-4d40-9ad4-ab5e4678bf78-kube-api-access-kr8kd\") pod \"nova-cell1-novncproxy-0\" (UID: \"e3ede667-4a5f-4d40-9ad4-ab5e4678bf78\") " pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:52:24 crc kubenswrapper[4886]: I0314 08:52:24.897864 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:52:25 crc kubenswrapper[4886]: I0314 08:52:25.363939 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 14 08:52:25 crc kubenswrapper[4886]: W0314 08:52:25.369057 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3ede667_4a5f_4d40_9ad4_ab5e4678bf78.slice/crio-ddd8789a515e8850d1a5066250d9bbd3c1c6c6d2addc4f68bd7311d66a93667f WatchSource:0}: Error finding container ddd8789a515e8850d1a5066250d9bbd3c1c6c6d2addc4f68bd7311d66a93667f: Status 404 returned error can't find the container with id ddd8789a515e8850d1a5066250d9bbd3c1c6c6d2addc4f68bd7311d66a93667f Mar 14 08:52:25 crc kubenswrapper[4886]: I0314 08:52:25.449906 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b76b6262-2fc6-46a4-abeb-5a380338b6f6" path="/var/lib/kubelet/pods/b76b6262-2fc6-46a4-abeb-5a380338b6f6/volumes" Mar 14 08:52:25 crc kubenswrapper[4886]: I0314 08:52:25.492578 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e3ede667-4a5f-4d40-9ad4-ab5e4678bf78","Type":"ContainerStarted","Data":"ddd8789a515e8850d1a5066250d9bbd3c1c6c6d2addc4f68bd7311d66a93667f"} Mar 14 08:52:26 crc kubenswrapper[4886]: I0314 08:52:26.511263 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e3ede667-4a5f-4d40-9ad4-ab5e4678bf78","Type":"ContainerStarted","Data":"bfa4052e17cfb47e8a21728228d76a8f2fede4c917c2cb07d6158c17075abf3b"} Mar 14 08:52:26 crc kubenswrapper[4886]: I0314 08:52:26.539581 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.539563398 podStartE2EDuration="2.539563398s" podCreationTimestamp="2026-03-14 08:52:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:52:26.530613934 +0000 UTC m=+1481.779065571" watchObservedRunningTime="2026-03-14 08:52:26.539563398 +0000 UTC m=+1481.788015035" Mar 14 08:52:26 crc kubenswrapper[4886]: I0314 08:52:26.699175 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 14 08:52:26 crc kubenswrapper[4886]: I0314 08:52:26.699649 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 14 08:52:26 crc kubenswrapper[4886]: I0314 08:52:26.703651 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 14 08:52:26 crc kubenswrapper[4886]: I0314 08:52:26.713680 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 14 08:52:26 crc kubenswrapper[4886]: I0314 08:52:26.954009 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-55dtq"] Mar 14 08:52:26 crc kubenswrapper[4886]: I0314 08:52:26.958228 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" Mar 14 08:52:26 crc kubenswrapper[4886]: I0314 08:52:26.992167 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-55dtq"] Mar 14 08:52:27 crc kubenswrapper[4886]: I0314 08:52:27.146819 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-55dtq\" (UID: \"ec356daa-f053-45d9-8297-1df7fa8621a0\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" Mar 14 08:52:27 crc kubenswrapper[4886]: I0314 08:52:27.146890 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-55dtq\" (UID: \"ec356daa-f053-45d9-8297-1df7fa8621a0\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" Mar 14 08:52:27 crc kubenswrapper[4886]: I0314 08:52:27.146954 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lh79\" (UniqueName: \"kubernetes.io/projected/ec356daa-f053-45d9-8297-1df7fa8621a0-kube-api-access-8lh79\") pod \"dnsmasq-dns-5c7b6c5df9-55dtq\" (UID: \"ec356daa-f053-45d9-8297-1df7fa8621a0\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" Mar 14 08:52:27 crc kubenswrapper[4886]: I0314 08:52:27.147261 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-55dtq\" (UID: \"ec356daa-f053-45d9-8297-1df7fa8621a0\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" Mar 14 08:52:27 crc kubenswrapper[4886]: I0314 08:52:27.147357 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-55dtq\" (UID: \"ec356daa-f053-45d9-8297-1df7fa8621a0\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" Mar 14 08:52:27 crc kubenswrapper[4886]: I0314 08:52:27.147385 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-config\") pod \"dnsmasq-dns-5c7b6c5df9-55dtq\" (UID: \"ec356daa-f053-45d9-8297-1df7fa8621a0\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" Mar 14 08:52:27 crc kubenswrapper[4886]: I0314 08:52:27.250040 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-55dtq\" (UID: \"ec356daa-f053-45d9-8297-1df7fa8621a0\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" Mar 14 08:52:27 crc kubenswrapper[4886]: I0314 08:52:27.250148 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lh79\" (UniqueName: \"kubernetes.io/projected/ec356daa-f053-45d9-8297-1df7fa8621a0-kube-api-access-8lh79\") pod \"dnsmasq-dns-5c7b6c5df9-55dtq\" (UID: \"ec356daa-f053-45d9-8297-1df7fa8621a0\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" Mar 14 08:52:27 crc kubenswrapper[4886]: I0314 08:52:27.250246 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-55dtq\" (UID: \"ec356daa-f053-45d9-8297-1df7fa8621a0\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" Mar 14 08:52:27 crc kubenswrapper[4886]: I0314 08:52:27.250299 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-55dtq\" (UID: \"ec356daa-f053-45d9-8297-1df7fa8621a0\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" Mar 14 08:52:27 crc kubenswrapper[4886]: I0314 08:52:27.250327 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-config\") pod \"dnsmasq-dns-5c7b6c5df9-55dtq\" (UID: \"ec356daa-f053-45d9-8297-1df7fa8621a0\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" Mar 14 08:52:27 crc kubenswrapper[4886]: I0314 08:52:27.250410 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-55dtq\" (UID: \"ec356daa-f053-45d9-8297-1df7fa8621a0\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" Mar 14 08:52:27 crc kubenswrapper[4886]: I0314 08:52:27.251399 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-config\") pod \"dnsmasq-dns-5c7b6c5df9-55dtq\" (UID: \"ec356daa-f053-45d9-8297-1df7fa8621a0\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" Mar 14 08:52:27 crc kubenswrapper[4886]: I0314 08:52:27.251510 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-55dtq\" (UID: \"ec356daa-f053-45d9-8297-1df7fa8621a0\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" Mar 14 08:52:27 crc kubenswrapper[4886]: I0314 08:52:27.251511 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-55dtq\" (UID: \"ec356daa-f053-45d9-8297-1df7fa8621a0\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" Mar 14 08:52:27 crc kubenswrapper[4886]: I0314 08:52:27.251513 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-55dtq\" (UID: \"ec356daa-f053-45d9-8297-1df7fa8621a0\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" Mar 14 08:52:27 crc kubenswrapper[4886]: I0314 08:52:27.251604 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-55dtq\" (UID: \"ec356daa-f053-45d9-8297-1df7fa8621a0\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" Mar 14 08:52:27 crc kubenswrapper[4886]: I0314 08:52:27.271366 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lh79\" (UniqueName: \"kubernetes.io/projected/ec356daa-f053-45d9-8297-1df7fa8621a0-kube-api-access-8lh79\") pod \"dnsmasq-dns-5c7b6c5df9-55dtq\" (UID: \"ec356daa-f053-45d9-8297-1df7fa8621a0\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" Mar 14 08:52:27 crc kubenswrapper[4886]: I0314 08:52:27.285890 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" Mar 14 08:52:27 crc kubenswrapper[4886]: I0314 08:52:27.738549 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-55dtq"] Mar 14 08:52:27 crc kubenswrapper[4886]: W0314 08:52:27.741260 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec356daa_f053_45d9_8297_1df7fa8621a0.slice/crio-2173156f96691102c2fa4fbd0b0223d4a2638bb221e1b61c3491c9e89d4f3f79 WatchSource:0}: Error finding container 2173156f96691102c2fa4fbd0b0223d4a2638bb221e1b61c3491c9e89d4f3f79: Status 404 returned error can't find the container with id 2173156f96691102c2fa4fbd0b0223d4a2638bb221e1b61c3491c9e89d4f3f79 Mar 14 08:52:28 crc kubenswrapper[4886]: I0314 08:52:28.533533 4886 generic.go:334] "Generic (PLEG): container finished" podID="ec356daa-f053-45d9-8297-1df7fa8621a0" containerID="1efd48189dec685f2c073f2a412ef204b69509bd12949b87bac70fb1aaeae73b" exitCode=0 Mar 14 08:52:28 crc kubenswrapper[4886]: I0314 08:52:28.534019 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" event={"ID":"ec356daa-f053-45d9-8297-1df7fa8621a0","Type":"ContainerDied","Data":"1efd48189dec685f2c073f2a412ef204b69509bd12949b87bac70fb1aaeae73b"} Mar 14 08:52:28 crc kubenswrapper[4886]: I0314 08:52:28.534143 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" event={"ID":"ec356daa-f053-45d9-8297-1df7fa8621a0","Type":"ContainerStarted","Data":"2173156f96691102c2fa4fbd0b0223d4a2638bb221e1b61c3491c9e89d4f3f79"} Mar 14 08:52:29 crc kubenswrapper[4886]: I0314 08:52:29.043647 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:52:29 crc kubenswrapper[4886]: I0314 08:52:29.044540 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="d46d79e4-6108-4b59-8399-8062975ef0f3" containerName="ceilometer-central-agent" containerID="cri-o://d85ab20b27a45ceb89fd9b99734e64271c735796dc986570d9e3b9f1220148b6" gracePeriod=30 Mar 14 08:52:29 crc kubenswrapper[4886]: I0314 08:52:29.045327 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d46d79e4-6108-4b59-8399-8062975ef0f3" containerName="ceilometer-notification-agent" containerID="cri-o://67ce3fac6461b122ff29ecee87a5aed813e0a3730ef4c0f066325806982c6562" gracePeriod=30 Mar 14 08:52:29 crc kubenswrapper[4886]: I0314 08:52:29.045355 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d46d79e4-6108-4b59-8399-8062975ef0f3" containerName="sg-core" containerID="cri-o://ea665f665b7f8ae18642b7623c380d3edbf9419bfe94515517643c1cbcb74e7c" gracePeriod=30 Mar 14 08:52:29 crc kubenswrapper[4886]: I0314 08:52:29.045420 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d46d79e4-6108-4b59-8399-8062975ef0f3" containerName="proxy-httpd" containerID="cri-o://795e180c664833711c234291612d01f00620fcdcc6ec43700f98ef320bda4196" gracePeriod=30 Mar 14 08:52:29 crc kubenswrapper[4886]: I0314 08:52:29.145735 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="d46d79e4-6108-4b59-8399-8062975ef0f3" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.229:3000/\": read tcp 10.217.0.2:55954->10.217.0.229:3000: read: connection reset by peer" Mar 14 08:52:29 crc kubenswrapper[4886]: I0314 08:52:29.517537 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 08:52:29 crc kubenswrapper[4886]: I0314 08:52:29.545191 4886 generic.go:334] "Generic (PLEG): container finished" podID="d46d79e4-6108-4b59-8399-8062975ef0f3" containerID="795e180c664833711c234291612d01f00620fcdcc6ec43700f98ef320bda4196" 
exitCode=0 Mar 14 08:52:29 crc kubenswrapper[4886]: I0314 08:52:29.545224 4886 generic.go:334] "Generic (PLEG): container finished" podID="d46d79e4-6108-4b59-8399-8062975ef0f3" containerID="ea665f665b7f8ae18642b7623c380d3edbf9419bfe94515517643c1cbcb74e7c" exitCode=2 Mar 14 08:52:29 crc kubenswrapper[4886]: I0314 08:52:29.545234 4886 generic.go:334] "Generic (PLEG): container finished" podID="d46d79e4-6108-4b59-8399-8062975ef0f3" containerID="d85ab20b27a45ceb89fd9b99734e64271c735796dc986570d9e3b9f1220148b6" exitCode=0 Mar 14 08:52:29 crc kubenswrapper[4886]: I0314 08:52:29.545270 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d46d79e4-6108-4b59-8399-8062975ef0f3","Type":"ContainerDied","Data":"795e180c664833711c234291612d01f00620fcdcc6ec43700f98ef320bda4196"} Mar 14 08:52:29 crc kubenswrapper[4886]: I0314 08:52:29.545408 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d46d79e4-6108-4b59-8399-8062975ef0f3","Type":"ContainerDied","Data":"ea665f665b7f8ae18642b7623c380d3edbf9419bfe94515517643c1cbcb74e7c"} Mar 14 08:52:29 crc kubenswrapper[4886]: I0314 08:52:29.545428 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d46d79e4-6108-4b59-8399-8062975ef0f3","Type":"ContainerDied","Data":"d85ab20b27a45ceb89fd9b99734e64271c735796dc986570d9e3b9f1220148b6"} Mar 14 08:52:29 crc kubenswrapper[4886]: I0314 08:52:29.547208 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" event={"ID":"ec356daa-f053-45d9-8297-1df7fa8621a0","Type":"ContainerStarted","Data":"c5389940f167ee63e60741f69efef5ec57b5085d1e7db0fa8a7f377465cdf669"} Mar 14 08:52:29 crc kubenswrapper[4886]: I0314 08:52:29.547378 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="122fa5d6-e40f-49c2-befb-9ae5e8de5671" containerName="nova-api-log" 
containerID="cri-o://4024d85ca563973bb2be8952317f186f64ff63707adede34ef4fd9af5f178165" gracePeriod=30 Mar 14 08:52:29 crc kubenswrapper[4886]: I0314 08:52:29.547506 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="122fa5d6-e40f-49c2-befb-9ae5e8de5671" containerName="nova-api-api" containerID="cri-o://fdc87182a5db08a66f518ef9e32b57647666d05999f18275e1ffd755330d7cdb" gracePeriod=30 Mar 14 08:52:29 crc kubenswrapper[4886]: I0314 08:52:29.570588 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" podStartSLOduration=3.570543747 podStartE2EDuration="3.570543747s" podCreationTimestamp="2026-03-14 08:52:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:52:29.566211804 +0000 UTC m=+1484.814663461" watchObservedRunningTime="2026-03-14 08:52:29.570543747 +0000 UTC m=+1484.818995394" Mar 14 08:52:29 crc kubenswrapper[4886]: I0314 08:52:29.898876 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.063979 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.104446 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-sg-core-conf-yaml\") pod \"d46d79e4-6108-4b59-8399-8062975ef0f3\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.105092 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-ceilometer-tls-certs\") pod \"d46d79e4-6108-4b59-8399-8062975ef0f3\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.105151 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-scripts\") pod \"d46d79e4-6108-4b59-8399-8062975ef0f3\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.105179 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-combined-ca-bundle\") pod \"d46d79e4-6108-4b59-8399-8062975ef0f3\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.105312 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-config-data\") pod \"d46d79e4-6108-4b59-8399-8062975ef0f3\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.105382 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d46d79e4-6108-4b59-8399-8062975ef0f3-log-httpd\") pod \"d46d79e4-6108-4b59-8399-8062975ef0f3\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.105540 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d46d79e4-6108-4b59-8399-8062975ef0f3-run-httpd\") pod \"d46d79e4-6108-4b59-8399-8062975ef0f3\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.105565 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghphc\" (UniqueName: \"kubernetes.io/projected/d46d79e4-6108-4b59-8399-8062975ef0f3-kube-api-access-ghphc\") pod \"d46d79e4-6108-4b59-8399-8062975ef0f3\" (UID: \"d46d79e4-6108-4b59-8399-8062975ef0f3\") " Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.108609 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d46d79e4-6108-4b59-8399-8062975ef0f3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d46d79e4-6108-4b59-8399-8062975ef0f3" (UID: "d46d79e4-6108-4b59-8399-8062975ef0f3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.108967 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d46d79e4-6108-4b59-8399-8062975ef0f3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d46d79e4-6108-4b59-8399-8062975ef0f3" (UID: "d46d79e4-6108-4b59-8399-8062975ef0f3"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.118387 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d46d79e4-6108-4b59-8399-8062975ef0f3-kube-api-access-ghphc" (OuterVolumeSpecName: "kube-api-access-ghphc") pod "d46d79e4-6108-4b59-8399-8062975ef0f3" (UID: "d46d79e4-6108-4b59-8399-8062975ef0f3"). InnerVolumeSpecName "kube-api-access-ghphc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.149283 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-scripts" (OuterVolumeSpecName: "scripts") pod "d46d79e4-6108-4b59-8399-8062975ef0f3" (UID: "d46d79e4-6108-4b59-8399-8062975ef0f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.187045 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d46d79e4-6108-4b59-8399-8062975ef0f3" (UID: "d46d79e4-6108-4b59-8399-8062975ef0f3"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.206879 4886 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d46d79e4-6108-4b59-8399-8062975ef0f3-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.206913 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghphc\" (UniqueName: \"kubernetes.io/projected/d46d79e4-6108-4b59-8399-8062975ef0f3-kube-api-access-ghphc\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.206926 4886 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d46d79e4-6108-4b59-8399-8062975ef0f3-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.206936 4886 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.206944 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.208673 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "d46d79e4-6108-4b59-8399-8062975ef0f3" (UID: "d46d79e4-6108-4b59-8399-8062975ef0f3"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.230647 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d46d79e4-6108-4b59-8399-8062975ef0f3" (UID: "d46d79e4-6108-4b59-8399-8062975ef0f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.265767 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-config-data" (OuterVolumeSpecName: "config-data") pod "d46d79e4-6108-4b59-8399-8062975ef0f3" (UID: "d46d79e4-6108-4b59-8399-8062975ef0f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.309597 4886 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.310263 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.310421 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d46d79e4-6108-4b59-8399-8062975ef0f3-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.530810 4886 scope.go:117] "RemoveContainer" containerID="756de37cbd5f5e14f06c8cbf1d068336c4156f6c08dba8123691c00b5d05b006" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.558455 4886 generic.go:334] 
"Generic (PLEG): container finished" podID="d46d79e4-6108-4b59-8399-8062975ef0f3" containerID="67ce3fac6461b122ff29ecee87a5aed813e0a3730ef4c0f066325806982c6562" exitCode=0 Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.558526 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.558521 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d46d79e4-6108-4b59-8399-8062975ef0f3","Type":"ContainerDied","Data":"67ce3fac6461b122ff29ecee87a5aed813e0a3730ef4c0f066325806982c6562"} Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.558977 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d46d79e4-6108-4b59-8399-8062975ef0f3","Type":"ContainerDied","Data":"30bae7c939efcaabea6164b0e5b13e341b440f133a0e21d1ffa226ebf1ad0720"} Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.559022 4886 scope.go:117] "RemoveContainer" containerID="795e180c664833711c234291612d01f00620fcdcc6ec43700f98ef320bda4196" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.573861 4886 generic.go:334] "Generic (PLEG): container finished" podID="122fa5d6-e40f-49c2-befb-9ae5e8de5671" containerID="4024d85ca563973bb2be8952317f186f64ff63707adede34ef4fd9af5f178165" exitCode=143 Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.573933 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"122fa5d6-e40f-49c2-befb-9ae5e8de5671","Type":"ContainerDied","Data":"4024d85ca563973bb2be8952317f186f64ff63707adede34ef4fd9af5f178165"} Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.574357 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.632172 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 
14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.650349 4886 scope.go:117] "RemoveContainer" containerID="ea665f665b7f8ae18642b7623c380d3edbf9419bfe94515517643c1cbcb74e7c" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.652240 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.664843 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:52:30 crc kubenswrapper[4886]: E0314 08:52:30.665349 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d46d79e4-6108-4b59-8399-8062975ef0f3" containerName="ceilometer-central-agent" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.665366 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46d79e4-6108-4b59-8399-8062975ef0f3" containerName="ceilometer-central-agent" Mar 14 08:52:30 crc kubenswrapper[4886]: E0314 08:52:30.665401 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d46d79e4-6108-4b59-8399-8062975ef0f3" containerName="sg-core" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.665410 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46d79e4-6108-4b59-8399-8062975ef0f3" containerName="sg-core" Mar 14 08:52:30 crc kubenswrapper[4886]: E0314 08:52:30.665429 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d46d79e4-6108-4b59-8399-8062975ef0f3" containerName="ceilometer-notification-agent" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.665438 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46d79e4-6108-4b59-8399-8062975ef0f3" containerName="ceilometer-notification-agent" Mar 14 08:52:30 crc kubenswrapper[4886]: E0314 08:52:30.665465 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d46d79e4-6108-4b59-8399-8062975ef0f3" containerName="proxy-httpd" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.665472 4886 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="d46d79e4-6108-4b59-8399-8062975ef0f3" containerName="proxy-httpd" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.665688 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="d46d79e4-6108-4b59-8399-8062975ef0f3" containerName="ceilometer-notification-agent" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.665711 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="d46d79e4-6108-4b59-8399-8062975ef0f3" containerName="sg-core" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.665724 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="d46d79e4-6108-4b59-8399-8062975ef0f3" containerName="ceilometer-central-agent" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.665732 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="d46d79e4-6108-4b59-8399-8062975ef0f3" containerName="proxy-httpd" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.667938 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.670942 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.671281 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.674711 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.683386 4886 scope.go:117] "RemoveContainer" containerID="67ce3fac6461b122ff29ecee87a5aed813e0a3730ef4c0f066325806982c6562" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.695172 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.719268 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-config-data\") pod \"ceilometer-0\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " pod="openstack/ceilometer-0" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.719310 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-log-httpd\") pod \"ceilometer-0\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " pod="openstack/ceilometer-0" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.719353 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " 
pod="openstack/ceilometer-0" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.719414 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-scripts\") pod \"ceilometer-0\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " pod="openstack/ceilometer-0" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.719435 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn4t8\" (UniqueName: \"kubernetes.io/projected/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-kube-api-access-bn4t8\") pod \"ceilometer-0\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " pod="openstack/ceilometer-0" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.719474 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-run-httpd\") pod \"ceilometer-0\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " pod="openstack/ceilometer-0" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.719500 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " pod="openstack/ceilometer-0" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.719527 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " pod="openstack/ceilometer-0" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.735276 4886 scope.go:117] 
"RemoveContainer" containerID="d85ab20b27a45ceb89fd9b99734e64271c735796dc986570d9e3b9f1220148b6" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.765728 4886 scope.go:117] "RemoveContainer" containerID="795e180c664833711c234291612d01f00620fcdcc6ec43700f98ef320bda4196" Mar 14 08:52:30 crc kubenswrapper[4886]: E0314 08:52:30.766217 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"795e180c664833711c234291612d01f00620fcdcc6ec43700f98ef320bda4196\": container with ID starting with 795e180c664833711c234291612d01f00620fcdcc6ec43700f98ef320bda4196 not found: ID does not exist" containerID="795e180c664833711c234291612d01f00620fcdcc6ec43700f98ef320bda4196" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.766247 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"795e180c664833711c234291612d01f00620fcdcc6ec43700f98ef320bda4196"} err="failed to get container status \"795e180c664833711c234291612d01f00620fcdcc6ec43700f98ef320bda4196\": rpc error: code = NotFound desc = could not find container \"795e180c664833711c234291612d01f00620fcdcc6ec43700f98ef320bda4196\": container with ID starting with 795e180c664833711c234291612d01f00620fcdcc6ec43700f98ef320bda4196 not found: ID does not exist" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.766274 4886 scope.go:117] "RemoveContainer" containerID="ea665f665b7f8ae18642b7623c380d3edbf9419bfe94515517643c1cbcb74e7c" Mar 14 08:52:30 crc kubenswrapper[4886]: E0314 08:52:30.766495 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea665f665b7f8ae18642b7623c380d3edbf9419bfe94515517643c1cbcb74e7c\": container with ID starting with ea665f665b7f8ae18642b7623c380d3edbf9419bfe94515517643c1cbcb74e7c not found: ID does not exist" containerID="ea665f665b7f8ae18642b7623c380d3edbf9419bfe94515517643c1cbcb74e7c" Mar 14 08:52:30 crc 
kubenswrapper[4886]: I0314 08:52:30.766525 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea665f665b7f8ae18642b7623c380d3edbf9419bfe94515517643c1cbcb74e7c"} err="failed to get container status \"ea665f665b7f8ae18642b7623c380d3edbf9419bfe94515517643c1cbcb74e7c\": rpc error: code = NotFound desc = could not find container \"ea665f665b7f8ae18642b7623c380d3edbf9419bfe94515517643c1cbcb74e7c\": container with ID starting with ea665f665b7f8ae18642b7623c380d3edbf9419bfe94515517643c1cbcb74e7c not found: ID does not exist" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.766539 4886 scope.go:117] "RemoveContainer" containerID="67ce3fac6461b122ff29ecee87a5aed813e0a3730ef4c0f066325806982c6562" Mar 14 08:52:30 crc kubenswrapper[4886]: E0314 08:52:30.766748 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67ce3fac6461b122ff29ecee87a5aed813e0a3730ef4c0f066325806982c6562\": container with ID starting with 67ce3fac6461b122ff29ecee87a5aed813e0a3730ef4c0f066325806982c6562 not found: ID does not exist" containerID="67ce3fac6461b122ff29ecee87a5aed813e0a3730ef4c0f066325806982c6562" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.766771 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67ce3fac6461b122ff29ecee87a5aed813e0a3730ef4c0f066325806982c6562"} err="failed to get container status \"67ce3fac6461b122ff29ecee87a5aed813e0a3730ef4c0f066325806982c6562\": rpc error: code = NotFound desc = could not find container \"67ce3fac6461b122ff29ecee87a5aed813e0a3730ef4c0f066325806982c6562\": container with ID starting with 67ce3fac6461b122ff29ecee87a5aed813e0a3730ef4c0f066325806982c6562 not found: ID does not exist" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.766785 4886 scope.go:117] "RemoveContainer" containerID="d85ab20b27a45ceb89fd9b99734e64271c735796dc986570d9e3b9f1220148b6" Mar 14 
08:52:30 crc kubenswrapper[4886]: E0314 08:52:30.767099 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d85ab20b27a45ceb89fd9b99734e64271c735796dc986570d9e3b9f1220148b6\": container with ID starting with d85ab20b27a45ceb89fd9b99734e64271c735796dc986570d9e3b9f1220148b6 not found: ID does not exist" containerID="d85ab20b27a45ceb89fd9b99734e64271c735796dc986570d9e3b9f1220148b6" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.767133 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d85ab20b27a45ceb89fd9b99734e64271c735796dc986570d9e3b9f1220148b6"} err="failed to get container status \"d85ab20b27a45ceb89fd9b99734e64271c735796dc986570d9e3b9f1220148b6\": rpc error: code = NotFound desc = could not find container \"d85ab20b27a45ceb89fd9b99734e64271c735796dc986570d9e3b9f1220148b6\": container with ID starting with d85ab20b27a45ceb89fd9b99734e64271c735796dc986570d9e3b9f1220148b6 not found: ID does not exist" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.820943 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-config-data\") pod \"ceilometer-0\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " pod="openstack/ceilometer-0" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.820986 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-log-httpd\") pod \"ceilometer-0\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " pod="openstack/ceilometer-0" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.821030 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " pod="openstack/ceilometer-0" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.821767 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-scripts\") pod \"ceilometer-0\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " pod="openstack/ceilometer-0" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.821799 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn4t8\" (UniqueName: \"kubernetes.io/projected/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-kube-api-access-bn4t8\") pod \"ceilometer-0\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " pod="openstack/ceilometer-0" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.821840 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-run-httpd\") pod \"ceilometer-0\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " pod="openstack/ceilometer-0" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.821867 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " pod="openstack/ceilometer-0" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.821892 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " pod="openstack/ceilometer-0" Mar 14 08:52:30 crc 
kubenswrapper[4886]: I0314 08:52:30.821585 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-log-httpd\") pod \"ceilometer-0\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " pod="openstack/ceilometer-0" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.825428 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-run-httpd\") pod \"ceilometer-0\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " pod="openstack/ceilometer-0" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.825428 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-config-data\") pod \"ceilometer-0\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " pod="openstack/ceilometer-0" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.829985 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " pod="openstack/ceilometer-0" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.830793 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " pod="openstack/ceilometer-0" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.832502 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-scripts\") pod \"ceilometer-0\" (UID: 
\"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " pod="openstack/ceilometer-0" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.838335 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " pod="openstack/ceilometer-0" Mar 14 08:52:30 crc kubenswrapper[4886]: I0314 08:52:30.845241 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn4t8\" (UniqueName: \"kubernetes.io/projected/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-kube-api-access-bn4t8\") pod \"ceilometer-0\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " pod="openstack/ceilometer-0" Mar 14 08:52:31 crc kubenswrapper[4886]: I0314 08:52:31.019838 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 14 08:52:31 crc kubenswrapper[4886]: I0314 08:52:31.258152 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:52:31 crc kubenswrapper[4886]: I0314 08:52:31.436967 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d46d79e4-6108-4b59-8399-8062975ef0f3" path="/var/lib/kubelet/pods/d46d79e4-6108-4b59-8399-8062975ef0f3/volumes" Mar 14 08:52:31 crc kubenswrapper[4886]: I0314 08:52:31.478784 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 14 08:52:31 crc kubenswrapper[4886]: W0314 08:52:31.489619 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6dbe3aa4_6aa6_43bd_89ec_5e549667b02b.slice/crio-b0ba87029f8ac7738246c892aa53908f85abdd84dee24fd8c0216c1578f5b2b2 WatchSource:0}: Error finding container b0ba87029f8ac7738246c892aa53908f85abdd84dee24fd8c0216c1578f5b2b2: Status 404 returned error can't find the container with id 
b0ba87029f8ac7738246c892aa53908f85abdd84dee24fd8c0216c1578f5b2b2 Mar 14 08:52:31 crc kubenswrapper[4886]: I0314 08:52:31.584419 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b","Type":"ContainerStarted","Data":"b0ba87029f8ac7738246c892aa53908f85abdd84dee24fd8c0216c1578f5b2b2"} Mar 14 08:52:32 crc kubenswrapper[4886]: I0314 08:52:32.593389 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b","Type":"ContainerStarted","Data":"fdf93410e70fb0ee72eaa42914cc4ae6b6ea838f68db3ddab0e12a55e1214619"} Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.123784 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.169829 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122fa5d6-e40f-49c2-befb-9ae5e8de5671-combined-ca-bundle\") pod \"122fa5d6-e40f-49c2-befb-9ae5e8de5671\" (UID: \"122fa5d6-e40f-49c2-befb-9ae5e8de5671\") " Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.169966 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/122fa5d6-e40f-49c2-befb-9ae5e8de5671-config-data\") pod \"122fa5d6-e40f-49c2-befb-9ae5e8de5671\" (UID: \"122fa5d6-e40f-49c2-befb-9ae5e8de5671\") " Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.170097 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/122fa5d6-e40f-49c2-befb-9ae5e8de5671-logs\") pod \"122fa5d6-e40f-49c2-befb-9ae5e8de5671\" (UID: \"122fa5d6-e40f-49c2-befb-9ae5e8de5671\") " Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.170213 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-jnt5h\" (UniqueName: \"kubernetes.io/projected/122fa5d6-e40f-49c2-befb-9ae5e8de5671-kube-api-access-jnt5h\") pod \"122fa5d6-e40f-49c2-befb-9ae5e8de5671\" (UID: \"122fa5d6-e40f-49c2-befb-9ae5e8de5671\") " Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.171245 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/122fa5d6-e40f-49c2-befb-9ae5e8de5671-logs" (OuterVolumeSpecName: "logs") pod "122fa5d6-e40f-49c2-befb-9ae5e8de5671" (UID: "122fa5d6-e40f-49c2-befb-9ae5e8de5671"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.173557 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/122fa5d6-e40f-49c2-befb-9ae5e8de5671-kube-api-access-jnt5h" (OuterVolumeSpecName: "kube-api-access-jnt5h") pod "122fa5d6-e40f-49c2-befb-9ae5e8de5671" (UID: "122fa5d6-e40f-49c2-befb-9ae5e8de5671"). InnerVolumeSpecName "kube-api-access-jnt5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.200464 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/122fa5d6-e40f-49c2-befb-9ae5e8de5671-config-data" (OuterVolumeSpecName: "config-data") pod "122fa5d6-e40f-49c2-befb-9ae5e8de5671" (UID: "122fa5d6-e40f-49c2-befb-9ae5e8de5671"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.208172 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/122fa5d6-e40f-49c2-befb-9ae5e8de5671-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "122fa5d6-e40f-49c2-befb-9ae5e8de5671" (UID: "122fa5d6-e40f-49c2-befb-9ae5e8de5671"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.272265 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnt5h\" (UniqueName: \"kubernetes.io/projected/122fa5d6-e40f-49c2-befb-9ae5e8de5671-kube-api-access-jnt5h\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.272297 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122fa5d6-e40f-49c2-befb-9ae5e8de5671-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.272306 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/122fa5d6-e40f-49c2-befb-9ae5e8de5671-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.272315 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/122fa5d6-e40f-49c2-befb-9ae5e8de5671-logs\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.606637 4886 generic.go:334] "Generic (PLEG): container finished" podID="122fa5d6-e40f-49c2-befb-9ae5e8de5671" containerID="fdc87182a5db08a66f518ef9e32b57647666d05999f18275e1ffd755330d7cdb" exitCode=0 Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.606696 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"122fa5d6-e40f-49c2-befb-9ae5e8de5671","Type":"ContainerDied","Data":"fdc87182a5db08a66f518ef9e32b57647666d05999f18275e1ffd755330d7cdb"} Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.606723 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"122fa5d6-e40f-49c2-befb-9ae5e8de5671","Type":"ContainerDied","Data":"49a6117bcc9f3a9c3221fde8dc677fb52713de41037d086915715b02f47d736e"} Mar 14 08:52:33 crc kubenswrapper[4886]: 
I0314 08:52:33.606739 4886 scope.go:117] "RemoveContainer" containerID="fdc87182a5db08a66f518ef9e32b57647666d05999f18275e1ffd755330d7cdb" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.606847 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.611483 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b","Type":"ContainerStarted","Data":"0a4cfdd70492f8086340720be8ee6581d4a9cf019f2bdb84200efe061c2280a2"} Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.640882 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.642514 4886 scope.go:117] "RemoveContainer" containerID="4024d85ca563973bb2be8952317f186f64ff63707adede34ef4fd9af5f178165" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.655276 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.679426 4886 scope.go:117] "RemoveContainer" containerID="fdc87182a5db08a66f518ef9e32b57647666d05999f18275e1ffd755330d7cdb" Mar 14 08:52:33 crc kubenswrapper[4886]: E0314 08:52:33.681282 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdc87182a5db08a66f518ef9e32b57647666d05999f18275e1ffd755330d7cdb\": container with ID starting with fdc87182a5db08a66f518ef9e32b57647666d05999f18275e1ffd755330d7cdb not found: ID does not exist" containerID="fdc87182a5db08a66f518ef9e32b57647666d05999f18275e1ffd755330d7cdb" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.681316 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdc87182a5db08a66f518ef9e32b57647666d05999f18275e1ffd755330d7cdb"} err="failed to get container status 
\"fdc87182a5db08a66f518ef9e32b57647666d05999f18275e1ffd755330d7cdb\": rpc error: code = NotFound desc = could not find container \"fdc87182a5db08a66f518ef9e32b57647666d05999f18275e1ffd755330d7cdb\": container with ID starting with fdc87182a5db08a66f518ef9e32b57647666d05999f18275e1ffd755330d7cdb not found: ID does not exist" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.681337 4886 scope.go:117] "RemoveContainer" containerID="4024d85ca563973bb2be8952317f186f64ff63707adede34ef4fd9af5f178165" Mar 14 08:52:33 crc kubenswrapper[4886]: E0314 08:52:33.681617 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4024d85ca563973bb2be8952317f186f64ff63707adede34ef4fd9af5f178165\": container with ID starting with 4024d85ca563973bb2be8952317f186f64ff63707adede34ef4fd9af5f178165 not found: ID does not exist" containerID="4024d85ca563973bb2be8952317f186f64ff63707adede34ef4fd9af5f178165" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.681656 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4024d85ca563973bb2be8952317f186f64ff63707adede34ef4fd9af5f178165"} err="failed to get container status \"4024d85ca563973bb2be8952317f186f64ff63707adede34ef4fd9af5f178165\": rpc error: code = NotFound desc = could not find container \"4024d85ca563973bb2be8952317f186f64ff63707adede34ef4fd9af5f178165\": container with ID starting with 4024d85ca563973bb2be8952317f186f64ff63707adede34ef4fd9af5f178165 not found: ID does not exist" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.704324 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 14 08:52:33 crc kubenswrapper[4886]: E0314 08:52:33.705206 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="122fa5d6-e40f-49c2-befb-9ae5e8de5671" containerName="nova-api-log" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.705239 4886 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="122fa5d6-e40f-49c2-befb-9ae5e8de5671" containerName="nova-api-log" Mar 14 08:52:33 crc kubenswrapper[4886]: E0314 08:52:33.705256 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="122fa5d6-e40f-49c2-befb-9ae5e8de5671" containerName="nova-api-api" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.705262 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="122fa5d6-e40f-49c2-befb-9ae5e8de5671" containerName="nova-api-api" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.705443 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="122fa5d6-e40f-49c2-befb-9ae5e8de5671" containerName="nova-api-api" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.705458 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="122fa5d6-e40f-49c2-befb-9ae5e8de5671" containerName="nova-api-log" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.706716 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.708722 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.709382 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.709546 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.717292 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.782305 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt76h\" (UniqueName: \"kubernetes.io/projected/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-kube-api-access-pt76h\") pod \"nova-api-0\" (UID: \"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\") " pod="openstack/nova-api-0" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.782369 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\") " pod="openstack/nova-api-0" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.782742 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-logs\") pod \"nova-api-0\" (UID: \"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\") " pod="openstack/nova-api-0" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.782901 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-config-data\") pod \"nova-api-0\" (UID: \"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\") " pod="openstack/nova-api-0" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.782944 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\") " pod="openstack/nova-api-0" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.782980 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-public-tls-certs\") pod \"nova-api-0\" (UID: \"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\") " pod="openstack/nova-api-0" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.884779 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-config-data\") pod \"nova-api-0\" (UID: \"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\") " pod="openstack/nova-api-0" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.884817 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\") " pod="openstack/nova-api-0" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.884838 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-public-tls-certs\") pod \"nova-api-0\" (UID: \"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\") " pod="openstack/nova-api-0" Mar 14 
08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.884903 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt76h\" (UniqueName: \"kubernetes.io/projected/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-kube-api-access-pt76h\") pod \"nova-api-0\" (UID: \"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\") " pod="openstack/nova-api-0" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.884930 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\") " pod="openstack/nova-api-0" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.884996 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-logs\") pod \"nova-api-0\" (UID: \"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\") " pod="openstack/nova-api-0" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.885409 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-logs\") pod \"nova-api-0\" (UID: \"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\") " pod="openstack/nova-api-0" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.889600 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\") " pod="openstack/nova-api-0" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.893432 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-config-data\") pod \"nova-api-0\" (UID: 
\"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\") " pod="openstack/nova-api-0" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.894496 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-public-tls-certs\") pod \"nova-api-0\" (UID: \"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\") " pod="openstack/nova-api-0" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.894983 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\") " pod="openstack/nova-api-0" Mar 14 08:52:33 crc kubenswrapper[4886]: I0314 08:52:33.903547 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt76h\" (UniqueName: \"kubernetes.io/projected/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-kube-api-access-pt76h\") pod \"nova-api-0\" (UID: \"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\") " pod="openstack/nova-api-0" Mar 14 08:52:34 crc kubenswrapper[4886]: I0314 08:52:34.045602 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 08:52:34 crc kubenswrapper[4886]: I0314 08:52:34.536366 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 08:52:34 crc kubenswrapper[4886]: W0314 08:52:34.539886 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff8f306c_569a_4bf1_b6bf_bea8d0cf55e0.slice/crio-66317294a1f9c78c4d10627ba6075e4e984d5626039cbf70287ba21c157a57da WatchSource:0}: Error finding container 66317294a1f9c78c4d10627ba6075e4e984d5626039cbf70287ba21c157a57da: Status 404 returned error can't find the container with id 66317294a1f9c78c4d10627ba6075e4e984d5626039cbf70287ba21c157a57da Mar 14 08:52:34 crc kubenswrapper[4886]: I0314 08:52:34.620113 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0","Type":"ContainerStarted","Data":"66317294a1f9c78c4d10627ba6075e4e984d5626039cbf70287ba21c157a57da"} Mar 14 08:52:34 crc kubenswrapper[4886]: I0314 08:52:34.628317 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b","Type":"ContainerStarted","Data":"f3c48b74802dceb2f9e91de28bc034fc5889de66fc2c84c993c2d754440871f0"} Mar 14 08:52:34 crc kubenswrapper[4886]: I0314 08:52:34.899019 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:52:34 crc kubenswrapper[4886]: I0314 08:52:34.918221 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:52:35 crc kubenswrapper[4886]: I0314 08:52:35.454631 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="122fa5d6-e40f-49c2-befb-9ae5e8de5671" path="/var/lib/kubelet/pods/122fa5d6-e40f-49c2-befb-9ae5e8de5671/volumes" Mar 14 08:52:35 crc kubenswrapper[4886]: I0314 
08:52:35.643790 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b","Type":"ContainerStarted","Data":"299e8007ec0ea2617cb03f2e9e5ec10edbfe3ca69ac94163cc4da94555aac5a6"} Mar 14 08:52:35 crc kubenswrapper[4886]: I0314 08:52:35.644159 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" containerName="ceilometer-central-agent" containerID="cri-o://fdf93410e70fb0ee72eaa42914cc4ae6b6ea838f68db3ddab0e12a55e1214619" gracePeriod=30 Mar 14 08:52:35 crc kubenswrapper[4886]: I0314 08:52:35.644310 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 14 08:52:35 crc kubenswrapper[4886]: I0314 08:52:35.644744 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" containerName="proxy-httpd" containerID="cri-o://299e8007ec0ea2617cb03f2e9e5ec10edbfe3ca69ac94163cc4da94555aac5a6" gracePeriod=30 Mar 14 08:52:35 crc kubenswrapper[4886]: I0314 08:52:35.644805 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" containerName="sg-core" containerID="cri-o://f3c48b74802dceb2f9e91de28bc034fc5889de66fc2c84c993c2d754440871f0" gracePeriod=30 Mar 14 08:52:35 crc kubenswrapper[4886]: I0314 08:52:35.644857 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" containerName="ceilometer-notification-agent" containerID="cri-o://0a4cfdd70492f8086340720be8ee6581d4a9cf019f2bdb84200efe061c2280a2" gracePeriod=30 Mar 14 08:52:35 crc kubenswrapper[4886]: I0314 08:52:35.656216 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0","Type":"ContainerStarted","Data":"d5c978e3aa98fcfe0d5b0ce9104a84cfe2367aa91b8d59e5815ba0e79fdc0583"} Mar 14 08:52:35 crc kubenswrapper[4886]: I0314 08:52:35.656322 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0","Type":"ContainerStarted","Data":"4a04a60fbe93dd5c71ffdfc8044d8e397013a5957b6d677517298ca50b2d4f47"} Mar 14 08:52:35 crc kubenswrapper[4886]: I0314 08:52:35.667604 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.820690569 podStartE2EDuration="5.667586903s" podCreationTimestamp="2026-03-14 08:52:30 +0000 UTC" firstStartedPulling="2026-03-14 08:52:31.492992116 +0000 UTC m=+1486.741443753" lastFinishedPulling="2026-03-14 08:52:35.33988845 +0000 UTC m=+1490.588340087" observedRunningTime="2026-03-14 08:52:35.663977361 +0000 UTC m=+1490.912428998" watchObservedRunningTime="2026-03-14 08:52:35.667586903 +0000 UTC m=+1490.916038530" Mar 14 08:52:35 crc kubenswrapper[4886]: I0314 08:52:35.676078 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 14 08:52:35 crc kubenswrapper[4886]: I0314 08:52:35.688039 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.688013433 podStartE2EDuration="2.688013433s" podCreationTimestamp="2026-03-14 08:52:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:52:35.684763611 +0000 UTC m=+1490.933215248" watchObservedRunningTime="2026-03-14 08:52:35.688013433 +0000 UTC m=+1490.936465070" Mar 14 08:52:35 crc kubenswrapper[4886]: I0314 08:52:35.839467 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-r5gnt"] Mar 14 08:52:35 crc kubenswrapper[4886]: I0314 
08:52:35.841062 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-r5gnt" Mar 14 08:52:35 crc kubenswrapper[4886]: I0314 08:52:35.843069 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 14 08:52:35 crc kubenswrapper[4886]: I0314 08:52:35.844326 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 14 08:52:35 crc kubenswrapper[4886]: I0314 08:52:35.859991 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-r5gnt"] Mar 14 08:52:35 crc kubenswrapper[4886]: I0314 08:52:35.925840 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392f2ed5-f494-4474-832a-a208bd72b1fa-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-r5gnt\" (UID: \"392f2ed5-f494-4474-832a-a208bd72b1fa\") " pod="openstack/nova-cell1-cell-mapping-r5gnt" Mar 14 08:52:35 crc kubenswrapper[4886]: I0314 08:52:35.925914 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392f2ed5-f494-4474-832a-a208bd72b1fa-config-data\") pod \"nova-cell1-cell-mapping-r5gnt\" (UID: \"392f2ed5-f494-4474-832a-a208bd72b1fa\") " pod="openstack/nova-cell1-cell-mapping-r5gnt" Mar 14 08:52:35 crc kubenswrapper[4886]: I0314 08:52:35.926091 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/392f2ed5-f494-4474-832a-a208bd72b1fa-scripts\") pod \"nova-cell1-cell-mapping-r5gnt\" (UID: \"392f2ed5-f494-4474-832a-a208bd72b1fa\") " pod="openstack/nova-cell1-cell-mapping-r5gnt" Mar 14 08:52:35 crc kubenswrapper[4886]: I0314 08:52:35.926168 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-66qgb\" (UniqueName: \"kubernetes.io/projected/392f2ed5-f494-4474-832a-a208bd72b1fa-kube-api-access-66qgb\") pod \"nova-cell1-cell-mapping-r5gnt\" (UID: \"392f2ed5-f494-4474-832a-a208bd72b1fa\") " pod="openstack/nova-cell1-cell-mapping-r5gnt" Mar 14 08:52:36 crc kubenswrapper[4886]: I0314 08:52:36.028364 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392f2ed5-f494-4474-832a-a208bd72b1fa-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-r5gnt\" (UID: \"392f2ed5-f494-4474-832a-a208bd72b1fa\") " pod="openstack/nova-cell1-cell-mapping-r5gnt" Mar 14 08:52:36 crc kubenswrapper[4886]: I0314 08:52:36.028421 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392f2ed5-f494-4474-832a-a208bd72b1fa-config-data\") pod \"nova-cell1-cell-mapping-r5gnt\" (UID: \"392f2ed5-f494-4474-832a-a208bd72b1fa\") " pod="openstack/nova-cell1-cell-mapping-r5gnt" Mar 14 08:52:36 crc kubenswrapper[4886]: I0314 08:52:36.028514 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/392f2ed5-f494-4474-832a-a208bd72b1fa-scripts\") pod \"nova-cell1-cell-mapping-r5gnt\" (UID: \"392f2ed5-f494-4474-832a-a208bd72b1fa\") " pod="openstack/nova-cell1-cell-mapping-r5gnt" Mar 14 08:52:36 crc kubenswrapper[4886]: I0314 08:52:36.028542 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66qgb\" (UniqueName: \"kubernetes.io/projected/392f2ed5-f494-4474-832a-a208bd72b1fa-kube-api-access-66qgb\") pod \"nova-cell1-cell-mapping-r5gnt\" (UID: \"392f2ed5-f494-4474-832a-a208bd72b1fa\") " pod="openstack/nova-cell1-cell-mapping-r5gnt" Mar 14 08:52:36 crc kubenswrapper[4886]: I0314 08:52:36.033332 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/392f2ed5-f494-4474-832a-a208bd72b1fa-scripts\") pod \"nova-cell1-cell-mapping-r5gnt\" (UID: \"392f2ed5-f494-4474-832a-a208bd72b1fa\") " pod="openstack/nova-cell1-cell-mapping-r5gnt" Mar 14 08:52:36 crc kubenswrapper[4886]: I0314 08:52:36.035783 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392f2ed5-f494-4474-832a-a208bd72b1fa-config-data\") pod \"nova-cell1-cell-mapping-r5gnt\" (UID: \"392f2ed5-f494-4474-832a-a208bd72b1fa\") " pod="openstack/nova-cell1-cell-mapping-r5gnt" Mar 14 08:52:36 crc kubenswrapper[4886]: I0314 08:52:36.037730 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392f2ed5-f494-4474-832a-a208bd72b1fa-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-r5gnt\" (UID: \"392f2ed5-f494-4474-832a-a208bd72b1fa\") " pod="openstack/nova-cell1-cell-mapping-r5gnt" Mar 14 08:52:36 crc kubenswrapper[4886]: I0314 08:52:36.051621 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66qgb\" (UniqueName: \"kubernetes.io/projected/392f2ed5-f494-4474-832a-a208bd72b1fa-kube-api-access-66qgb\") pod \"nova-cell1-cell-mapping-r5gnt\" (UID: \"392f2ed5-f494-4474-832a-a208bd72b1fa\") " pod="openstack/nova-cell1-cell-mapping-r5gnt" Mar 14 08:52:36 crc kubenswrapper[4886]: I0314 08:52:36.158652 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-r5gnt" Mar 14 08:52:36 crc kubenswrapper[4886]: I0314 08:52:36.667579 4886 generic.go:334] "Generic (PLEG): container finished" podID="6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" containerID="f3c48b74802dceb2f9e91de28bc034fc5889de66fc2c84c993c2d754440871f0" exitCode=2 Mar 14 08:52:36 crc kubenswrapper[4886]: I0314 08:52:36.668027 4886 generic.go:334] "Generic (PLEG): container finished" podID="6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" containerID="0a4cfdd70492f8086340720be8ee6581d4a9cf019f2bdb84200efe061c2280a2" exitCode=0 Mar 14 08:52:36 crc kubenswrapper[4886]: I0314 08:52:36.667649 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b","Type":"ContainerDied","Data":"f3c48b74802dceb2f9e91de28bc034fc5889de66fc2c84c993c2d754440871f0"} Mar 14 08:52:36 crc kubenswrapper[4886]: I0314 08:52:36.668070 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b","Type":"ContainerDied","Data":"0a4cfdd70492f8086340720be8ee6581d4a9cf019f2bdb84200efe061c2280a2"} Mar 14 08:52:36 crc kubenswrapper[4886]: W0314 08:52:36.667641 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod392f2ed5_f494_4474_832a_a208bd72b1fa.slice/crio-f361a80953aaf1ab768ee05fb9f2ce7bb9b0ad8fe010f6663af5670dc3569c73 WatchSource:0}: Error finding container f361a80953aaf1ab768ee05fb9f2ce7bb9b0ad8fe010f6663af5670dc3569c73: Status 404 returned error can't find the container with id f361a80953aaf1ab768ee05fb9f2ce7bb9b0ad8fe010f6663af5670dc3569c73 Mar 14 08:52:36 crc kubenswrapper[4886]: I0314 08:52:36.669494 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-r5gnt"] Mar 14 08:52:37 crc kubenswrapper[4886]: I0314 08:52:37.287230 4886 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" Mar 14 08:52:37 crc kubenswrapper[4886]: I0314 08:52:37.375133 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-vd5jl"] Mar 14 08:52:37 crc kubenswrapper[4886]: I0314 08:52:37.375822 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" podUID="9fa7a09a-b8ef-4cdc-a4ce-93287d730311" containerName="dnsmasq-dns" containerID="cri-o://3f72fd49cfb9a8051e9104eca68870c12e343f61115936ad9291981418484637" gracePeriod=10 Mar 14 08:52:37 crc kubenswrapper[4886]: E0314 08:52:37.499880 4886 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fa7a09a_b8ef_4cdc_a4ce_93287d730311.slice/crio-3f72fd49cfb9a8051e9104eca68870c12e343f61115936ad9291981418484637.scope\": RecentStats: unable to find data in memory cache]" Mar 14 08:52:37 crc kubenswrapper[4886]: I0314 08:52:37.697296 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-r5gnt" event={"ID":"392f2ed5-f494-4474-832a-a208bd72b1fa","Type":"ContainerStarted","Data":"5a080999a098b752548f15b262235bd5172c50585c336f8b1d3a19c433046030"} Mar 14 08:52:37 crc kubenswrapper[4886]: I0314 08:52:37.698242 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-r5gnt" event={"ID":"392f2ed5-f494-4474-832a-a208bd72b1fa","Type":"ContainerStarted","Data":"f361a80953aaf1ab768ee05fb9f2ce7bb9b0ad8fe010f6663af5670dc3569c73"} Mar 14 08:52:37 crc kubenswrapper[4886]: I0314 08:52:37.704680 4886 generic.go:334] "Generic (PLEG): container finished" podID="9fa7a09a-b8ef-4cdc-a4ce-93287d730311" containerID="3f72fd49cfb9a8051e9104eca68870c12e343f61115936ad9291981418484637" exitCode=0 Mar 14 08:52:37 crc kubenswrapper[4886]: I0314 08:52:37.704734 4886 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" event={"ID":"9fa7a09a-b8ef-4cdc-a4ce-93287d730311","Type":"ContainerDied","Data":"3f72fd49cfb9a8051e9104eca68870c12e343f61115936ad9291981418484637"} Mar 14 08:52:37 crc kubenswrapper[4886]: I0314 08:52:37.720222 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-r5gnt" podStartSLOduration=2.720202068 podStartE2EDuration="2.720202068s" podCreationTimestamp="2026-03-14 08:52:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:52:37.711373487 +0000 UTC m=+1492.959825124" watchObservedRunningTime="2026-03-14 08:52:37.720202068 +0000 UTC m=+1492.968653705" Mar 14 08:52:37 crc kubenswrapper[4886]: I0314 08:52:37.930832 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" Mar 14 08:52:38 crc kubenswrapper[4886]: I0314 08:52:38.071372 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-dns-swift-storage-0\") pod \"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\" (UID: \"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\") " Mar 14 08:52:38 crc kubenswrapper[4886]: I0314 08:52:38.071441 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-config\") pod \"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\" (UID: \"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\") " Mar 14 08:52:38 crc kubenswrapper[4886]: I0314 08:52:38.071535 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-ovsdbserver-nb\") pod \"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\" (UID: 
\"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\") " Mar 14 08:52:38 crc kubenswrapper[4886]: I0314 08:52:38.071584 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-dns-svc\") pod \"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\" (UID: \"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\") " Mar 14 08:52:38 crc kubenswrapper[4886]: I0314 08:52:38.071616 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-ovsdbserver-sb\") pod \"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\" (UID: \"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\") " Mar 14 08:52:38 crc kubenswrapper[4886]: I0314 08:52:38.071759 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rkx6\" (UniqueName: \"kubernetes.io/projected/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-kube-api-access-4rkx6\") pod \"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\" (UID: \"9fa7a09a-b8ef-4cdc-a4ce-93287d730311\") " Mar 14 08:52:38 crc kubenswrapper[4886]: I0314 08:52:38.078260 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-kube-api-access-4rkx6" (OuterVolumeSpecName: "kube-api-access-4rkx6") pod "9fa7a09a-b8ef-4cdc-a4ce-93287d730311" (UID: "9fa7a09a-b8ef-4cdc-a4ce-93287d730311"). InnerVolumeSpecName "kube-api-access-4rkx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:52:38 crc kubenswrapper[4886]: I0314 08:52:38.145996 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9fa7a09a-b8ef-4cdc-a4ce-93287d730311" (UID: "9fa7a09a-b8ef-4cdc-a4ce-93287d730311"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:52:38 crc kubenswrapper[4886]: I0314 08:52:38.148333 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9fa7a09a-b8ef-4cdc-a4ce-93287d730311" (UID: "9fa7a09a-b8ef-4cdc-a4ce-93287d730311"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:52:38 crc kubenswrapper[4886]: I0314 08:52:38.151554 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9fa7a09a-b8ef-4cdc-a4ce-93287d730311" (UID: "9fa7a09a-b8ef-4cdc-a4ce-93287d730311"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:52:38 crc kubenswrapper[4886]: I0314 08:52:38.155447 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9fa7a09a-b8ef-4cdc-a4ce-93287d730311" (UID: "9fa7a09a-b8ef-4cdc-a4ce-93287d730311"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:52:38 crc kubenswrapper[4886]: I0314 08:52:38.170432 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-config" (OuterVolumeSpecName: "config") pod "9fa7a09a-b8ef-4cdc-a4ce-93287d730311" (UID: "9fa7a09a-b8ef-4cdc-a4ce-93287d730311"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:52:38 crc kubenswrapper[4886]: I0314 08:52:38.176774 4886 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:38 crc kubenswrapper[4886]: I0314 08:52:38.176820 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:38 crc kubenswrapper[4886]: I0314 08:52:38.176832 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:38 crc kubenswrapper[4886]: I0314 08:52:38.176844 4886 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:38 crc kubenswrapper[4886]: I0314 08:52:38.176852 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:38 crc kubenswrapper[4886]: I0314 08:52:38.176860 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rkx6\" (UniqueName: \"kubernetes.io/projected/9fa7a09a-b8ef-4cdc-a4ce-93287d730311-kube-api-access-4rkx6\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:38 crc kubenswrapper[4886]: I0314 08:52:38.722459 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" Mar 14 08:52:38 crc kubenswrapper[4886]: I0314 08:52:38.723519 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-vd5jl" event={"ID":"9fa7a09a-b8ef-4cdc-a4ce-93287d730311","Type":"ContainerDied","Data":"d46d00f96eb1be0391afb0324f1796b9d4b658dd46723bd128b3e4ed843344b6"} Mar 14 08:52:38 crc kubenswrapper[4886]: I0314 08:52:38.723587 4886 scope.go:117] "RemoveContainer" containerID="3f72fd49cfb9a8051e9104eca68870c12e343f61115936ad9291981418484637" Mar 14 08:52:38 crc kubenswrapper[4886]: I0314 08:52:38.744686 4886 scope.go:117] "RemoveContainer" containerID="dd06c67843b49a59a73609ae74916eb6ceab2503a9c018c6169b21b452de7303" Mar 14 08:52:38 crc kubenswrapper[4886]: I0314 08:52:38.764208 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-vd5jl"] Mar 14 08:52:38 crc kubenswrapper[4886]: I0314 08:52:38.776018 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-vd5jl"] Mar 14 08:52:39 crc kubenswrapper[4886]: I0314 08:52:39.432295 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fa7a09a-b8ef-4cdc-a4ce-93287d730311" path="/var/lib/kubelet/pods/9fa7a09a-b8ef-4cdc-a4ce-93287d730311/volumes" Mar 14 08:52:41 crc kubenswrapper[4886]: I0314 08:52:41.749018 4886 generic.go:334] "Generic (PLEG): container finished" podID="6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" containerID="fdf93410e70fb0ee72eaa42914cc4ae6b6ea838f68db3ddab0e12a55e1214619" exitCode=0 Mar 14 08:52:41 crc kubenswrapper[4886]: I0314 08:52:41.749160 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b","Type":"ContainerDied","Data":"fdf93410e70fb0ee72eaa42914cc4ae6b6ea838f68db3ddab0e12a55e1214619"} Mar 14 08:52:41 crc kubenswrapper[4886]: I0314 08:52:41.752565 4886 generic.go:334] "Generic (PLEG): container finished" 
podID="392f2ed5-f494-4474-832a-a208bd72b1fa" containerID="5a080999a098b752548f15b262235bd5172c50585c336f8b1d3a19c433046030" exitCode=0 Mar 14 08:52:41 crc kubenswrapper[4886]: I0314 08:52:41.752654 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-r5gnt" event={"ID":"392f2ed5-f494-4474-832a-a208bd72b1fa","Type":"ContainerDied","Data":"5a080999a098b752548f15b262235bd5172c50585c336f8b1d3a19c433046030"} Mar 14 08:52:43 crc kubenswrapper[4886]: I0314 08:52:43.150790 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-r5gnt" Mar 14 08:52:43 crc kubenswrapper[4886]: I0314 08:52:43.280030 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/392f2ed5-f494-4474-832a-a208bd72b1fa-scripts\") pod \"392f2ed5-f494-4474-832a-a208bd72b1fa\" (UID: \"392f2ed5-f494-4474-832a-a208bd72b1fa\") " Mar 14 08:52:43 crc kubenswrapper[4886]: I0314 08:52:43.280178 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392f2ed5-f494-4474-832a-a208bd72b1fa-config-data\") pod \"392f2ed5-f494-4474-832a-a208bd72b1fa\" (UID: \"392f2ed5-f494-4474-832a-a208bd72b1fa\") " Mar 14 08:52:43 crc kubenswrapper[4886]: I0314 08:52:43.280253 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392f2ed5-f494-4474-832a-a208bd72b1fa-combined-ca-bundle\") pod \"392f2ed5-f494-4474-832a-a208bd72b1fa\" (UID: \"392f2ed5-f494-4474-832a-a208bd72b1fa\") " Mar 14 08:52:43 crc kubenswrapper[4886]: I0314 08:52:43.280375 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66qgb\" (UniqueName: \"kubernetes.io/projected/392f2ed5-f494-4474-832a-a208bd72b1fa-kube-api-access-66qgb\") pod \"392f2ed5-f494-4474-832a-a208bd72b1fa\" 
(UID: \"392f2ed5-f494-4474-832a-a208bd72b1fa\") " Mar 14 08:52:43 crc kubenswrapper[4886]: I0314 08:52:43.286384 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392f2ed5-f494-4474-832a-a208bd72b1fa-scripts" (OuterVolumeSpecName: "scripts") pod "392f2ed5-f494-4474-832a-a208bd72b1fa" (UID: "392f2ed5-f494-4474-832a-a208bd72b1fa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:43 crc kubenswrapper[4886]: I0314 08:52:43.292397 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/392f2ed5-f494-4474-832a-a208bd72b1fa-kube-api-access-66qgb" (OuterVolumeSpecName: "kube-api-access-66qgb") pod "392f2ed5-f494-4474-832a-a208bd72b1fa" (UID: "392f2ed5-f494-4474-832a-a208bd72b1fa"). InnerVolumeSpecName "kube-api-access-66qgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:52:43 crc kubenswrapper[4886]: I0314 08:52:43.308355 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392f2ed5-f494-4474-832a-a208bd72b1fa-config-data" (OuterVolumeSpecName: "config-data") pod "392f2ed5-f494-4474-832a-a208bd72b1fa" (UID: "392f2ed5-f494-4474-832a-a208bd72b1fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:43 crc kubenswrapper[4886]: I0314 08:52:43.326918 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392f2ed5-f494-4474-832a-a208bd72b1fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "392f2ed5-f494-4474-832a-a208bd72b1fa" (UID: "392f2ed5-f494-4474-832a-a208bd72b1fa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:43 crc kubenswrapper[4886]: I0314 08:52:43.382478 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392f2ed5-f494-4474-832a-a208bd72b1fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:43 crc kubenswrapper[4886]: I0314 08:52:43.382511 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66qgb\" (UniqueName: \"kubernetes.io/projected/392f2ed5-f494-4474-832a-a208bd72b1fa-kube-api-access-66qgb\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:43 crc kubenswrapper[4886]: I0314 08:52:43.382524 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/392f2ed5-f494-4474-832a-a208bd72b1fa-scripts\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:43 crc kubenswrapper[4886]: I0314 08:52:43.382533 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392f2ed5-f494-4474-832a-a208bd72b1fa-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:43 crc kubenswrapper[4886]: I0314 08:52:43.778227 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-r5gnt" event={"ID":"392f2ed5-f494-4474-832a-a208bd72b1fa","Type":"ContainerDied","Data":"f361a80953aaf1ab768ee05fb9f2ce7bb9b0ad8fe010f6663af5670dc3569c73"} Mar 14 08:52:43 crc kubenswrapper[4886]: I0314 08:52:43.778477 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f361a80953aaf1ab768ee05fb9f2ce7bb9b0ad8fe010f6663af5670dc3569c73" Mar 14 08:52:43 crc kubenswrapper[4886]: I0314 08:52:43.778332 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-r5gnt" Mar 14 08:52:43 crc kubenswrapper[4886]: I0314 08:52:43.956966 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 08:52:43 crc kubenswrapper[4886]: I0314 08:52:43.957217 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0cc17f47-abd0-4f59-a626-d40a5a83f9cb" containerName="nova-scheduler-scheduler" containerID="cri-o://32661803f2b7965cbfdae0e522cc7c8bdbc9353b1148aba3b6f1082eb3d21856" gracePeriod=30 Mar 14 08:52:43 crc kubenswrapper[4886]: I0314 08:52:43.975157 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 08:52:43 crc kubenswrapper[4886]: I0314 08:52:43.975453 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0" containerName="nova-api-log" containerID="cri-o://4a04a60fbe93dd5c71ffdfc8044d8e397013a5957b6d677517298ca50b2d4f47" gracePeriod=30 Mar 14 08:52:43 crc kubenswrapper[4886]: I0314 08:52:43.975582 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0" containerName="nova-api-api" containerID="cri-o://d5c978e3aa98fcfe0d5b0ce9104a84cfe2367aa91b8d59e5815ba0e79fdc0583" gracePeriod=30 Mar 14 08:52:43 crc kubenswrapper[4886]: I0314 08:52:43.989522 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 08:52:43 crc kubenswrapper[4886]: I0314 08:52:43.989779 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="43a601c8-c71a-41d8-b877-75ef0e2ac892" containerName="nova-metadata-log" containerID="cri-o://a50341c7adca3532b22effb46bcacce9aed90f04e23384a5c2ed06b56bf4522f" gracePeriod=30 Mar 14 08:52:43 crc kubenswrapper[4886]: I0314 08:52:43.989882 4886 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="43a601c8-c71a-41d8-b877-75ef0e2ac892" containerName="nova-metadata-metadata" containerID="cri-o://2a691de6262171fb9d7fce2e541695269898b84653e49f72e90e0a6a94fd3e0e" gracePeriod=30 Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.588001 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.609416 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-internal-tls-certs\") pod \"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\" (UID: \"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\") " Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.609491 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt76h\" (UniqueName: \"kubernetes.io/projected/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-kube-api-access-pt76h\") pod \"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\" (UID: \"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\") " Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.609631 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-public-tls-certs\") pod \"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\" (UID: \"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\") " Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.609667 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-config-data\") pod \"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\" (UID: \"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\") " Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.609732 4886 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-combined-ca-bundle\") pod \"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\" (UID: \"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\") " Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.609779 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-logs\") pod \"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\" (UID: \"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0\") " Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.610241 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-logs" (OuterVolumeSpecName: "logs") pod "ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0" (UID: "ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.610700 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-logs\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.620337 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-kube-api-access-pt76h" (OuterVolumeSpecName: "kube-api-access-pt76h") pod "ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0" (UID: "ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0"). InnerVolumeSpecName "kube-api-access-pt76h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.641326 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0" (UID: "ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.658429 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-config-data" (OuterVolumeSpecName: "config-data") pod "ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0" (UID: "ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.665739 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0" (UID: "ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.684656 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0" (UID: "ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.712229 4886 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.712261 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt76h\" (UniqueName: \"kubernetes.io/projected/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-kube-api-access-pt76h\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.712280 4886 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.712289 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.712299 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.787882 4886 generic.go:334] "Generic (PLEG): container finished" podID="ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0" containerID="d5c978e3aa98fcfe0d5b0ce9104a84cfe2367aa91b8d59e5815ba0e79fdc0583" exitCode=0 Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.787911 4886 generic.go:334] "Generic (PLEG): container finished" podID="ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0" containerID="4a04a60fbe93dd5c71ffdfc8044d8e397013a5957b6d677517298ca50b2d4f47" exitCode=143 Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.787979 4886 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0","Type":"ContainerDied","Data":"d5c978e3aa98fcfe0d5b0ce9104a84cfe2367aa91b8d59e5815ba0e79fdc0583"} Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.788041 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0","Type":"ContainerDied","Data":"4a04a60fbe93dd5c71ffdfc8044d8e397013a5957b6d677517298ca50b2d4f47"} Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.788055 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0","Type":"ContainerDied","Data":"66317294a1f9c78c4d10627ba6075e4e984d5626039cbf70287ba21c157a57da"} Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.788075 4886 scope.go:117] "RemoveContainer" containerID="d5c978e3aa98fcfe0d5b0ce9104a84cfe2367aa91b8d59e5815ba0e79fdc0583" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.788436 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.789809 4886 generic.go:334] "Generic (PLEG): container finished" podID="43a601c8-c71a-41d8-b877-75ef0e2ac892" containerID="a50341c7adca3532b22effb46bcacce9aed90f04e23384a5c2ed06b56bf4522f" exitCode=143 Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.789848 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"43a601c8-c71a-41d8-b877-75ef0e2ac892","Type":"ContainerDied","Data":"a50341c7adca3532b22effb46bcacce9aed90f04e23384a5c2ed06b56bf4522f"} Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.811228 4886 scope.go:117] "RemoveContainer" containerID="4a04a60fbe93dd5c71ffdfc8044d8e397013a5957b6d677517298ca50b2d4f47" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.822643 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.833633 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.845264 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 14 08:52:44 crc kubenswrapper[4886]: E0314 08:52:44.845740 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fa7a09a-b8ef-4cdc-a4ce-93287d730311" containerName="init" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.845757 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fa7a09a-b8ef-4cdc-a4ce-93287d730311" containerName="init" Mar 14 08:52:44 crc kubenswrapper[4886]: E0314 08:52:44.845792 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0" containerName="nova-api-log" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.845799 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0" containerName="nova-api-log" Mar 14 
08:52:44 crc kubenswrapper[4886]: E0314 08:52:44.845807 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392f2ed5-f494-4474-832a-a208bd72b1fa" containerName="nova-manage" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.845814 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="392f2ed5-f494-4474-832a-a208bd72b1fa" containerName="nova-manage" Mar 14 08:52:44 crc kubenswrapper[4886]: E0314 08:52:44.845824 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0" containerName="nova-api-api" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.845830 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0" containerName="nova-api-api" Mar 14 08:52:44 crc kubenswrapper[4886]: E0314 08:52:44.845846 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fa7a09a-b8ef-4cdc-a4ce-93287d730311" containerName="dnsmasq-dns" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.845851 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fa7a09a-b8ef-4cdc-a4ce-93287d730311" containerName="dnsmasq-dns" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.846015 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0" containerName="nova-api-log" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.846039 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="392f2ed5-f494-4474-832a-a208bd72b1fa" containerName="nova-manage" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.846050 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fa7a09a-b8ef-4cdc-a4ce-93287d730311" containerName="dnsmasq-dns" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.846058 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0" containerName="nova-api-api" Mar 14 08:52:44 crc 
kubenswrapper[4886]: I0314 08:52:44.847146 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.847813 4886 scope.go:117] "RemoveContainer" containerID="d5c978e3aa98fcfe0d5b0ce9104a84cfe2367aa91b8d59e5815ba0e79fdc0583" Mar 14 08:52:44 crc kubenswrapper[4886]: E0314 08:52:44.848320 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5c978e3aa98fcfe0d5b0ce9104a84cfe2367aa91b8d59e5815ba0e79fdc0583\": container with ID starting with d5c978e3aa98fcfe0d5b0ce9104a84cfe2367aa91b8d59e5815ba0e79fdc0583 not found: ID does not exist" containerID="d5c978e3aa98fcfe0d5b0ce9104a84cfe2367aa91b8d59e5815ba0e79fdc0583" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.848357 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c978e3aa98fcfe0d5b0ce9104a84cfe2367aa91b8d59e5815ba0e79fdc0583"} err="failed to get container status \"d5c978e3aa98fcfe0d5b0ce9104a84cfe2367aa91b8d59e5815ba0e79fdc0583\": rpc error: code = NotFound desc = could not find container \"d5c978e3aa98fcfe0d5b0ce9104a84cfe2367aa91b8d59e5815ba0e79fdc0583\": container with ID starting with d5c978e3aa98fcfe0d5b0ce9104a84cfe2367aa91b8d59e5815ba0e79fdc0583 not found: ID does not exist" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.848386 4886 scope.go:117] "RemoveContainer" containerID="4a04a60fbe93dd5c71ffdfc8044d8e397013a5957b6d677517298ca50b2d4f47" Mar 14 08:52:44 crc kubenswrapper[4886]: E0314 08:52:44.848629 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a04a60fbe93dd5c71ffdfc8044d8e397013a5957b6d677517298ca50b2d4f47\": container with ID starting with 4a04a60fbe93dd5c71ffdfc8044d8e397013a5957b6d677517298ca50b2d4f47 not found: ID does not exist" 
containerID="4a04a60fbe93dd5c71ffdfc8044d8e397013a5957b6d677517298ca50b2d4f47" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.848655 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a04a60fbe93dd5c71ffdfc8044d8e397013a5957b6d677517298ca50b2d4f47"} err="failed to get container status \"4a04a60fbe93dd5c71ffdfc8044d8e397013a5957b6d677517298ca50b2d4f47\": rpc error: code = NotFound desc = could not find container \"4a04a60fbe93dd5c71ffdfc8044d8e397013a5957b6d677517298ca50b2d4f47\": container with ID starting with 4a04a60fbe93dd5c71ffdfc8044d8e397013a5957b6d677517298ca50b2d4f47 not found: ID does not exist" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.848672 4886 scope.go:117] "RemoveContainer" containerID="d5c978e3aa98fcfe0d5b0ce9104a84cfe2367aa91b8d59e5815ba0e79fdc0583" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.848917 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c978e3aa98fcfe0d5b0ce9104a84cfe2367aa91b8d59e5815ba0e79fdc0583"} err="failed to get container status \"d5c978e3aa98fcfe0d5b0ce9104a84cfe2367aa91b8d59e5815ba0e79fdc0583\": rpc error: code = NotFound desc = could not find container \"d5c978e3aa98fcfe0d5b0ce9104a84cfe2367aa91b8d59e5815ba0e79fdc0583\": container with ID starting with d5c978e3aa98fcfe0d5b0ce9104a84cfe2367aa91b8d59e5815ba0e79fdc0583 not found: ID does not exist" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.848943 4886 scope.go:117] "RemoveContainer" containerID="4a04a60fbe93dd5c71ffdfc8044d8e397013a5957b6d677517298ca50b2d4f47" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.849236 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a04a60fbe93dd5c71ffdfc8044d8e397013a5957b6d677517298ca50b2d4f47"} err="failed to get container status \"4a04a60fbe93dd5c71ffdfc8044d8e397013a5957b6d677517298ca50b2d4f47\": rpc error: code = NotFound desc = could 
not find container \"4a04a60fbe93dd5c71ffdfc8044d8e397013a5957b6d677517298ca50b2d4f47\": container with ID starting with 4a04a60fbe93dd5c71ffdfc8044d8e397013a5957b6d677517298ca50b2d4f47 not found: ID does not exist" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.850471 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.850970 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.853498 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 14 08:52:44 crc kubenswrapper[4886]: I0314 08:52:44.858936 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 08:52:45 crc kubenswrapper[4886]: I0314 08:52:45.017499 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d9f64aa-3e9d-422f-a81e-a22d00914728-public-tls-certs\") pod \"nova-api-0\" (UID: \"4d9f64aa-3e9d-422f-a81e-a22d00914728\") " pod="openstack/nova-api-0" Mar 14 08:52:45 crc kubenswrapper[4886]: I0314 08:52:45.017572 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcgts\" (UniqueName: \"kubernetes.io/projected/4d9f64aa-3e9d-422f-a81e-a22d00914728-kube-api-access-bcgts\") pod \"nova-api-0\" (UID: \"4d9f64aa-3e9d-422f-a81e-a22d00914728\") " pod="openstack/nova-api-0" Mar 14 08:52:45 crc kubenswrapper[4886]: I0314 08:52:45.017612 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d9f64aa-3e9d-422f-a81e-a22d00914728-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4d9f64aa-3e9d-422f-a81e-a22d00914728\") " pod="openstack/nova-api-0" Mar 
14 08:52:45 crc kubenswrapper[4886]: I0314 08:52:45.017632 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d9f64aa-3e9d-422f-a81e-a22d00914728-logs\") pod \"nova-api-0\" (UID: \"4d9f64aa-3e9d-422f-a81e-a22d00914728\") " pod="openstack/nova-api-0" Mar 14 08:52:45 crc kubenswrapper[4886]: I0314 08:52:45.017685 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d9f64aa-3e9d-422f-a81e-a22d00914728-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4d9f64aa-3e9d-422f-a81e-a22d00914728\") " pod="openstack/nova-api-0" Mar 14 08:52:45 crc kubenswrapper[4886]: I0314 08:52:45.017702 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d9f64aa-3e9d-422f-a81e-a22d00914728-config-data\") pod \"nova-api-0\" (UID: \"4d9f64aa-3e9d-422f-a81e-a22d00914728\") " pod="openstack/nova-api-0" Mar 14 08:52:45 crc kubenswrapper[4886]: I0314 08:52:45.119736 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d9f64aa-3e9d-422f-a81e-a22d00914728-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4d9f64aa-3e9d-422f-a81e-a22d00914728\") " pod="openstack/nova-api-0" Mar 14 08:52:45 crc kubenswrapper[4886]: I0314 08:52:45.119827 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d9f64aa-3e9d-422f-a81e-a22d00914728-config-data\") pod \"nova-api-0\" (UID: \"4d9f64aa-3e9d-422f-a81e-a22d00914728\") " pod="openstack/nova-api-0" Mar 14 08:52:45 crc kubenswrapper[4886]: I0314 08:52:45.120036 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4d9f64aa-3e9d-422f-a81e-a22d00914728-public-tls-certs\") pod \"nova-api-0\" (UID: \"4d9f64aa-3e9d-422f-a81e-a22d00914728\") " pod="openstack/nova-api-0" Mar 14 08:52:45 crc kubenswrapper[4886]: I0314 08:52:45.120188 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcgts\" (UniqueName: \"kubernetes.io/projected/4d9f64aa-3e9d-422f-a81e-a22d00914728-kube-api-access-bcgts\") pod \"nova-api-0\" (UID: \"4d9f64aa-3e9d-422f-a81e-a22d00914728\") " pod="openstack/nova-api-0" Mar 14 08:52:45 crc kubenswrapper[4886]: I0314 08:52:45.120290 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d9f64aa-3e9d-422f-a81e-a22d00914728-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4d9f64aa-3e9d-422f-a81e-a22d00914728\") " pod="openstack/nova-api-0" Mar 14 08:52:45 crc kubenswrapper[4886]: I0314 08:52:45.120345 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d9f64aa-3e9d-422f-a81e-a22d00914728-logs\") pod \"nova-api-0\" (UID: \"4d9f64aa-3e9d-422f-a81e-a22d00914728\") " pod="openstack/nova-api-0" Mar 14 08:52:45 crc kubenswrapper[4886]: I0314 08:52:45.121434 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d9f64aa-3e9d-422f-a81e-a22d00914728-logs\") pod \"nova-api-0\" (UID: \"4d9f64aa-3e9d-422f-a81e-a22d00914728\") " pod="openstack/nova-api-0" Mar 14 08:52:45 crc kubenswrapper[4886]: I0314 08:52:45.124573 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d9f64aa-3e9d-422f-a81e-a22d00914728-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4d9f64aa-3e9d-422f-a81e-a22d00914728\") " pod="openstack/nova-api-0" Mar 14 08:52:45 crc kubenswrapper[4886]: I0314 08:52:45.125093 4886 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d9f64aa-3e9d-422f-a81e-a22d00914728-public-tls-certs\") pod \"nova-api-0\" (UID: \"4d9f64aa-3e9d-422f-a81e-a22d00914728\") " pod="openstack/nova-api-0" Mar 14 08:52:45 crc kubenswrapper[4886]: I0314 08:52:45.128029 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d9f64aa-3e9d-422f-a81e-a22d00914728-config-data\") pod \"nova-api-0\" (UID: \"4d9f64aa-3e9d-422f-a81e-a22d00914728\") " pod="openstack/nova-api-0" Mar 14 08:52:45 crc kubenswrapper[4886]: I0314 08:52:45.136565 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d9f64aa-3e9d-422f-a81e-a22d00914728-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4d9f64aa-3e9d-422f-a81e-a22d00914728\") " pod="openstack/nova-api-0" Mar 14 08:52:45 crc kubenswrapper[4886]: I0314 08:52:45.140750 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcgts\" (UniqueName: \"kubernetes.io/projected/4d9f64aa-3e9d-422f-a81e-a22d00914728-kube-api-access-bcgts\") pod \"nova-api-0\" (UID: \"4d9f64aa-3e9d-422f-a81e-a22d00914728\") " pod="openstack/nova-api-0" Mar 14 08:52:45 crc kubenswrapper[4886]: I0314 08:52:45.176135 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 14 08:52:45 crc kubenswrapper[4886]: I0314 08:52:45.452817 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0" path="/var/lib/kubelet/pods/ff8f306c-569a-4bf1-b6bf-bea8d0cf55e0/volumes" Mar 14 08:52:45 crc kubenswrapper[4886]: I0314 08:52:45.618790 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 14 08:52:45 crc kubenswrapper[4886]: I0314 08:52:45.801024 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4d9f64aa-3e9d-422f-a81e-a22d00914728","Type":"ContainerStarted","Data":"3cb2a8babef32345248771c50456a3f5fbe3c67dce674431cb91aaa7bb047c76"} Mar 14 08:52:45 crc kubenswrapper[4886]: I0314 08:52:45.801400 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4d9f64aa-3e9d-422f-a81e-a22d00914728","Type":"ContainerStarted","Data":"47428d542c0df9211aaf2362d651fa8df6e9a1a73e202ef066474e7cda20d10d"} Mar 14 08:52:46 crc kubenswrapper[4886]: I0314 08:52:46.821303 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4d9f64aa-3e9d-422f-a81e-a22d00914728","Type":"ContainerStarted","Data":"e4ccbe8d23dfc121a82c292087c836df0f0c91f3476a1abc26152793287b5418"} Mar 14 08:52:46 crc kubenswrapper[4886]: I0314 08:52:46.848881 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.848860702 podStartE2EDuration="2.848860702s" podCreationTimestamp="2026-03-14 08:52:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:52:46.840854325 +0000 UTC m=+1502.089305982" watchObservedRunningTime="2026-03-14 08:52:46.848860702 +0000 UTC m=+1502.097312339" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.635761 4886 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.687622 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43a601c8-c71a-41d8-b877-75ef0e2ac892-logs\") pod \"43a601c8-c71a-41d8-b877-75ef0e2ac892\" (UID: \"43a601c8-c71a-41d8-b877-75ef0e2ac892\") " Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.687845 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl55j\" (UniqueName: \"kubernetes.io/projected/43a601c8-c71a-41d8-b877-75ef0e2ac892-kube-api-access-bl55j\") pod \"43a601c8-c71a-41d8-b877-75ef0e2ac892\" (UID: \"43a601c8-c71a-41d8-b877-75ef0e2ac892\") " Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.688171 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a601c8-c71a-41d8-b877-75ef0e2ac892-combined-ca-bundle\") pod \"43a601c8-c71a-41d8-b877-75ef0e2ac892\" (UID: \"43a601c8-c71a-41d8-b877-75ef0e2ac892\") " Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.688881 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43a601c8-c71a-41d8-b877-75ef0e2ac892-config-data\") pod \"43a601c8-c71a-41d8-b877-75ef0e2ac892\" (UID: \"43a601c8-c71a-41d8-b877-75ef0e2ac892\") " Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.688934 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/43a601c8-c71a-41d8-b877-75ef0e2ac892-nova-metadata-tls-certs\") pod \"43a601c8-c71a-41d8-b877-75ef0e2ac892\" (UID: \"43a601c8-c71a-41d8-b877-75ef0e2ac892\") " Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.697025 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/43a601c8-c71a-41d8-b877-75ef0e2ac892-logs" (OuterVolumeSpecName: "logs") pod "43a601c8-c71a-41d8-b877-75ef0e2ac892" (UID: "43a601c8-c71a-41d8-b877-75ef0e2ac892"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.704053 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43a601c8-c71a-41d8-b877-75ef0e2ac892-kube-api-access-bl55j" (OuterVolumeSpecName: "kube-api-access-bl55j") pod "43a601c8-c71a-41d8-b877-75ef0e2ac892" (UID: "43a601c8-c71a-41d8-b877-75ef0e2ac892"). InnerVolumeSpecName "kube-api-access-bl55j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.733046 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43a601c8-c71a-41d8-b877-75ef0e2ac892-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43a601c8-c71a-41d8-b877-75ef0e2ac892" (UID: "43a601c8-c71a-41d8-b877-75ef0e2ac892"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.754418 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43a601c8-c71a-41d8-b877-75ef0e2ac892-config-data" (OuterVolumeSpecName: "config-data") pod "43a601c8-c71a-41d8-b877-75ef0e2ac892" (UID: "43a601c8-c71a-41d8-b877-75ef0e2ac892"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.786895 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43a601c8-c71a-41d8-b877-75ef0e2ac892-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "43a601c8-c71a-41d8-b877-75ef0e2ac892" (UID: "43a601c8-c71a-41d8-b877-75ef0e2ac892"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.790752 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a601c8-c71a-41d8-b877-75ef0e2ac892-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.790779 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43a601c8-c71a-41d8-b877-75ef0e2ac892-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.790791 4886 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/43a601c8-c71a-41d8-b877-75ef0e2ac892-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.790803 4886 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43a601c8-c71a-41d8-b877-75ef0e2ac892-logs\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.790813 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl55j\" (UniqueName: \"kubernetes.io/projected/43a601c8-c71a-41d8-b877-75ef0e2ac892-kube-api-access-bl55j\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.838238 4886 generic.go:334] "Generic (PLEG): container finished" podID="43a601c8-c71a-41d8-b877-75ef0e2ac892" containerID="2a691de6262171fb9d7fce2e541695269898b84653e49f72e90e0a6a94fd3e0e" exitCode=0 Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.839404 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.839719 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"43a601c8-c71a-41d8-b877-75ef0e2ac892","Type":"ContainerDied","Data":"2a691de6262171fb9d7fce2e541695269898b84653e49f72e90e0a6a94fd3e0e"} Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.839779 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"43a601c8-c71a-41d8-b877-75ef0e2ac892","Type":"ContainerDied","Data":"e49bd7f35982f0381fa4937d05d7d3cb0bdf0ff3040262385767f29074333c24"} Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.839799 4886 scope.go:117] "RemoveContainer" containerID="2a691de6262171fb9d7fce2e541695269898b84653e49f72e90e0a6a94fd3e0e" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.882460 4886 scope.go:117] "RemoveContainer" containerID="a50341c7adca3532b22effb46bcacce9aed90f04e23384a5c2ed06b56bf4522f" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.890523 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.911296 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.911890 4886 scope.go:117] "RemoveContainer" containerID="2a691de6262171fb9d7fce2e541695269898b84653e49f72e90e0a6a94fd3e0e" Mar 14 08:52:47 crc kubenswrapper[4886]: E0314 08:52:47.912289 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a691de6262171fb9d7fce2e541695269898b84653e49f72e90e0a6a94fd3e0e\": container with ID starting with 2a691de6262171fb9d7fce2e541695269898b84653e49f72e90e0a6a94fd3e0e not found: ID does not exist" containerID="2a691de6262171fb9d7fce2e541695269898b84653e49f72e90e0a6a94fd3e0e" Mar 14 08:52:47 crc kubenswrapper[4886]: 
I0314 08:52:47.912319 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a691de6262171fb9d7fce2e541695269898b84653e49f72e90e0a6a94fd3e0e"} err="failed to get container status \"2a691de6262171fb9d7fce2e541695269898b84653e49f72e90e0a6a94fd3e0e\": rpc error: code = NotFound desc = could not find container \"2a691de6262171fb9d7fce2e541695269898b84653e49f72e90e0a6a94fd3e0e\": container with ID starting with 2a691de6262171fb9d7fce2e541695269898b84653e49f72e90e0a6a94fd3e0e not found: ID does not exist" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.912342 4886 scope.go:117] "RemoveContainer" containerID="a50341c7adca3532b22effb46bcacce9aed90f04e23384a5c2ed06b56bf4522f" Mar 14 08:52:47 crc kubenswrapper[4886]: E0314 08:52:47.912675 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a50341c7adca3532b22effb46bcacce9aed90f04e23384a5c2ed06b56bf4522f\": container with ID starting with a50341c7adca3532b22effb46bcacce9aed90f04e23384a5c2ed06b56bf4522f not found: ID does not exist" containerID="a50341c7adca3532b22effb46bcacce9aed90f04e23384a5c2ed06b56bf4522f" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.912731 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a50341c7adca3532b22effb46bcacce9aed90f04e23384a5c2ed06b56bf4522f"} err="failed to get container status \"a50341c7adca3532b22effb46bcacce9aed90f04e23384a5c2ed06b56bf4522f\": rpc error: code = NotFound desc = could not find container \"a50341c7adca3532b22effb46bcacce9aed90f04e23384a5c2ed06b56bf4522f\": container with ID starting with a50341c7adca3532b22effb46bcacce9aed90f04e23384a5c2ed06b56bf4522f not found: ID does not exist" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.928608 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 14 08:52:47 crc kubenswrapper[4886]: E0314 08:52:47.929152 
4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43a601c8-c71a-41d8-b877-75ef0e2ac892" containerName="nova-metadata-metadata" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.929176 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a601c8-c71a-41d8-b877-75ef0e2ac892" containerName="nova-metadata-metadata" Mar 14 08:52:47 crc kubenswrapper[4886]: E0314 08:52:47.929198 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43a601c8-c71a-41d8-b877-75ef0e2ac892" containerName="nova-metadata-log" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.929206 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a601c8-c71a-41d8-b877-75ef0e2ac892" containerName="nova-metadata-log" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.929461 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="43a601c8-c71a-41d8-b877-75ef0e2ac892" containerName="nova-metadata-metadata" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.929490 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="43a601c8-c71a-41d8-b877-75ef0e2ac892" containerName="nova-metadata-log" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.930616 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.936257 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.936452 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.940084 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.994514 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d907a09a-1258-4ed6-99a1-095050c8c378-config-data\") pod \"nova-metadata-0\" (UID: \"d907a09a-1258-4ed6-99a1-095050c8c378\") " pod="openstack/nova-metadata-0" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.994813 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d907a09a-1258-4ed6-99a1-095050c8c378-logs\") pod \"nova-metadata-0\" (UID: \"d907a09a-1258-4ed6-99a1-095050c8c378\") " pod="openstack/nova-metadata-0" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.994890 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bdmf\" (UniqueName: \"kubernetes.io/projected/d907a09a-1258-4ed6-99a1-095050c8c378-kube-api-access-5bdmf\") pod \"nova-metadata-0\" (UID: \"d907a09a-1258-4ed6-99a1-095050c8c378\") " pod="openstack/nova-metadata-0" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.995241 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d907a09a-1258-4ed6-99a1-095050c8c378-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"d907a09a-1258-4ed6-99a1-095050c8c378\") " pod="openstack/nova-metadata-0" Mar 14 08:52:47 crc kubenswrapper[4886]: I0314 08:52:47.995526 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d907a09a-1258-4ed6-99a1-095050c8c378-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d907a09a-1258-4ed6-99a1-095050c8c378\") " pod="openstack/nova-metadata-0" Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.098005 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d907a09a-1258-4ed6-99a1-095050c8c378-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d907a09a-1258-4ed6-99a1-095050c8c378\") " pod="openstack/nova-metadata-0" Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.098369 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d907a09a-1258-4ed6-99a1-095050c8c378-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d907a09a-1258-4ed6-99a1-095050c8c378\") " pod="openstack/nova-metadata-0" Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.098527 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d907a09a-1258-4ed6-99a1-095050c8c378-config-data\") pod \"nova-metadata-0\" (UID: \"d907a09a-1258-4ed6-99a1-095050c8c378\") " pod="openstack/nova-metadata-0" Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.098698 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d907a09a-1258-4ed6-99a1-095050c8c378-logs\") pod \"nova-metadata-0\" (UID: \"d907a09a-1258-4ed6-99a1-095050c8c378\") " pod="openstack/nova-metadata-0" Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.098902 4886 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bdmf\" (UniqueName: \"kubernetes.io/projected/d907a09a-1258-4ed6-99a1-095050c8c378-kube-api-access-5bdmf\") pod \"nova-metadata-0\" (UID: \"d907a09a-1258-4ed6-99a1-095050c8c378\") " pod="openstack/nova-metadata-0" Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.101503 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d907a09a-1258-4ed6-99a1-095050c8c378-logs\") pod \"nova-metadata-0\" (UID: \"d907a09a-1258-4ed6-99a1-095050c8c378\") " pod="openstack/nova-metadata-0" Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.104045 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d907a09a-1258-4ed6-99a1-095050c8c378-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d907a09a-1258-4ed6-99a1-095050c8c378\") " pod="openstack/nova-metadata-0" Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.104194 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d907a09a-1258-4ed6-99a1-095050c8c378-config-data\") pod \"nova-metadata-0\" (UID: \"d907a09a-1258-4ed6-99a1-095050c8c378\") " pod="openstack/nova-metadata-0" Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.104295 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d907a09a-1258-4ed6-99a1-095050c8c378-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d907a09a-1258-4ed6-99a1-095050c8c378\") " pod="openstack/nova-metadata-0" Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.114206 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bdmf\" (UniqueName: \"kubernetes.io/projected/d907a09a-1258-4ed6-99a1-095050c8c378-kube-api-access-5bdmf\") pod 
\"nova-metadata-0\" (UID: \"d907a09a-1258-4ed6-99a1-095050c8c378\") " pod="openstack/nova-metadata-0" Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.270237 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.472620 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.608872 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2brj\" (UniqueName: \"kubernetes.io/projected/0cc17f47-abd0-4f59-a626-d40a5a83f9cb-kube-api-access-q2brj\") pod \"0cc17f47-abd0-4f59-a626-d40a5a83f9cb\" (UID: \"0cc17f47-abd0-4f59-a626-d40a5a83f9cb\") " Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.608923 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cc17f47-abd0-4f59-a626-d40a5a83f9cb-config-data\") pod \"0cc17f47-abd0-4f59-a626-d40a5a83f9cb\" (UID: \"0cc17f47-abd0-4f59-a626-d40a5a83f9cb\") " Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.609201 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc17f47-abd0-4f59-a626-d40a5a83f9cb-combined-ca-bundle\") pod \"0cc17f47-abd0-4f59-a626-d40a5a83f9cb\" (UID: \"0cc17f47-abd0-4f59-a626-d40a5a83f9cb\") " Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.617292 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cc17f47-abd0-4f59-a626-d40a5a83f9cb-kube-api-access-q2brj" (OuterVolumeSpecName: "kube-api-access-q2brj") pod "0cc17f47-abd0-4f59-a626-d40a5a83f9cb" (UID: "0cc17f47-abd0-4f59-a626-d40a5a83f9cb"). InnerVolumeSpecName "kube-api-access-q2brj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.638693 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cc17f47-abd0-4f59-a626-d40a5a83f9cb-config-data" (OuterVolumeSpecName: "config-data") pod "0cc17f47-abd0-4f59-a626-d40a5a83f9cb" (UID: "0cc17f47-abd0-4f59-a626-d40a5a83f9cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.651805 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cc17f47-abd0-4f59-a626-d40a5a83f9cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0cc17f47-abd0-4f59-a626-d40a5a83f9cb" (UID: "0cc17f47-abd0-4f59-a626-d40a5a83f9cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.712911 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2brj\" (UniqueName: \"kubernetes.io/projected/0cc17f47-abd0-4f59-a626-d40a5a83f9cb-kube-api-access-q2brj\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.712946 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cc17f47-abd0-4f59-a626-d40a5a83f9cb-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.712963 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc17f47-abd0-4f59-a626-d40a5a83f9cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.724985 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.853676 4886 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-metadata-0" event={"ID":"d907a09a-1258-4ed6-99a1-095050c8c378","Type":"ContainerStarted","Data":"3735e48de5c891c0b30da857dd6e8998fef96e6f2a1f78401a23bb4ae8d89ed4"} Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.856008 4886 generic.go:334] "Generic (PLEG): container finished" podID="0cc17f47-abd0-4f59-a626-d40a5a83f9cb" containerID="32661803f2b7965cbfdae0e522cc7c8bdbc9353b1148aba3b6f1082eb3d21856" exitCode=0 Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.856048 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0cc17f47-abd0-4f59-a626-d40a5a83f9cb","Type":"ContainerDied","Data":"32661803f2b7965cbfdae0e522cc7c8bdbc9353b1148aba3b6f1082eb3d21856"} Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.856072 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0cc17f47-abd0-4f59-a626-d40a5a83f9cb","Type":"ContainerDied","Data":"342a598a112428f3a97b824cc1d76935741b30536283a75a907514acdb6175ce"} Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.856094 4886 scope.go:117] "RemoveContainer" containerID="32661803f2b7965cbfdae0e522cc7c8bdbc9353b1148aba3b6f1082eb3d21856" Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.856301 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.883859 4886 scope.go:117] "RemoveContainer" containerID="32661803f2b7965cbfdae0e522cc7c8bdbc9353b1148aba3b6f1082eb3d21856" Mar 14 08:52:48 crc kubenswrapper[4886]: E0314 08:52:48.886024 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32661803f2b7965cbfdae0e522cc7c8bdbc9353b1148aba3b6f1082eb3d21856\": container with ID starting with 32661803f2b7965cbfdae0e522cc7c8bdbc9353b1148aba3b6f1082eb3d21856 not found: ID does not exist" containerID="32661803f2b7965cbfdae0e522cc7c8bdbc9353b1148aba3b6f1082eb3d21856" Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.886064 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32661803f2b7965cbfdae0e522cc7c8bdbc9353b1148aba3b6f1082eb3d21856"} err="failed to get container status \"32661803f2b7965cbfdae0e522cc7c8bdbc9353b1148aba3b6f1082eb3d21856\": rpc error: code = NotFound desc = could not find container \"32661803f2b7965cbfdae0e522cc7c8bdbc9353b1148aba3b6f1082eb3d21856\": container with ID starting with 32661803f2b7965cbfdae0e522cc7c8bdbc9353b1148aba3b6f1082eb3d21856 not found: ID does not exist" Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.891457 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.903257 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.923985 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 08:52:48 crc kubenswrapper[4886]: E0314 08:52:48.924648 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc17f47-abd0-4f59-a626-d40a5a83f9cb" containerName="nova-scheduler-scheduler" Mar 14 08:52:48 crc 
kubenswrapper[4886]: I0314 08:52:48.924670 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc17f47-abd0-4f59-a626-d40a5a83f9cb" containerName="nova-scheduler-scheduler" Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.924946 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cc17f47-abd0-4f59-a626-d40a5a83f9cb" containerName="nova-scheduler-scheduler" Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.925793 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.928905 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 14 08:52:48 crc kubenswrapper[4886]: I0314 08:52:48.953387 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 08:52:49 crc kubenswrapper[4886]: I0314 08:52:49.018091 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh2d5\" (UniqueName: \"kubernetes.io/projected/488dc047-1241-4d80-bd37-245c32e9bdd2-kube-api-access-bh2d5\") pod \"nova-scheduler-0\" (UID: \"488dc047-1241-4d80-bd37-245c32e9bdd2\") " pod="openstack/nova-scheduler-0" Mar 14 08:52:49 crc kubenswrapper[4886]: I0314 08:52:49.018272 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/488dc047-1241-4d80-bd37-245c32e9bdd2-config-data\") pod \"nova-scheduler-0\" (UID: \"488dc047-1241-4d80-bd37-245c32e9bdd2\") " pod="openstack/nova-scheduler-0" Mar 14 08:52:49 crc kubenswrapper[4886]: I0314 08:52:49.018441 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/488dc047-1241-4d80-bd37-245c32e9bdd2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"488dc047-1241-4d80-bd37-245c32e9bdd2\") " pod="openstack/nova-scheduler-0" Mar 14 08:52:49 crc kubenswrapper[4886]: I0314 08:52:49.119837 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh2d5\" (UniqueName: \"kubernetes.io/projected/488dc047-1241-4d80-bd37-245c32e9bdd2-kube-api-access-bh2d5\") pod \"nova-scheduler-0\" (UID: \"488dc047-1241-4d80-bd37-245c32e9bdd2\") " pod="openstack/nova-scheduler-0" Mar 14 08:52:49 crc kubenswrapper[4886]: I0314 08:52:49.119923 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/488dc047-1241-4d80-bd37-245c32e9bdd2-config-data\") pod \"nova-scheduler-0\" (UID: \"488dc047-1241-4d80-bd37-245c32e9bdd2\") " pod="openstack/nova-scheduler-0" Mar 14 08:52:49 crc kubenswrapper[4886]: I0314 08:52:49.120003 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/488dc047-1241-4d80-bd37-245c32e9bdd2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"488dc047-1241-4d80-bd37-245c32e9bdd2\") " pod="openstack/nova-scheduler-0" Mar 14 08:52:49 crc kubenswrapper[4886]: I0314 08:52:49.123324 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/488dc047-1241-4d80-bd37-245c32e9bdd2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"488dc047-1241-4d80-bd37-245c32e9bdd2\") " pod="openstack/nova-scheduler-0" Mar 14 08:52:49 crc kubenswrapper[4886]: I0314 08:52:49.124006 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/488dc047-1241-4d80-bd37-245c32e9bdd2-config-data\") pod \"nova-scheduler-0\" (UID: \"488dc047-1241-4d80-bd37-245c32e9bdd2\") " pod="openstack/nova-scheduler-0" Mar 14 08:52:49 crc kubenswrapper[4886]: I0314 08:52:49.137620 4886 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bh2d5\" (UniqueName: \"kubernetes.io/projected/488dc047-1241-4d80-bd37-245c32e9bdd2-kube-api-access-bh2d5\") pod \"nova-scheduler-0\" (UID: \"488dc047-1241-4d80-bd37-245c32e9bdd2\") " pod="openstack/nova-scheduler-0" Mar 14 08:52:49 crc kubenswrapper[4886]: I0314 08:52:49.261845 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 14 08:52:49 crc kubenswrapper[4886]: I0314 08:52:49.436457 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cc17f47-abd0-4f59-a626-d40a5a83f9cb" path="/var/lib/kubelet/pods/0cc17f47-abd0-4f59-a626-d40a5a83f9cb/volumes" Mar 14 08:52:49 crc kubenswrapper[4886]: I0314 08:52:49.437458 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43a601c8-c71a-41d8-b877-75ef0e2ac892" path="/var/lib/kubelet/pods/43a601c8-c71a-41d8-b877-75ef0e2ac892/volumes" Mar 14 08:52:49 crc kubenswrapper[4886]: I0314 08:52:49.702136 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 14 08:52:49 crc kubenswrapper[4886]: I0314 08:52:49.865985 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"488dc047-1241-4d80-bd37-245c32e9bdd2","Type":"ContainerStarted","Data":"f3520a20fbe9acf9b1643c599657bf753652d31eaea3bc2ff6d098f8f6294808"} Mar 14 08:52:49 crc kubenswrapper[4886]: I0314 08:52:49.868020 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d907a09a-1258-4ed6-99a1-095050c8c378","Type":"ContainerStarted","Data":"adaca51c64a5e636f6ce6f4634ce82f962a313a84381b05c26f0442f930bbe2b"} Mar 14 08:52:49 crc kubenswrapper[4886]: I0314 08:52:49.868048 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"d907a09a-1258-4ed6-99a1-095050c8c378","Type":"ContainerStarted","Data":"c4285286413d748663c604e98c8d274832995e81eae59f3a5aa6b7041cdf96f1"} Mar 14 08:52:49 crc kubenswrapper[4886]: I0314 08:52:49.894754 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.894734595 podStartE2EDuration="2.894734595s" podCreationTimestamp="2026-03-14 08:52:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:52:49.888455597 +0000 UTC m=+1505.136907234" watchObservedRunningTime="2026-03-14 08:52:49.894734595 +0000 UTC m=+1505.143186232" Mar 14 08:52:50 crc kubenswrapper[4886]: I0314 08:52:50.897352 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"488dc047-1241-4d80-bd37-245c32e9bdd2","Type":"ContainerStarted","Data":"09ea26283ddd543b9d6cd786e5ce2ff80820f1a193d4ac63e9c64dfe629f0be4"} Mar 14 08:52:50 crc kubenswrapper[4886]: I0314 08:52:50.922273 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.922252587 podStartE2EDuration="2.922252587s" podCreationTimestamp="2026-03-14 08:52:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:52:50.918207352 +0000 UTC m=+1506.166658999" watchObservedRunningTime="2026-03-14 08:52:50.922252587 +0000 UTC m=+1506.170704224" Mar 14 08:52:54 crc kubenswrapper[4886]: I0314 08:52:54.262444 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 14 08:52:55 crc kubenswrapper[4886]: I0314 08:52:55.177104 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 08:52:55 crc kubenswrapper[4886]: I0314 08:52:55.177205 4886 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 14 08:52:56 crc kubenswrapper[4886]: I0314 08:52:56.066260 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 08:52:56 crc kubenswrapper[4886]: I0314 08:52:56.066612 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 08:52:56 crc kubenswrapper[4886]: I0314 08:52:56.199426 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4d9f64aa-3e9d-422f-a81e-a22d00914728" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.235:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 08:52:56 crc kubenswrapper[4886]: I0314 08:52:56.199465 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4d9f64aa-3e9d-422f-a81e-a22d00914728" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.235:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 08:52:58 crc kubenswrapper[4886]: I0314 08:52:58.270498 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 14 08:52:58 crc kubenswrapper[4886]: I0314 08:52:58.270901 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 14 08:52:59 crc kubenswrapper[4886]: I0314 08:52:59.262994 4886 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 14 08:52:59 crc kubenswrapper[4886]: I0314 08:52:59.297376 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d907a09a-1258-4ed6-99a1-095050c8c378" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.236:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 08:52:59 crc kubenswrapper[4886]: I0314 08:52:59.297449 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d907a09a-1258-4ed6-99a1-095050c8c378" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.236:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 14 08:52:59 crc kubenswrapper[4886]: I0314 08:52:59.305930 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 14 08:53:00 crc kubenswrapper[4886]: I0314 08:53:00.005860 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 14 08:53:01 crc kubenswrapper[4886]: I0314 08:53:01.029063 4886 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 14 08:53:03 crc kubenswrapper[4886]: I0314 08:53:03.177165 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 14 08:53:03 crc kubenswrapper[4886]: I0314 08:53:03.177550 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 14 08:53:03 crc kubenswrapper[4886]: E0314 08:53:03.311646 4886 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat 
"/var/lib/containers/storage/overlay/10c31567b7e26443349f74d793a31cbff7a5a14959855083f0af6a10a68b7f4c/diff" to get inode usage: stat /var/lib/containers/storage/overlay/10c31567b7e26443349f74d793a31cbff7a5a14959855083f0af6a10a68b7f4c/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_nova-metadata-0_43a601c8-c71a-41d8-b877-75ef0e2ac892/nova-metadata-metadata/0.log" to get inode usage: stat /var/log/pods/openstack_nova-metadata-0_43a601c8-c71a-41d8-b877-75ef0e2ac892/nova-metadata-metadata/0.log: no such file or directory Mar 14 08:53:04 crc kubenswrapper[4886]: E0314 08:53:04.333317 4886 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/01ad62b9be7b96cee1fd9232c394f9b803780316a8fa188a3e0f2ffdeabf9b17/diff" to get inode usage: stat /var/lib/containers/storage/overlay/01ad62b9be7b96cee1fd9232c394f9b803780316a8fa188a3e0f2ffdeabf9b17/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_nova-scheduler-0_0cc17f47-abd0-4f59-a626-d40a5a83f9cb/nova-scheduler-scheduler/0.log" to get inode usage: stat /var/log/pods/openstack_nova-scheduler-0_0cc17f47-abd0-4f59-a626-d40a5a83f9cb/nova-scheduler-scheduler/0.log: no such file or directory Mar 14 08:53:05 crc kubenswrapper[4886]: I0314 08:53:05.182725 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 14 08:53:05 crc kubenswrapper[4886]: I0314 08:53:05.183466 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 14 08:53:05 crc kubenswrapper[4886]: I0314 08:53:05.188684 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.031634 4886 generic.go:334] "Generic (PLEG): container finished" podID="6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" 
containerID="299e8007ec0ea2617cb03f2e9e5ec10edbfe3ca69ac94163cc4da94555aac5a6" exitCode=137 Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.033289 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b","Type":"ContainerDied","Data":"299e8007ec0ea2617cb03f2e9e5ec10edbfe3ca69ac94163cc4da94555aac5a6"} Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.033320 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b","Type":"ContainerDied","Data":"b0ba87029f8ac7738246c892aa53908f85abdd84dee24fd8c0216c1578f5b2b2"} Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.033330 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0ba87029f8ac7738246c892aa53908f85abdd84dee24fd8c0216c1578f5b2b2" Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.038538 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.140160 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.191586 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-log-httpd\") pod \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.191636 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-sg-core-conf-yaml\") pod \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.191673 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-run-httpd\") pod \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.191720 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-scripts\") pod \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.191810 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-ceilometer-tls-certs\") pod \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.191871 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn4t8\" (UniqueName: 
\"kubernetes.io/projected/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-kube-api-access-bn4t8\") pod \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.191895 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-config-data\") pod \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.191972 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-combined-ca-bundle\") pod \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\" (UID: \"6dbe3aa4-6aa6-43bd-89ec-5e549667b02b\") " Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.192073 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" (UID: "6dbe3aa4-6aa6-43bd-89ec-5e549667b02b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.192631 4886 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.199443 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-scripts" (OuterVolumeSpecName: "scripts") pod "6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" (UID: "6dbe3aa4-6aa6-43bd-89ec-5e549667b02b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.200176 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" (UID: "6dbe3aa4-6aa6-43bd-89ec-5e549667b02b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.202578 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-kube-api-access-bn4t8" (OuterVolumeSpecName: "kube-api-access-bn4t8") pod "6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" (UID: "6dbe3aa4-6aa6-43bd-89ec-5e549667b02b"). InnerVolumeSpecName "kube-api-access-bn4t8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.236267 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" (UID: "6dbe3aa4-6aa6-43bd-89ec-5e549667b02b"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.270718 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.270794 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.289379 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" (UID: "6dbe3aa4-6aa6-43bd-89ec-5e549667b02b"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.290600 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" (UID: "6dbe3aa4-6aa6-43bd-89ec-5e549667b02b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.294500 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn4t8\" (UniqueName: \"kubernetes.io/projected/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-kube-api-access-bn4t8\") on node \"crc\" DevicePath \"\""
Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.294522 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.294554 4886 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.294564 4886 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.294572 4886 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-scripts\") on node \"crc\" DevicePath \"\""
Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.294581 4886 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.339548 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-config-data" (OuterVolumeSpecName: "config-data") pod "6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" (UID: "6dbe3aa4-6aa6-43bd-89ec-5e549667b02b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 14 08:53:06 crc kubenswrapper[4886]: I0314 08:53:06.398218 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b-config-data\") on node \"crc\" DevicePath \"\""
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.040880 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.079310 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.092665 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.104397 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 14 08:53:07 crc kubenswrapper[4886]: E0314 08:53:07.105328 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" containerName="ceilometer-central-agent"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.105399 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" containerName="ceilometer-central-agent"
Mar 14 08:53:07 crc kubenswrapper[4886]: E0314 08:53:07.105473 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" containerName="ceilometer-notification-agent"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.105532 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" containerName="ceilometer-notification-agent"
Mar 14 08:53:07 crc kubenswrapper[4886]: E0314 08:53:07.105586 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" containerName="sg-core"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.105634 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" containerName="sg-core"
Mar 14 08:53:07 crc kubenswrapper[4886]: E0314 08:53:07.105705 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" containerName="proxy-httpd"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.105753 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" containerName="proxy-httpd"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.106041 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" containerName="ceilometer-notification-agent"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.106115 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" containerName="proxy-httpd"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.106198 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" containerName="sg-core"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.106266 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" containerName="ceilometer-central-agent"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.108427 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.111575 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.111818 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.116466 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.125807 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.216641 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e173618-1706-4fef-937c-a6e2a1d5eb30-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2e173618-1706-4fef-937c-a6e2a1d5eb30\") " pod="openstack/ceilometer-0"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.216700 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e173618-1706-4fef-937c-a6e2a1d5eb30-run-httpd\") pod \"ceilometer-0\" (UID: \"2e173618-1706-4fef-937c-a6e2a1d5eb30\") " pod="openstack/ceilometer-0"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.216743 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e173618-1706-4fef-937c-a6e2a1d5eb30-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2e173618-1706-4fef-937c-a6e2a1d5eb30\") " pod="openstack/ceilometer-0"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.216934 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e173618-1706-4fef-937c-a6e2a1d5eb30-log-httpd\") pod \"ceilometer-0\" (UID: \"2e173618-1706-4fef-937c-a6e2a1d5eb30\") " pod="openstack/ceilometer-0"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.217517 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46jsp\" (UniqueName: \"kubernetes.io/projected/2e173618-1706-4fef-937c-a6e2a1d5eb30-kube-api-access-46jsp\") pod \"ceilometer-0\" (UID: \"2e173618-1706-4fef-937c-a6e2a1d5eb30\") " pod="openstack/ceilometer-0"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.217608 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e173618-1706-4fef-937c-a6e2a1d5eb30-scripts\") pod \"ceilometer-0\" (UID: \"2e173618-1706-4fef-937c-a6e2a1d5eb30\") " pod="openstack/ceilometer-0"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.217859 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e173618-1706-4fef-937c-a6e2a1d5eb30-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2e173618-1706-4fef-937c-a6e2a1d5eb30\") " pod="openstack/ceilometer-0"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.218160 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e173618-1706-4fef-937c-a6e2a1d5eb30-config-data\") pod \"ceilometer-0\" (UID: \"2e173618-1706-4fef-937c-a6e2a1d5eb30\") " pod="openstack/ceilometer-0"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.319782 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46jsp\" (UniqueName: \"kubernetes.io/projected/2e173618-1706-4fef-937c-a6e2a1d5eb30-kube-api-access-46jsp\") pod \"ceilometer-0\" (UID: \"2e173618-1706-4fef-937c-a6e2a1d5eb30\") " pod="openstack/ceilometer-0"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.319842 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e173618-1706-4fef-937c-a6e2a1d5eb30-scripts\") pod \"ceilometer-0\" (UID: \"2e173618-1706-4fef-937c-a6e2a1d5eb30\") " pod="openstack/ceilometer-0"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.319889 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e173618-1706-4fef-937c-a6e2a1d5eb30-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2e173618-1706-4fef-937c-a6e2a1d5eb30\") " pod="openstack/ceilometer-0"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.319940 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e173618-1706-4fef-937c-a6e2a1d5eb30-config-data\") pod \"ceilometer-0\" (UID: \"2e173618-1706-4fef-937c-a6e2a1d5eb30\") " pod="openstack/ceilometer-0"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.320006 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e173618-1706-4fef-937c-a6e2a1d5eb30-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2e173618-1706-4fef-937c-a6e2a1d5eb30\") " pod="openstack/ceilometer-0"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.320046 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e173618-1706-4fef-937c-a6e2a1d5eb30-run-httpd\") pod \"ceilometer-0\" (UID: \"2e173618-1706-4fef-937c-a6e2a1d5eb30\") " pod="openstack/ceilometer-0"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.320082 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e173618-1706-4fef-937c-a6e2a1d5eb30-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2e173618-1706-4fef-937c-a6e2a1d5eb30\") " pod="openstack/ceilometer-0"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.320107 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e173618-1706-4fef-937c-a6e2a1d5eb30-log-httpd\") pod \"ceilometer-0\" (UID: \"2e173618-1706-4fef-937c-a6e2a1d5eb30\") " pod="openstack/ceilometer-0"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.320675 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e173618-1706-4fef-937c-a6e2a1d5eb30-log-httpd\") pod \"ceilometer-0\" (UID: \"2e173618-1706-4fef-937c-a6e2a1d5eb30\") " pod="openstack/ceilometer-0"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.321032 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e173618-1706-4fef-937c-a6e2a1d5eb30-run-httpd\") pod \"ceilometer-0\" (UID: \"2e173618-1706-4fef-937c-a6e2a1d5eb30\") " pod="openstack/ceilometer-0"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.324890 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e173618-1706-4fef-937c-a6e2a1d5eb30-scripts\") pod \"ceilometer-0\" (UID: \"2e173618-1706-4fef-937c-a6e2a1d5eb30\") " pod="openstack/ceilometer-0"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.325144 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e173618-1706-4fef-937c-a6e2a1d5eb30-config-data\") pod \"ceilometer-0\" (UID: \"2e173618-1706-4fef-937c-a6e2a1d5eb30\") " pod="openstack/ceilometer-0"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.329164 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e173618-1706-4fef-937c-a6e2a1d5eb30-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2e173618-1706-4fef-937c-a6e2a1d5eb30\") " pod="openstack/ceilometer-0"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.329500 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e173618-1706-4fef-937c-a6e2a1d5eb30-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2e173618-1706-4fef-937c-a6e2a1d5eb30\") " pod="openstack/ceilometer-0"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.331451 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e173618-1706-4fef-937c-a6e2a1d5eb30-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2e173618-1706-4fef-937c-a6e2a1d5eb30\") " pod="openstack/ceilometer-0"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.340635 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46jsp\" (UniqueName: \"kubernetes.io/projected/2e173618-1706-4fef-937c-a6e2a1d5eb30-kube-api-access-46jsp\") pod \"ceilometer-0\" (UID: \"2e173618-1706-4fef-937c-a6e2a1d5eb30\") " pod="openstack/ceilometer-0"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.429358 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.433447 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dbe3aa4-6aa6-43bd-89ec-5e549667b02b" path="/var/lib/kubelet/pods/6dbe3aa4-6aa6-43bd-89ec-5e549667b02b/volumes"
Mar 14 08:53:07 crc kubenswrapper[4886]: I0314 08:53:07.859622 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 14 08:53:07 crc kubenswrapper[4886]: W0314 08:53:07.860777 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e173618_1706_4fef_937c_a6e2a1d5eb30.slice/crio-c0e4a42cb2385be6ea88a3af3002916406fa4c4bdacc38f938f084f19548cd99 WatchSource:0}: Error finding container c0e4a42cb2385be6ea88a3af3002916406fa4c4bdacc38f938f084f19548cd99: Status 404 returned error can't find the container with id c0e4a42cb2385be6ea88a3af3002916406fa4c4bdacc38f938f084f19548cd99
Mar 14 08:53:08 crc kubenswrapper[4886]: I0314 08:53:08.049730 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e173618-1706-4fef-937c-a6e2a1d5eb30","Type":"ContainerStarted","Data":"c0e4a42cb2385be6ea88a3af3002916406fa4c4bdacc38f938f084f19548cd99"}
Mar 14 08:53:08 crc kubenswrapper[4886]: I0314 08:53:08.275615 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 14 08:53:08 crc kubenswrapper[4886]: I0314 08:53:08.279017 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 14 08:53:08 crc kubenswrapper[4886]: I0314 08:53:08.281922 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 14 08:53:08 crc kubenswrapper[4886]: I0314 08:53:08.580939 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 14 08:53:09 crc kubenswrapper[4886]: I0314 08:53:09.094526 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e173618-1706-4fef-937c-a6e2a1d5eb30","Type":"ContainerStarted","Data":"ce1958510962bdf1eae7bf25e9764169ada4e499edc1a799354d1f474903c5dc"}
Mar 14 08:53:09 crc kubenswrapper[4886]: I0314 08:53:09.119669 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 14 08:53:10 crc kubenswrapper[4886]: I0314 08:53:10.102812 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e173618-1706-4fef-937c-a6e2a1d5eb30","Type":"ContainerStarted","Data":"d86117ea276aa589cf554fc23d3590f4327d64d2619d64fa4079653d9092c015"}
Mar 14 08:53:10 crc kubenswrapper[4886]: I0314 08:53:10.103437 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e173618-1706-4fef-937c-a6e2a1d5eb30","Type":"ContainerStarted","Data":"4ae5655e594782de0ced0387026cb8971582cefe1e0371eccee6cc2d29bb077f"}
Mar 14 08:53:12 crc kubenswrapper[4886]: I0314 08:53:12.129883 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e173618-1706-4fef-937c-a6e2a1d5eb30","Type":"ContainerStarted","Data":"756ae2a82b1e21f6297047122a5df69531f6dc7743ec1344aebd9160eedd7ccd"}
Mar 14 08:53:12 crc kubenswrapper[4886]: I0314 08:53:12.130434 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 14 08:53:12 crc kubenswrapper[4886]: I0314 08:53:12.163951 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.52038552 podStartE2EDuration="5.163930201s" podCreationTimestamp="2026-03-14 08:53:07 +0000 UTC" firstStartedPulling="2026-03-14 08:53:07.862642597 +0000 UTC m=+1523.111094234" lastFinishedPulling="2026-03-14 08:53:11.506187278 +0000 UTC m=+1526.754638915" observedRunningTime="2026-03-14 08:53:12.161811931 +0000 UTC m=+1527.410263588" watchObservedRunningTime="2026-03-14 08:53:12.163930201 +0000 UTC m=+1527.412381848"
Mar 14 08:53:26 crc kubenswrapper[4886]: I0314 08:53:26.065720 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 08:53:26 crc kubenswrapper[4886]: I0314 08:53:26.066226 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 08:53:37 crc kubenswrapper[4886]: I0314 08:53:37.438284 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 14 08:53:42 crc kubenswrapper[4886]: I0314 08:53:42.474286 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-knccv"]
Mar 14 08:53:42 crc kubenswrapper[4886]: I0314 08:53:42.478682 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-knccv"
Mar 14 08:53:42 crc kubenswrapper[4886]: I0314 08:53:42.492551 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-knccv"]
Mar 14 08:53:42 crc kubenswrapper[4886]: I0314 08:53:42.632896 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a54d38fd-6ef5-492e-b79d-3171c30d46e6-utilities\") pod \"certified-operators-knccv\" (UID: \"a54d38fd-6ef5-492e-b79d-3171c30d46e6\") " pod="openshift-marketplace/certified-operators-knccv"
Mar 14 08:53:42 crc kubenswrapper[4886]: I0314 08:53:42.633654 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a54d38fd-6ef5-492e-b79d-3171c30d46e6-catalog-content\") pod \"certified-operators-knccv\" (UID: \"a54d38fd-6ef5-492e-b79d-3171c30d46e6\") " pod="openshift-marketplace/certified-operators-knccv"
Mar 14 08:53:42 crc kubenswrapper[4886]: I0314 08:53:42.633737 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb4wx\" (UniqueName: \"kubernetes.io/projected/a54d38fd-6ef5-492e-b79d-3171c30d46e6-kube-api-access-vb4wx\") pod \"certified-operators-knccv\" (UID: \"a54d38fd-6ef5-492e-b79d-3171c30d46e6\") " pod="openshift-marketplace/certified-operators-knccv"
Mar 14 08:53:42 crc kubenswrapper[4886]: I0314 08:53:42.736806 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a54d38fd-6ef5-492e-b79d-3171c30d46e6-catalog-content\") pod \"certified-operators-knccv\" (UID: \"a54d38fd-6ef5-492e-b79d-3171c30d46e6\") " pod="openshift-marketplace/certified-operators-knccv"
Mar 14 08:53:42 crc kubenswrapper[4886]: I0314 08:53:42.736940 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb4wx\" (UniqueName: \"kubernetes.io/projected/a54d38fd-6ef5-492e-b79d-3171c30d46e6-kube-api-access-vb4wx\") pod \"certified-operators-knccv\" (UID: \"a54d38fd-6ef5-492e-b79d-3171c30d46e6\") " pod="openshift-marketplace/certified-operators-knccv"
Mar 14 08:53:42 crc kubenswrapper[4886]: I0314 08:53:42.737351 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a54d38fd-6ef5-492e-b79d-3171c30d46e6-utilities\") pod \"certified-operators-knccv\" (UID: \"a54d38fd-6ef5-492e-b79d-3171c30d46e6\") " pod="openshift-marketplace/certified-operators-knccv"
Mar 14 08:53:42 crc kubenswrapper[4886]: I0314 08:53:42.737888 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a54d38fd-6ef5-492e-b79d-3171c30d46e6-catalog-content\") pod \"certified-operators-knccv\" (UID: \"a54d38fd-6ef5-492e-b79d-3171c30d46e6\") " pod="openshift-marketplace/certified-operators-knccv"
Mar 14 08:53:42 crc kubenswrapper[4886]: I0314 08:53:42.738066 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a54d38fd-6ef5-492e-b79d-3171c30d46e6-utilities\") pod \"certified-operators-knccv\" (UID: \"a54d38fd-6ef5-492e-b79d-3171c30d46e6\") " pod="openshift-marketplace/certified-operators-knccv"
Mar 14 08:53:42 crc kubenswrapper[4886]: I0314 08:53:42.756546 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb4wx\" (UniqueName: \"kubernetes.io/projected/a54d38fd-6ef5-492e-b79d-3171c30d46e6-kube-api-access-vb4wx\") pod \"certified-operators-knccv\" (UID: \"a54d38fd-6ef5-492e-b79d-3171c30d46e6\") " pod="openshift-marketplace/certified-operators-knccv"
Mar 14 08:53:42 crc kubenswrapper[4886]: I0314 08:53:42.840322 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-knccv"
Mar 14 08:53:43 crc kubenswrapper[4886]: I0314 08:53:43.307529 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-knccv"]
Mar 14 08:53:43 crc kubenswrapper[4886]: I0314 08:53:43.478407 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-knccv" event={"ID":"a54d38fd-6ef5-492e-b79d-3171c30d46e6","Type":"ContainerStarted","Data":"41bf2e7e3cd3e39a1be733fec17111d36961e960ad4e7c7405b54bffc30aba63"}
Mar 14 08:53:44 crc kubenswrapper[4886]: I0314 08:53:44.487899 4886 generic.go:334] "Generic (PLEG): container finished" podID="a54d38fd-6ef5-492e-b79d-3171c30d46e6" containerID="6de84b62ab4d97bcba4bfd8e2d102a74b325c871bcbb399a96c0278eb3658e22" exitCode=0
Mar 14 08:53:44 crc kubenswrapper[4886]: I0314 08:53:44.488012 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-knccv" event={"ID":"a54d38fd-6ef5-492e-b79d-3171c30d46e6","Type":"ContainerDied","Data":"6de84b62ab4d97bcba4bfd8e2d102a74b325c871bcbb399a96c0278eb3658e22"}
Mar 14 08:53:45 crc kubenswrapper[4886]: I0314 08:53:45.505413 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-knccv" event={"ID":"a54d38fd-6ef5-492e-b79d-3171c30d46e6","Type":"ContainerStarted","Data":"bf7b2d01bebc48626522d1ec283ab261a629e642665dc9b322d9179acd297d7f"}
Mar 14 08:53:46 crc kubenswrapper[4886]: I0314 08:53:46.497601 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 14 08:53:46 crc kubenswrapper[4886]: I0314 08:53:46.517930 4886 generic.go:334] "Generic (PLEG): container finished" podID="a54d38fd-6ef5-492e-b79d-3171c30d46e6" containerID="bf7b2d01bebc48626522d1ec283ab261a629e642665dc9b322d9179acd297d7f" exitCode=0
Mar 14 08:53:46 crc kubenswrapper[4886]: I0314 08:53:46.517977 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-knccv" event={"ID":"a54d38fd-6ef5-492e-b79d-3171c30d46e6","Type":"ContainerDied","Data":"bf7b2d01bebc48626522d1ec283ab261a629e642665dc9b322d9179acd297d7f"}
Mar 14 08:53:47 crc kubenswrapper[4886]: I0314 08:53:47.528312 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-knccv" event={"ID":"a54d38fd-6ef5-492e-b79d-3171c30d46e6","Type":"ContainerStarted","Data":"5989e3c8fd08e8d1ad4a4cd088db43f119e6690d7f04434994da586ee29720c9"}
Mar 14 08:53:47 crc kubenswrapper[4886]: I0314 08:53:47.548349 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-knccv" podStartSLOduration=3.024111791 podStartE2EDuration="5.548326093s" podCreationTimestamp="2026-03-14 08:53:42 +0000 UTC" firstStartedPulling="2026-03-14 08:53:44.490346697 +0000 UTC m=+1559.738798334" lastFinishedPulling="2026-03-14 08:53:47.014560999 +0000 UTC m=+1562.263012636" observedRunningTime="2026-03-14 08:53:47.54503843 +0000 UTC m=+1562.793490067" watchObservedRunningTime="2026-03-14 08:53:47.548326093 +0000 UTC m=+1562.796777740"
Mar 14 08:53:47 crc kubenswrapper[4886]: I0314 08:53:47.760052 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 14 08:53:50 crc kubenswrapper[4886]: I0314 08:53:50.694349 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="c08d9078-9b3a-492a-92db-3096453d49f8" containerName="rabbitmq" containerID="cri-o://93214e36e11f1eb8a0d13f7a772aeb46c4404d3e9853793a18721e8ec07b83d0" gracePeriod=604796
Mar 14 08:53:51 crc kubenswrapper[4886]: I0314 08:53:51.704093 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="68bf3729-3dcf-4881-814b-b6af3060336e" containerName="rabbitmq" containerID="cri-o://c44e45b8deb2b13c261bfa772151bb46377f4d546802bf618451e57dff7b084e" gracePeriod=604797
Mar 14 08:53:52 crc kubenswrapper[4886]: I0314 08:53:52.841061 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-knccv"
Mar 14 08:53:52 crc kubenswrapper[4886]: I0314 08:53:52.841455 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-knccv"
Mar 14 08:53:53 crc kubenswrapper[4886]: I0314 08:53:53.892459 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-knccv" podUID="a54d38fd-6ef5-492e-b79d-3171c30d46e6" containerName="registry-server" probeResult="failure" output=<
Mar 14 08:53:53 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s
Mar 14 08:53:53 crc kubenswrapper[4886]: >
Mar 14 08:53:56 crc kubenswrapper[4886]: I0314 08:53:56.066264 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 08:53:56 crc kubenswrapper[4886]: I0314 08:53:56.066530 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 08:53:56 crc kubenswrapper[4886]: I0314 08:53:56.066572 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ddctv"
Mar 14 08:53:56 crc kubenswrapper[4886]: I0314 08:53:56.067295 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc"} pod="openshift-machine-config-operator/machine-config-daemon-ddctv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 14 08:53:56 crc kubenswrapper[4886]: I0314 08:53:56.067344 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" containerID="cri-o://7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc" gracePeriod=600
Mar 14 08:53:56 crc kubenswrapper[4886]: E0314 08:53:56.194256 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79"
Mar 14 08:53:56 crc kubenswrapper[4886]: I0314 08:53:56.629571 4886 generic.go:334] "Generic (PLEG): container finished" podID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerID="7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc" exitCode=0
Mar 14 08:53:56 crc kubenswrapper[4886]: I0314 08:53:56.629623 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerDied","Data":"7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc"}
Mar 14 08:53:56 crc kubenswrapper[4886]: I0314 08:53:56.629666 4886 scope.go:117] "RemoveContainer" containerID="55e54dc7fbdd549134d3b20dfd9642dda565b3dd8cfe4e3b853534c01d92f8db"
Mar 14 08:53:56 crc kubenswrapper[4886]: I0314 08:53:56.630411 4886 scope.go:117] "RemoveContainer" containerID="7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc"
Mar 14 08:53:56 crc kubenswrapper[4886]: E0314 08:53:56.630782 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79"
Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.246768 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.424976 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c08d9078-9b3a-492a-92db-3096453d49f8-server-conf\") pod \"c08d9078-9b3a-492a-92db-3096453d49f8\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") "
Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.425266 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c08d9078-9b3a-492a-92db-3096453d49f8-rabbitmq-confd\") pod \"c08d9078-9b3a-492a-92db-3096453d49f8\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") "
Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.425359 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c08d9078-9b3a-492a-92db-3096453d49f8-rabbitmq-tls\") pod \"c08d9078-9b3a-492a-92db-3096453d49f8\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") "
Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.425482 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plz57\" (UniqueName: \"kubernetes.io/projected/c08d9078-9b3a-492a-92db-3096453d49f8-kube-api-access-plz57\") pod \"c08d9078-9b3a-492a-92db-3096453d49f8\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") "
Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.425882 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c08d9078-9b3a-492a-92db-3096453d49f8-plugins-conf\") pod \"c08d9078-9b3a-492a-92db-3096453d49f8\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") "
Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.425953 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c08d9078-9b3a-492a-92db-3096453d49f8-rabbitmq-erlang-cookie\") pod \"c08d9078-9b3a-492a-92db-3096453d49f8\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") "
Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.426000 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c08d9078-9b3a-492a-92db-3096453d49f8-erlang-cookie-secret\") pod \"c08d9078-9b3a-492a-92db-3096453d49f8\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") "
Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.426029 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"c08d9078-9b3a-492a-92db-3096453d49f8\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") "
Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.426074 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c08d9078-9b3a-492a-92db-3096453d49f8-rabbitmq-plugins\") pod \"c08d9078-9b3a-492a-92db-3096453d49f8\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") "
Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.426096 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c08d9078-9b3a-492a-92db-3096453d49f8-pod-info\") pod \"c08d9078-9b3a-492a-92db-3096453d49f8\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") "
Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.426192 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c08d9078-9b3a-492a-92db-3096453d49f8-config-data\") pod \"c08d9078-9b3a-492a-92db-3096453d49f8\" (UID: \"c08d9078-9b3a-492a-92db-3096453d49f8\") "
Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.426792 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c08d9078-9b3a-492a-92db-3096453d49f8-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c08d9078-9b3a-492a-92db-3096453d49f8" (UID: "c08d9078-9b3a-492a-92db-3096453d49f8"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.431279 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c08d9078-9b3a-492a-92db-3096453d49f8-kube-api-access-plz57" (OuterVolumeSpecName: "kube-api-access-plz57") pod "c08d9078-9b3a-492a-92db-3096453d49f8" (UID: "c08d9078-9b3a-492a-92db-3096453d49f8"). InnerVolumeSpecName "kube-api-access-plz57". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.431791 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c08d9078-9b3a-492a-92db-3096453d49f8-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "c08d9078-9b3a-492a-92db-3096453d49f8" (UID: "c08d9078-9b3a-492a-92db-3096453d49f8"). 
InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.435270 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "c08d9078-9b3a-492a-92db-3096453d49f8" (UID: "c08d9078-9b3a-492a-92db-3096453d49f8"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.435935 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c08d9078-9b3a-492a-92db-3096453d49f8-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c08d9078-9b3a-492a-92db-3096453d49f8" (UID: "c08d9078-9b3a-492a-92db-3096453d49f8"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.436061 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c08d9078-9b3a-492a-92db-3096453d49f8-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c08d9078-9b3a-492a-92db-3096453d49f8" (UID: "c08d9078-9b3a-492a-92db-3096453d49f8"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.439768 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c08d9078-9b3a-492a-92db-3096453d49f8-pod-info" (OuterVolumeSpecName: "pod-info") pod "c08d9078-9b3a-492a-92db-3096453d49f8" (UID: "c08d9078-9b3a-492a-92db-3096453d49f8"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.444849 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c08d9078-9b3a-492a-92db-3096453d49f8-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c08d9078-9b3a-492a-92db-3096453d49f8" (UID: "c08d9078-9b3a-492a-92db-3096453d49f8"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.467790 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c08d9078-9b3a-492a-92db-3096453d49f8-config-data" (OuterVolumeSpecName: "config-data") pod "c08d9078-9b3a-492a-92db-3096453d49f8" (UID: "c08d9078-9b3a-492a-92db-3096453d49f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.521804 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c08d9078-9b3a-492a-92db-3096453d49f8-server-conf" (OuterVolumeSpecName: "server-conf") pod "c08d9078-9b3a-492a-92db-3096453d49f8" (UID: "c08d9078-9b3a-492a-92db-3096453d49f8"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.528662 4886 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c08d9078-9b3a-492a-92db-3096453d49f8-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.528701 4886 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c08d9078-9b3a-492a-92db-3096453d49f8-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.528715 4886 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c08d9078-9b3a-492a-92db-3096453d49f8-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.528775 4886 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.528787 4886 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c08d9078-9b3a-492a-92db-3096453d49f8-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.528797 4886 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c08d9078-9b3a-492a-92db-3096453d49f8-pod-info\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.528808 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c08d9078-9b3a-492a-92db-3096453d49f8-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.528817 4886 
reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c08d9078-9b3a-492a-92db-3096453d49f8-server-conf\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.528829 4886 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c08d9078-9b3a-492a-92db-3096453d49f8-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.528839 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plz57\" (UniqueName: \"kubernetes.io/projected/c08d9078-9b3a-492a-92db-3096453d49f8-kube-api-access-plz57\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.555413 4886 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.563789 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c08d9078-9b3a-492a-92db-3096453d49f8-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c08d9078-9b3a-492a-92db-3096453d49f8" (UID: "c08d9078-9b3a-492a-92db-3096453d49f8"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.631503 4886 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c08d9078-9b3a-492a-92db-3096453d49f8-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.631544 4886 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.650284 4886 generic.go:334] "Generic (PLEG): container finished" podID="c08d9078-9b3a-492a-92db-3096453d49f8" containerID="93214e36e11f1eb8a0d13f7a772aeb46c4404d3e9853793a18721e8ec07b83d0" exitCode=0 Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.650355 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c08d9078-9b3a-492a-92db-3096453d49f8","Type":"ContainerDied","Data":"93214e36e11f1eb8a0d13f7a772aeb46c4404d3e9853793a18721e8ec07b83d0"} Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.650379 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c08d9078-9b3a-492a-92db-3096453d49f8","Type":"ContainerDied","Data":"1c6276ee82d2427410054dd2bd2baf87d6a096eb614c1c7e7715844ad1c519f3"} Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.650395 4886 scope.go:117] "RemoveContainer" containerID="93214e36e11f1eb8a0d13f7a772aeb46c4404d3e9853793a18721e8ec07b83d0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.650517 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.697766 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.710288 4886 scope.go:117] "RemoveContainer" containerID="548315cdb623465804b4f9fe0d140267259fbb4ce5ce07b497fa3080f29863c4" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.712176 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.728630 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 08:53:57 crc kubenswrapper[4886]: E0314 08:53:57.729065 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c08d9078-9b3a-492a-92db-3096453d49f8" containerName="setup-container" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.729083 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="c08d9078-9b3a-492a-92db-3096453d49f8" containerName="setup-container" Mar 14 08:53:57 crc kubenswrapper[4886]: E0314 08:53:57.729096 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c08d9078-9b3a-492a-92db-3096453d49f8" containerName="rabbitmq" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.729102 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="c08d9078-9b3a-492a-92db-3096453d49f8" containerName="rabbitmq" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.729352 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="c08d9078-9b3a-492a-92db-3096453d49f8" containerName="rabbitmq" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.731050 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.733268 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.734349 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.735566 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.735996 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.736570 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-qt8jt" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.736824 4886 scope.go:117] "RemoveContainer" containerID="93214e36e11f1eb8a0d13f7a772aeb46c4404d3e9853793a18721e8ec07b83d0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.736966 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.737133 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 14 08:53:57 crc kubenswrapper[4886]: E0314 08:53:57.737783 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93214e36e11f1eb8a0d13f7a772aeb46c4404d3e9853793a18721e8ec07b83d0\": container with ID starting with 93214e36e11f1eb8a0d13f7a772aeb46c4404d3e9853793a18721e8ec07b83d0 not found: ID does not exist" containerID="93214e36e11f1eb8a0d13f7a772aeb46c4404d3e9853793a18721e8ec07b83d0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.737820 4886 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"93214e36e11f1eb8a0d13f7a772aeb46c4404d3e9853793a18721e8ec07b83d0"} err="failed to get container status \"93214e36e11f1eb8a0d13f7a772aeb46c4404d3e9853793a18721e8ec07b83d0\": rpc error: code = NotFound desc = could not find container \"93214e36e11f1eb8a0d13f7a772aeb46c4404d3e9853793a18721e8ec07b83d0\": container with ID starting with 93214e36e11f1eb8a0d13f7a772aeb46c4404d3e9853793a18721e8ec07b83d0 not found: ID does not exist" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.737851 4886 scope.go:117] "RemoveContainer" containerID="548315cdb623465804b4f9fe0d140267259fbb4ce5ce07b497fa3080f29863c4" Mar 14 08:53:57 crc kubenswrapper[4886]: E0314 08:53:57.739757 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"548315cdb623465804b4f9fe0d140267259fbb4ce5ce07b497fa3080f29863c4\": container with ID starting with 548315cdb623465804b4f9fe0d140267259fbb4ce5ce07b497fa3080f29863c4 not found: ID does not exist" containerID="548315cdb623465804b4f9fe0d140267259fbb4ce5ce07b497fa3080f29863c4" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.739794 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"548315cdb623465804b4f9fe0d140267259fbb4ce5ce07b497fa3080f29863c4"} err="failed to get container status \"548315cdb623465804b4f9fe0d140267259fbb4ce5ce07b497fa3080f29863c4\": rpc error: code = NotFound desc = could not find container \"548315cdb623465804b4f9fe0d140267259fbb4ce5ce07b497fa3080f29863c4\": container with ID starting with 548315cdb623465804b4f9fe0d140267259fbb4ce5ce07b497fa3080f29863c4 not found: ID does not exist" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.752687 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.834205 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.834271 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.834293 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szvzf\" (UniqueName: \"kubernetes.io/projected/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-kube-api-access-szvzf\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.834337 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.834365 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.834386 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.834404 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.834426 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.834443 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.834611 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.834672 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-config-data\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.937046 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.937139 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-config-data\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.937186 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.937230 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.937840 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szvzf\" (UniqueName: \"kubernetes.io/projected/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-kube-api-access-szvzf\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " 
pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.937942 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.937981 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.938012 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.938101 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.938149 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.938179 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.938191 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-config-data\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.938150 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.938456 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.938690 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.938988 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.942902 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.942904 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.943237 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.944246 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.955520 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.957332 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-szvzf\" (UniqueName: \"kubernetes.io/projected/4aa00f0b-8e91-4a74-88de-56f7ecf55ee5-kube-api-access-szvzf\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:57 crc kubenswrapper[4886]: I0314 08:53:57.980383 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5\") " pod="openstack/rabbitmq-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.053032 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.194632 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.348562 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/68bf3729-3dcf-4881-814b-b6af3060336e-erlang-cookie-secret\") pod \"68bf3729-3dcf-4881-814b-b6af3060336e\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.348657 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/68bf3729-3dcf-4881-814b-b6af3060336e-rabbitmq-erlang-cookie\") pod \"68bf3729-3dcf-4881-814b-b6af3060336e\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.348707 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6hdj\" (UniqueName: \"kubernetes.io/projected/68bf3729-3dcf-4881-814b-b6af3060336e-kube-api-access-b6hdj\") pod \"68bf3729-3dcf-4881-814b-b6af3060336e\" (UID: 
\"68bf3729-3dcf-4881-814b-b6af3060336e\") " Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.348741 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/68bf3729-3dcf-4881-814b-b6af3060336e-server-conf\") pod \"68bf3729-3dcf-4881-814b-b6af3060336e\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.348796 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/68bf3729-3dcf-4881-814b-b6af3060336e-plugins-conf\") pod \"68bf3729-3dcf-4881-814b-b6af3060336e\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.348827 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/68bf3729-3dcf-4881-814b-b6af3060336e-config-data\") pod \"68bf3729-3dcf-4881-814b-b6af3060336e\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.348864 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/68bf3729-3dcf-4881-814b-b6af3060336e-rabbitmq-tls\") pod \"68bf3729-3dcf-4881-814b-b6af3060336e\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.348881 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/68bf3729-3dcf-4881-814b-b6af3060336e-rabbitmq-confd\") pod \"68bf3729-3dcf-4881-814b-b6af3060336e\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.348967 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/68bf3729-3dcf-4881-814b-b6af3060336e-pod-info\") pod \"68bf3729-3dcf-4881-814b-b6af3060336e\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.348981 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"68bf3729-3dcf-4881-814b-b6af3060336e\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.349000 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/68bf3729-3dcf-4881-814b-b6af3060336e-rabbitmq-plugins\") pod \"68bf3729-3dcf-4881-814b-b6af3060336e\" (UID: \"68bf3729-3dcf-4881-814b-b6af3060336e\") " Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.349919 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68bf3729-3dcf-4881-814b-b6af3060336e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "68bf3729-3dcf-4881-814b-b6af3060336e" (UID: "68bf3729-3dcf-4881-814b-b6af3060336e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.357400 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68bf3729-3dcf-4881-814b-b6af3060336e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "68bf3729-3dcf-4881-814b-b6af3060336e" (UID: "68bf3729-3dcf-4881-814b-b6af3060336e"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.361284 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/68bf3729-3dcf-4881-814b-b6af3060336e-pod-info" (OuterVolumeSpecName: "pod-info") pod "68bf3729-3dcf-4881-814b-b6af3060336e" (UID: "68bf3729-3dcf-4881-814b-b6af3060336e"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.366285 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68bf3729-3dcf-4881-814b-b6af3060336e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "68bf3729-3dcf-4881-814b-b6af3060336e" (UID: "68bf3729-3dcf-4881-814b-b6af3060336e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.370937 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68bf3729-3dcf-4881-814b-b6af3060336e-kube-api-access-b6hdj" (OuterVolumeSpecName: "kube-api-access-b6hdj") pod "68bf3729-3dcf-4881-814b-b6af3060336e" (UID: "68bf3729-3dcf-4881-814b-b6af3060336e"). InnerVolumeSpecName "kube-api-access-b6hdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.372051 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "persistence") pod "68bf3729-3dcf-4881-814b-b6af3060336e" (UID: "68bf3729-3dcf-4881-814b-b6af3060336e"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.372075 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68bf3729-3dcf-4881-814b-b6af3060336e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "68bf3729-3dcf-4881-814b-b6af3060336e" (UID: "68bf3729-3dcf-4881-814b-b6af3060336e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.388384 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68bf3729-3dcf-4881-814b-b6af3060336e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "68bf3729-3dcf-4881-814b-b6af3060336e" (UID: "68bf3729-3dcf-4881-814b-b6af3060336e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.419761 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68bf3729-3dcf-4881-814b-b6af3060336e-config-data" (OuterVolumeSpecName: "config-data") pod "68bf3729-3dcf-4881-814b-b6af3060336e" (UID: "68bf3729-3dcf-4881-814b-b6af3060336e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.454823 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68bf3729-3dcf-4881-814b-b6af3060336e-server-conf" (OuterVolumeSpecName: "server-conf") pod "68bf3729-3dcf-4881-814b-b6af3060336e" (UID: "68bf3729-3dcf-4881-814b-b6af3060336e"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.457229 4886 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/68bf3729-3dcf-4881-814b-b6af3060336e-pod-info\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.457317 4886 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.457333 4886 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/68bf3729-3dcf-4881-814b-b6af3060336e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.457349 4886 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/68bf3729-3dcf-4881-814b-b6af3060336e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.457363 4886 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/68bf3729-3dcf-4881-814b-b6af3060336e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.457378 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6hdj\" (UniqueName: \"kubernetes.io/projected/68bf3729-3dcf-4881-814b-b6af3060336e-kube-api-access-b6hdj\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.457389 4886 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/68bf3729-3dcf-4881-814b-b6af3060336e-server-conf\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 
08:53:58.457403 4886 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/68bf3729-3dcf-4881-814b-b6af3060336e-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.457415 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/68bf3729-3dcf-4881-814b-b6af3060336e-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.457425 4886 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/68bf3729-3dcf-4881-814b-b6af3060336e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.495698 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68bf3729-3dcf-4881-814b-b6af3060336e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "68bf3729-3dcf-4881-814b-b6af3060336e" (UID: "68bf3729-3dcf-4881-814b-b6af3060336e"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.502787 4886 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.559064 4886 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/68bf3729-3dcf-4881-814b-b6af3060336e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.559090 4886 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.666180 4886 generic.go:334] "Generic (PLEG): container finished" podID="68bf3729-3dcf-4881-814b-b6af3060336e" containerID="c44e45b8deb2b13c261bfa772151bb46377f4d546802bf618451e57dff7b084e" exitCode=0 Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.666227 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"68bf3729-3dcf-4881-814b-b6af3060336e","Type":"ContainerDied","Data":"c44e45b8deb2b13c261bfa772151bb46377f4d546802bf618451e57dff7b084e"} Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.666256 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"68bf3729-3dcf-4881-814b-b6af3060336e","Type":"ContainerDied","Data":"ccd29521a8afa6a332dbc78c9335df5371474150d3df80c1c2177a8d8f9800ec"} Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.666279 4886 scope.go:117] "RemoveContainer" containerID="c44e45b8deb2b13c261bfa772151bb46377f4d546802bf618451e57dff7b084e" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.666453 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.667906 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 14 08:53:58 crc kubenswrapper[4886]: W0314 08:53:58.677107 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4aa00f0b_8e91_4a74_88de_56f7ecf55ee5.slice/crio-b117804e032db9dcd5713b77d2f8e8ea640815b359f9741fead3ad96f7c9a34f WatchSource:0}: Error finding container b117804e032db9dcd5713b77d2f8e8ea640815b359f9741fead3ad96f7c9a34f: Status 404 returned error can't find the container with id b117804e032db9dcd5713b77d2f8e8ea640815b359f9741fead3ad96f7c9a34f Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.705985 4886 scope.go:117] "RemoveContainer" containerID="70f613391bff57f10a7e19d0aa063591744b8d64a3857dfeaca9f66352cdf9da" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.721478 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.731558 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.742626 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 08:53:58 crc kubenswrapper[4886]: E0314 08:53:58.743271 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68bf3729-3dcf-4881-814b-b6af3060336e" containerName="setup-container" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.743344 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="68bf3729-3dcf-4881-814b-b6af3060336e" containerName="setup-container" Mar 14 08:53:58 crc kubenswrapper[4886]: E0314 08:53:58.743433 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68bf3729-3dcf-4881-814b-b6af3060336e" 
containerName="rabbitmq" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.743495 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="68bf3729-3dcf-4881-814b-b6af3060336e" containerName="rabbitmq" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.743773 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="68bf3729-3dcf-4881-814b-b6af3060336e" containerName="rabbitmq" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.745075 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.750242 4886 scope.go:117] "RemoveContainer" containerID="c44e45b8deb2b13c261bfa772151bb46377f4d546802bf618451e57dff7b084e" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.750493 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.750528 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.750493 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.750692 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.750739 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.750804 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.750891 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-n4g2l" Mar 14 08:53:58 
crc kubenswrapper[4886]: I0314 08:53:58.752133 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 14 08:53:58 crc kubenswrapper[4886]: E0314 08:53:58.752304 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c44e45b8deb2b13c261bfa772151bb46377f4d546802bf618451e57dff7b084e\": container with ID starting with c44e45b8deb2b13c261bfa772151bb46377f4d546802bf618451e57dff7b084e not found: ID does not exist" containerID="c44e45b8deb2b13c261bfa772151bb46377f4d546802bf618451e57dff7b084e" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.752339 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c44e45b8deb2b13c261bfa772151bb46377f4d546802bf618451e57dff7b084e"} err="failed to get container status \"c44e45b8deb2b13c261bfa772151bb46377f4d546802bf618451e57dff7b084e\": rpc error: code = NotFound desc = could not find container \"c44e45b8deb2b13c261bfa772151bb46377f4d546802bf618451e57dff7b084e\": container with ID starting with c44e45b8deb2b13c261bfa772151bb46377f4d546802bf618451e57dff7b084e not found: ID does not exist" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.752365 4886 scope.go:117] "RemoveContainer" containerID="70f613391bff57f10a7e19d0aa063591744b8d64a3857dfeaca9f66352cdf9da" Mar 14 08:53:58 crc kubenswrapper[4886]: E0314 08:53:58.752659 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70f613391bff57f10a7e19d0aa063591744b8d64a3857dfeaca9f66352cdf9da\": container with ID starting with 70f613391bff57f10a7e19d0aa063591744b8d64a3857dfeaca9f66352cdf9da not found: ID does not exist" containerID="70f613391bff57f10a7e19d0aa063591744b8d64a3857dfeaca9f66352cdf9da" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.752687 4886 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"70f613391bff57f10a7e19d0aa063591744b8d64a3857dfeaca9f66352cdf9da"} err="failed to get container status \"70f613391bff57f10a7e19d0aa063591744b8d64a3857dfeaca9f66352cdf9da\": rpc error: code = NotFound desc = could not find container \"70f613391bff57f10a7e19d0aa063591744b8d64a3857dfeaca9f66352cdf9da\": container with ID starting with 70f613391bff57f10a7e19d0aa063591744b8d64a3857dfeaca9f66352cdf9da not found: ID does not exist" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.867714 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f866eae5-fb12-4734-8906-aa868da61dd5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.868099 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdggl\" (UniqueName: \"kubernetes.io/projected/f866eae5-fb12-4734-8906-aa868da61dd5-kube-api-access-vdggl\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.868178 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f866eae5-fb12-4734-8906-aa868da61dd5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.868203 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f866eae5-fb12-4734-8906-aa868da61dd5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.868243 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f866eae5-fb12-4734-8906-aa868da61dd5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.868266 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f866eae5-fb12-4734-8906-aa868da61dd5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.868344 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.868373 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f866eae5-fb12-4734-8906-aa868da61dd5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.868423 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f866eae5-fb12-4734-8906-aa868da61dd5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.868503 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f866eae5-fb12-4734-8906-aa868da61dd5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.868557 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f866eae5-fb12-4734-8906-aa868da61dd5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.969886 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f866eae5-fb12-4734-8906-aa868da61dd5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.970000 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.970028 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f866eae5-fb12-4734-8906-aa868da61dd5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 
14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.970073 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f866eae5-fb12-4734-8906-aa868da61dd5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.970164 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f866eae5-fb12-4734-8906-aa868da61dd5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.970210 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f866eae5-fb12-4734-8906-aa868da61dd5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.970253 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f866eae5-fb12-4734-8906-aa868da61dd5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.970294 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdggl\" (UniqueName: \"kubernetes.io/projected/f866eae5-fb12-4734-8906-aa868da61dd5-kube-api-access-vdggl\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.970332 
4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f866eae5-fb12-4734-8906-aa868da61dd5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.970356 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f866eae5-fb12-4734-8906-aa868da61dd5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.970392 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f866eae5-fb12-4734-8906-aa868da61dd5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.971416 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f866eae5-fb12-4734-8906-aa868da61dd5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.971489 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f866eae5-fb12-4734-8906-aa868da61dd5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.972630 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/f866eae5-fb12-4734-8906-aa868da61dd5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.972738 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f866eae5-fb12-4734-8906-aa868da61dd5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.972940 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.973614 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f866eae5-fb12-4734-8906-aa868da61dd5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.977984 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f866eae5-fb12-4734-8906-aa868da61dd5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.988254 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f866eae5-fb12-4734-8906-aa868da61dd5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.989712 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f866eae5-fb12-4734-8906-aa868da61dd5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:58 crc kubenswrapper[4886]: I0314 08:53:58.993790 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f866eae5-fb12-4734-8906-aa868da61dd5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:59 crc kubenswrapper[4886]: I0314 08:53:59.001911 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdggl\" (UniqueName: \"kubernetes.io/projected/f866eae5-fb12-4734-8906-aa868da61dd5-kube-api-access-vdggl\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:59 crc kubenswrapper[4886]: I0314 08:53:59.033219 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f866eae5-fb12-4734-8906-aa868da61dd5\") " pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:53:59 crc kubenswrapper[4886]: I0314 08:53:59.085594 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 14 08:53:59 crc kubenswrapper[4886]: I0314 08:53:59.433696 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68bf3729-3dcf-4881-814b-b6af3060336e" path="/var/lib/kubelet/pods/68bf3729-3dcf-4881-814b-b6af3060336e/volumes"
Mar 14 08:53:59 crc kubenswrapper[4886]: I0314 08:53:59.435252 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c08d9078-9b3a-492a-92db-3096453d49f8" path="/var/lib/kubelet/pods/c08d9078-9b3a-492a-92db-3096453d49f8/volumes"
Mar 14 08:53:59 crc kubenswrapper[4886]: I0314 08:53:59.609210 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 14 08:53:59 crc kubenswrapper[4886]: I0314 08:53:59.694400 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5","Type":"ContainerStarted","Data":"b117804e032db9dcd5713b77d2f8e8ea640815b359f9741fead3ad96f7c9a34f"}
Mar 14 08:53:59 crc kubenswrapper[4886]: W0314 08:53:59.786394 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf866eae5_fb12_4734_8906_aa868da61dd5.slice/crio-d30863b736e88a7e09871d54c98148a0395ef3f1fbcbfa25c01dba0cf440a1ee WatchSource:0}: Error finding container d30863b736e88a7e09871d54c98148a0395ef3f1fbcbfa25c01dba0cf440a1ee: Status 404 returned error can't find the container with id d30863b736e88a7e09871d54c98148a0395ef3f1fbcbfa25c01dba0cf440a1ee
Mar 14 08:53:59 crc kubenswrapper[4886]: I0314 08:53:59.831355 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-xms44"]
Mar 14 08:53:59 crc kubenswrapper[4886]: I0314 08:53:59.834231 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-xms44"
Mar 14 08:53:59 crc kubenswrapper[4886]: I0314 08:53:59.838419 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam"
Mar 14 08:53:59 crc kubenswrapper[4886]: I0314 08:53:59.852043 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-xms44"]
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.016907 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-xms44\" (UID: \"ef9839b3-b43d-43d7-9441-8eb4ee763140\") " pod="openstack/dnsmasq-dns-5576978c7c-xms44"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.016962 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-xms44\" (UID: \"ef9839b3-b43d-43d7-9441-8eb4ee763140\") " pod="openstack/dnsmasq-dns-5576978c7c-xms44"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.016995 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-xms44\" (UID: \"ef9839b3-b43d-43d7-9441-8eb4ee763140\") " pod="openstack/dnsmasq-dns-5576978c7c-xms44"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.017023 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-xms44\" (UID: \"ef9839b3-b43d-43d7-9441-8eb4ee763140\") " pod="openstack/dnsmasq-dns-5576978c7c-xms44"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.017145 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-dns-svc\") pod \"dnsmasq-dns-5576978c7c-xms44\" (UID: \"ef9839b3-b43d-43d7-9441-8eb4ee763140\") " pod="openstack/dnsmasq-dns-5576978c7c-xms44"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.017225 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9qlx\" (UniqueName: \"kubernetes.io/projected/ef9839b3-b43d-43d7-9441-8eb4ee763140-kube-api-access-q9qlx\") pod \"dnsmasq-dns-5576978c7c-xms44\" (UID: \"ef9839b3-b43d-43d7-9441-8eb4ee763140\") " pod="openstack/dnsmasq-dns-5576978c7c-xms44"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.017392 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-config\") pod \"dnsmasq-dns-5576978c7c-xms44\" (UID: \"ef9839b3-b43d-43d7-9441-8eb4ee763140\") " pod="openstack/dnsmasq-dns-5576978c7c-xms44"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.119850 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-xms44\" (UID: \"ef9839b3-b43d-43d7-9441-8eb4ee763140\") " pod="openstack/dnsmasq-dns-5576978c7c-xms44"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.119894 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-xms44\" (UID: \"ef9839b3-b43d-43d7-9441-8eb4ee763140\") " pod="openstack/dnsmasq-dns-5576978c7c-xms44"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.119924 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-xms44\" (UID: \"ef9839b3-b43d-43d7-9441-8eb4ee763140\") " pod="openstack/dnsmasq-dns-5576978c7c-xms44"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.119952 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-xms44\" (UID: \"ef9839b3-b43d-43d7-9441-8eb4ee763140\") " pod="openstack/dnsmasq-dns-5576978c7c-xms44"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.119975 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-dns-svc\") pod \"dnsmasq-dns-5576978c7c-xms44\" (UID: \"ef9839b3-b43d-43d7-9441-8eb4ee763140\") " pod="openstack/dnsmasq-dns-5576978c7c-xms44"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.119998 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9qlx\" (UniqueName: \"kubernetes.io/projected/ef9839b3-b43d-43d7-9441-8eb4ee763140-kube-api-access-q9qlx\") pod \"dnsmasq-dns-5576978c7c-xms44\" (UID: \"ef9839b3-b43d-43d7-9441-8eb4ee763140\") " pod="openstack/dnsmasq-dns-5576978c7c-xms44"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.120042 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-config\") pod \"dnsmasq-dns-5576978c7c-xms44\" (UID: \"ef9839b3-b43d-43d7-9441-8eb4ee763140\") " pod="openstack/dnsmasq-dns-5576978c7c-xms44"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.120990 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-config\") pod \"dnsmasq-dns-5576978c7c-xms44\" (UID: \"ef9839b3-b43d-43d7-9441-8eb4ee763140\") " pod="openstack/dnsmasq-dns-5576978c7c-xms44"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.121159 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-xms44\" (UID: \"ef9839b3-b43d-43d7-9441-8eb4ee763140\") " pod="openstack/dnsmasq-dns-5576978c7c-xms44"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.121165 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-dns-svc\") pod \"dnsmasq-dns-5576978c7c-xms44\" (UID: \"ef9839b3-b43d-43d7-9441-8eb4ee763140\") " pod="openstack/dnsmasq-dns-5576978c7c-xms44"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.121319 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-xms44\" (UID: \"ef9839b3-b43d-43d7-9441-8eb4ee763140\") " pod="openstack/dnsmasq-dns-5576978c7c-xms44"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.121644 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-xms44\" (UID: \"ef9839b3-b43d-43d7-9441-8eb4ee763140\") " pod="openstack/dnsmasq-dns-5576978c7c-xms44"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.121678 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-xms44\" (UID: \"ef9839b3-b43d-43d7-9441-8eb4ee763140\") " pod="openstack/dnsmasq-dns-5576978c7c-xms44"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.133243 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557974-947z8"]
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.134573 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557974-947z8"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.137809 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.138027 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.138222 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.145215 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9qlx\" (UniqueName: \"kubernetes.io/projected/ef9839b3-b43d-43d7-9441-8eb4ee763140-kube-api-access-q9qlx\") pod \"dnsmasq-dns-5576978c7c-xms44\" (UID: \"ef9839b3-b43d-43d7-9441-8eb4ee763140\") " pod="openstack/dnsmasq-dns-5576978c7c-xms44"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.145587 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557974-947z8"]
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.221940 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxn2w\" (UniqueName: \"kubernetes.io/projected/6f1282d9-39e2-4cda-9431-a984056855f2-kube-api-access-sxn2w\") pod \"auto-csr-approver-29557974-947z8\" (UID: \"6f1282d9-39e2-4cda-9431-a984056855f2\") " pod="openshift-infra/auto-csr-approver-29557974-947z8"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.324417 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxn2w\" (UniqueName: \"kubernetes.io/projected/6f1282d9-39e2-4cda-9431-a984056855f2-kube-api-access-sxn2w\") pod \"auto-csr-approver-29557974-947z8\" (UID: \"6f1282d9-39e2-4cda-9431-a984056855f2\") " pod="openshift-infra/auto-csr-approver-29557974-947z8"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.349981 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxn2w\" (UniqueName: \"kubernetes.io/projected/6f1282d9-39e2-4cda-9431-a984056855f2-kube-api-access-sxn2w\") pod \"auto-csr-approver-29557974-947z8\" (UID: \"6f1282d9-39e2-4cda-9431-a984056855f2\") " pod="openshift-infra/auto-csr-approver-29557974-947z8"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.390971 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-xms44"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.504613 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557974-947z8"
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.708580 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5","Type":"ContainerStarted","Data":"70b56fc3d9d7c08afba6a021e33ea6805d2c8d84af82b55fbc496a134db4aeff"}
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.713435 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f866eae5-fb12-4734-8906-aa868da61dd5","Type":"ContainerStarted","Data":"d30863b736e88a7e09871d54c98148a0395ef3f1fbcbfa25c01dba0cf440a1ee"}
Mar 14 08:54:00 crc kubenswrapper[4886]: I0314 08:54:00.895460 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-xms44"]
Mar 14 08:54:00 crc kubenswrapper[4886]: W0314 08:54:00.983301 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef9839b3_b43d_43d7_9441_8eb4ee763140.slice/crio-9cfd6279ddaee52ea87b8f3a0dbbc60027b6e42092fd3b4dec480493d861813a WatchSource:0}: Error finding container 9cfd6279ddaee52ea87b8f3a0dbbc60027b6e42092fd3b4dec480493d861813a: Status 404 returned error can't find the container with id 9cfd6279ddaee52ea87b8f3a0dbbc60027b6e42092fd3b4dec480493d861813a
Mar 14 08:54:01 crc kubenswrapper[4886]: I0314 08:54:01.020093 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557974-947z8"]
Mar 14 08:54:01 crc kubenswrapper[4886]: W0314 08:54:01.021272 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f1282d9_39e2_4cda_9431_a984056855f2.slice/crio-07219c1fd36f491e35657dce5c580e74856e05a6dafcd4b732b5e189244fdedd WatchSource:0}: Error finding container 07219c1fd36f491e35657dce5c580e74856e05a6dafcd4b732b5e189244fdedd: Status 404 returned error can't find the container with id 07219c1fd36f491e35657dce5c580e74856e05a6dafcd4b732b5e189244fdedd
Mar 14 08:54:01 crc kubenswrapper[4886]: I0314 08:54:01.725235 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557974-947z8" event={"ID":"6f1282d9-39e2-4cda-9431-a984056855f2","Type":"ContainerStarted","Data":"07219c1fd36f491e35657dce5c580e74856e05a6dafcd4b732b5e189244fdedd"}
Mar 14 08:54:01 crc kubenswrapper[4886]: I0314 08:54:01.726641 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f866eae5-fb12-4734-8906-aa868da61dd5","Type":"ContainerStarted","Data":"0d3860232c3ba612d475bec468defbd113960303490ab88ff5b8a13dbf5119ff"}
Mar 14 08:54:01 crc kubenswrapper[4886]: I0314 08:54:01.728086 4886 generic.go:334] "Generic (PLEG): container finished" podID="ef9839b3-b43d-43d7-9441-8eb4ee763140" containerID="423da2072a4cdff0c471de12c984cae65d65572d50f90e2dcf404918f013dff2" exitCode=0
Mar 14 08:54:01 crc kubenswrapper[4886]: I0314 08:54:01.728168 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-xms44" event={"ID":"ef9839b3-b43d-43d7-9441-8eb4ee763140","Type":"ContainerDied","Data":"423da2072a4cdff0c471de12c984cae65d65572d50f90e2dcf404918f013dff2"}
Mar 14 08:54:01 crc kubenswrapper[4886]: I0314 08:54:01.728198 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-xms44" event={"ID":"ef9839b3-b43d-43d7-9441-8eb4ee763140","Type":"ContainerStarted","Data":"9cfd6279ddaee52ea87b8f3a0dbbc60027b6e42092fd3b4dec480493d861813a"}
Mar 14 08:54:02 crc kubenswrapper[4886]: I0314 08:54:02.738070 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-xms44" event={"ID":"ef9839b3-b43d-43d7-9441-8eb4ee763140","Type":"ContainerStarted","Data":"b0977b11146607edc5ce4a1526e5eb13f90e053938833c2f7231f72d55caea55"}
Mar 14 08:54:02 crc kubenswrapper[4886]: I0314 08:54:02.738552 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5576978c7c-xms44"
Mar 14 08:54:02 crc kubenswrapper[4886]: I0314 08:54:02.740818 4886 generic.go:334] "Generic (PLEG): container finished" podID="6f1282d9-39e2-4cda-9431-a984056855f2" containerID="954f68258749ef53aae9d9c9abb0b484f61d0fb7c2d1497060257441212f0217" exitCode=0
Mar 14 08:54:02 crc kubenswrapper[4886]: I0314 08:54:02.740872 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557974-947z8" event={"ID":"6f1282d9-39e2-4cda-9431-a984056855f2","Type":"ContainerDied","Data":"954f68258749ef53aae9d9c9abb0b484f61d0fb7c2d1497060257441212f0217"}
Mar 14 08:54:02 crc kubenswrapper[4886]: I0314 08:54:02.761700 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5576978c7c-xms44" podStartSLOduration=3.761680203 podStartE2EDuration="3.761680203s" podCreationTimestamp="2026-03-14 08:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:54:02.754334474 +0000 UTC m=+1578.002786111" watchObservedRunningTime="2026-03-14 08:54:02.761680203 +0000 UTC m=+1578.010131840"
Mar 14 08:54:02 crc kubenswrapper[4886]: I0314 08:54:02.891453 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-knccv"
Mar 14 08:54:02 crc kubenswrapper[4886]: I0314 08:54:02.940778 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-knccv"
Mar 14 08:54:03 crc kubenswrapper[4886]: I0314 08:54:03.129481 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-knccv"]
Mar 14 08:54:04 crc kubenswrapper[4886]: I0314 08:54:04.148401 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557974-947z8"
Mar 14 08:54:04 crc kubenswrapper[4886]: I0314 08:54:04.211094 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxn2w\" (UniqueName: \"kubernetes.io/projected/6f1282d9-39e2-4cda-9431-a984056855f2-kube-api-access-sxn2w\") pod \"6f1282d9-39e2-4cda-9431-a984056855f2\" (UID: \"6f1282d9-39e2-4cda-9431-a984056855f2\") "
Mar 14 08:54:04 crc kubenswrapper[4886]: I0314 08:54:04.231185 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f1282d9-39e2-4cda-9431-a984056855f2-kube-api-access-sxn2w" (OuterVolumeSpecName: "kube-api-access-sxn2w") pod "6f1282d9-39e2-4cda-9431-a984056855f2" (UID: "6f1282d9-39e2-4cda-9431-a984056855f2"). InnerVolumeSpecName "kube-api-access-sxn2w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:54:04 crc kubenswrapper[4886]: I0314 08:54:04.313899 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxn2w\" (UniqueName: \"kubernetes.io/projected/6f1282d9-39e2-4cda-9431-a984056855f2-kube-api-access-sxn2w\") on node \"crc\" DevicePath \"\""
Mar 14 08:54:04 crc kubenswrapper[4886]: I0314 08:54:04.761100 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-knccv" podUID="a54d38fd-6ef5-492e-b79d-3171c30d46e6" containerName="registry-server" containerID="cri-o://5989e3c8fd08e8d1ad4a4cd088db43f119e6690d7f04434994da586ee29720c9" gracePeriod=2
Mar 14 08:54:04 crc kubenswrapper[4886]: I0314 08:54:04.761433 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557974-947z8"
Mar 14 08:54:04 crc kubenswrapper[4886]: I0314 08:54:04.762335 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557974-947z8" event={"ID":"6f1282d9-39e2-4cda-9431-a984056855f2","Type":"ContainerDied","Data":"07219c1fd36f491e35657dce5c580e74856e05a6dafcd4b732b5e189244fdedd"}
Mar 14 08:54:04 crc kubenswrapper[4886]: I0314 08:54:04.762455 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07219c1fd36f491e35657dce5c580e74856e05a6dafcd4b732b5e189244fdedd"
Mar 14 08:54:05 crc kubenswrapper[4886]: I0314 08:54:05.220634 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557968-7qmjv"]
Mar 14 08:54:05 crc kubenswrapper[4886]: I0314 08:54:05.237447 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557968-7qmjv"]
Mar 14 08:54:05 crc kubenswrapper[4886]: I0314 08:54:05.271380 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-knccv"
Mar 14 08:54:05 crc kubenswrapper[4886]: I0314 08:54:05.331324 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb4wx\" (UniqueName: \"kubernetes.io/projected/a54d38fd-6ef5-492e-b79d-3171c30d46e6-kube-api-access-vb4wx\") pod \"a54d38fd-6ef5-492e-b79d-3171c30d46e6\" (UID: \"a54d38fd-6ef5-492e-b79d-3171c30d46e6\") "
Mar 14 08:54:05 crc kubenswrapper[4886]: I0314 08:54:05.331398 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a54d38fd-6ef5-492e-b79d-3171c30d46e6-catalog-content\") pod \"a54d38fd-6ef5-492e-b79d-3171c30d46e6\" (UID: \"a54d38fd-6ef5-492e-b79d-3171c30d46e6\") "
Mar 14 08:54:05 crc kubenswrapper[4886]: I0314 08:54:05.331453 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a54d38fd-6ef5-492e-b79d-3171c30d46e6-utilities\") pod \"a54d38fd-6ef5-492e-b79d-3171c30d46e6\" (UID: \"a54d38fd-6ef5-492e-b79d-3171c30d46e6\") "
Mar 14 08:54:05 crc kubenswrapper[4886]: I0314 08:54:05.332901 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a54d38fd-6ef5-492e-b79d-3171c30d46e6-utilities" (OuterVolumeSpecName: "utilities") pod "a54d38fd-6ef5-492e-b79d-3171c30d46e6" (UID: "a54d38fd-6ef5-492e-b79d-3171c30d46e6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 08:54:05 crc kubenswrapper[4886]: I0314 08:54:05.340035 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a54d38fd-6ef5-492e-b79d-3171c30d46e6-kube-api-access-vb4wx" (OuterVolumeSpecName: "kube-api-access-vb4wx") pod "a54d38fd-6ef5-492e-b79d-3171c30d46e6" (UID: "a54d38fd-6ef5-492e-b79d-3171c30d46e6"). InnerVolumeSpecName "kube-api-access-vb4wx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:54:05 crc kubenswrapper[4886]: I0314 08:54:05.392570 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a54d38fd-6ef5-492e-b79d-3171c30d46e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a54d38fd-6ef5-492e-b79d-3171c30d46e6" (UID: "a54d38fd-6ef5-492e-b79d-3171c30d46e6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 08:54:05 crc kubenswrapper[4886]: I0314 08:54:05.432650 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="710fbba0-b6e6-4d2e-a3be-f66a11491a0f" path="/var/lib/kubelet/pods/710fbba0-b6e6-4d2e-a3be-f66a11491a0f/volumes"
Mar 14 08:54:05 crc kubenswrapper[4886]: I0314 08:54:05.434198 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vb4wx\" (UniqueName: \"kubernetes.io/projected/a54d38fd-6ef5-492e-b79d-3171c30d46e6-kube-api-access-vb4wx\") on node \"crc\" DevicePath \"\""
Mar 14 08:54:05 crc kubenswrapper[4886]: I0314 08:54:05.434297 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a54d38fd-6ef5-492e-b79d-3171c30d46e6-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 08:54:05 crc kubenswrapper[4886]: I0314 08:54:05.434367 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a54d38fd-6ef5-492e-b79d-3171c30d46e6-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 08:54:05 crc kubenswrapper[4886]: I0314 08:54:05.772841 4886 generic.go:334] "Generic (PLEG): container finished" podID="a54d38fd-6ef5-492e-b79d-3171c30d46e6" containerID="5989e3c8fd08e8d1ad4a4cd088db43f119e6690d7f04434994da586ee29720c9" exitCode=0
Mar 14 08:54:05 crc kubenswrapper[4886]: I0314 08:54:05.772919 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-knccv"
Mar 14 08:54:05 crc kubenswrapper[4886]: I0314 08:54:05.772935 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-knccv" event={"ID":"a54d38fd-6ef5-492e-b79d-3171c30d46e6","Type":"ContainerDied","Data":"5989e3c8fd08e8d1ad4a4cd088db43f119e6690d7f04434994da586ee29720c9"}
Mar 14 08:54:05 crc kubenswrapper[4886]: I0314 08:54:05.773203 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-knccv" event={"ID":"a54d38fd-6ef5-492e-b79d-3171c30d46e6","Type":"ContainerDied","Data":"41bf2e7e3cd3e39a1be733fec17111d36961e960ad4e7c7405b54bffc30aba63"}
Mar 14 08:54:05 crc kubenswrapper[4886]: I0314 08:54:05.773228 4886 scope.go:117] "RemoveContainer" containerID="5989e3c8fd08e8d1ad4a4cd088db43f119e6690d7f04434994da586ee29720c9"
Mar 14 08:54:05 crc kubenswrapper[4886]: I0314 08:54:05.794752 4886 scope.go:117] "RemoveContainer" containerID="bf7b2d01bebc48626522d1ec283ab261a629e642665dc9b322d9179acd297d7f"
Mar 14 08:54:05 crc kubenswrapper[4886]: I0314 08:54:05.796600 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-knccv"]
Mar 14 08:54:05 crc kubenswrapper[4886]: I0314 08:54:05.806645 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-knccv"]
Mar 14 08:54:05 crc kubenswrapper[4886]: I0314 08:54:05.817464 4886 scope.go:117] "RemoveContainer" containerID="6de84b62ab4d97bcba4bfd8e2d102a74b325c871bcbb399a96c0278eb3658e22"
Mar 14 08:54:05 crc kubenswrapper[4886]: I0314 08:54:05.857580 4886 scope.go:117] "RemoveContainer" containerID="5989e3c8fd08e8d1ad4a4cd088db43f119e6690d7f04434994da586ee29720c9"
Mar 14 08:54:05 crc kubenswrapper[4886]: E0314 08:54:05.858099 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5989e3c8fd08e8d1ad4a4cd088db43f119e6690d7f04434994da586ee29720c9\": container with ID starting with 5989e3c8fd08e8d1ad4a4cd088db43f119e6690d7f04434994da586ee29720c9 not found: ID does not exist" containerID="5989e3c8fd08e8d1ad4a4cd088db43f119e6690d7f04434994da586ee29720c9"
Mar 14 08:54:05 crc kubenswrapper[4886]: I0314 08:54:05.858183 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5989e3c8fd08e8d1ad4a4cd088db43f119e6690d7f04434994da586ee29720c9"} err="failed to get container status \"5989e3c8fd08e8d1ad4a4cd088db43f119e6690d7f04434994da586ee29720c9\": rpc error: code = NotFound desc = could not find container \"5989e3c8fd08e8d1ad4a4cd088db43f119e6690d7f04434994da586ee29720c9\": container with ID starting with 5989e3c8fd08e8d1ad4a4cd088db43f119e6690d7f04434994da586ee29720c9 not found: ID does not exist"
Mar 14 08:54:05 crc kubenswrapper[4886]: I0314 08:54:05.858211 4886 scope.go:117] "RemoveContainer" containerID="bf7b2d01bebc48626522d1ec283ab261a629e642665dc9b322d9179acd297d7f"
Mar 14 08:54:05 crc kubenswrapper[4886]: E0314 08:54:05.858671 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf7b2d01bebc48626522d1ec283ab261a629e642665dc9b322d9179acd297d7f\": container with ID starting with bf7b2d01bebc48626522d1ec283ab261a629e642665dc9b322d9179acd297d7f not found: ID does not exist" containerID="bf7b2d01bebc48626522d1ec283ab261a629e642665dc9b322d9179acd297d7f"
Mar 14 08:54:05 crc kubenswrapper[4886]: I0314 08:54:05.858716 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf7b2d01bebc48626522d1ec283ab261a629e642665dc9b322d9179acd297d7f"} err="failed to get container status \"bf7b2d01bebc48626522d1ec283ab261a629e642665dc9b322d9179acd297d7f\": rpc error: code = NotFound desc = could not find container \"bf7b2d01bebc48626522d1ec283ab261a629e642665dc9b322d9179acd297d7f\": container with ID starting with bf7b2d01bebc48626522d1ec283ab261a629e642665dc9b322d9179acd297d7f not found: ID does not exist"
Mar 14 08:54:05 crc kubenswrapper[4886]: I0314 08:54:05.858747 4886 scope.go:117] "RemoveContainer" containerID="6de84b62ab4d97bcba4bfd8e2d102a74b325c871bcbb399a96c0278eb3658e22"
Mar 14 08:54:05 crc kubenswrapper[4886]: E0314 08:54:05.859028 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6de84b62ab4d97bcba4bfd8e2d102a74b325c871bcbb399a96c0278eb3658e22\": container with ID starting with 6de84b62ab4d97bcba4bfd8e2d102a74b325c871bcbb399a96c0278eb3658e22 not found: ID does not exist" containerID="6de84b62ab4d97bcba4bfd8e2d102a74b325c871bcbb399a96c0278eb3658e22"
Mar 14 08:54:05 crc kubenswrapper[4886]: I0314 08:54:05.859056 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6de84b62ab4d97bcba4bfd8e2d102a74b325c871bcbb399a96c0278eb3658e22"} err="failed to get container status \"6de84b62ab4d97bcba4bfd8e2d102a74b325c871bcbb399a96c0278eb3658e22\": rpc error: code = NotFound desc = could not find container \"6de84b62ab4d97bcba4bfd8e2d102a74b325c871bcbb399a96c0278eb3658e22\": container with ID starting with 6de84b62ab4d97bcba4bfd8e2d102a74b325c871bcbb399a96c0278eb3658e22 not found: ID does not exist"
Mar 14 08:54:07 crc kubenswrapper[4886]: I0314 08:54:07.420631 4886 scope.go:117] "RemoveContainer" containerID="7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc"
Mar 14 08:54:07 crc kubenswrapper[4886]: E0314 08:54:07.420935 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79"
Mar 14 08:54:07 crc kubenswrapper[4886]: I0314 08:54:07.445923 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a54d38fd-6ef5-492e-b79d-3171c30d46e6" path="/var/lib/kubelet/pods/a54d38fd-6ef5-492e-b79d-3171c30d46e6/volumes"
Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.392273 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5576978c7c-xms44"
Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.457048 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-55dtq"]
Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.457412 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" podUID="ec356daa-f053-45d9-8297-1df7fa8621a0" containerName="dnsmasq-dns" containerID="cri-o://c5389940f167ee63e60741f69efef5ec57b5085d1e7db0fa8a7f377465cdf669" gracePeriod=10
Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.584686 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56f7ccd8f7-hk99n"]
Mar 14 08:54:10 crc kubenswrapper[4886]: E0314 08:54:10.585112 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a54d38fd-6ef5-492e-b79d-3171c30d46e6" containerName="extract-utilities"
Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.585141 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a54d38fd-6ef5-492e-b79d-3171c30d46e6" containerName="extract-utilities"
Mar 14 08:54:10 crc kubenswrapper[4886]: E0314 08:54:10.585161 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a54d38fd-6ef5-492e-b79d-3171c30d46e6" containerName="extract-content"
Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.585167 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a54d38fd-6ef5-492e-b79d-3171c30d46e6" containerName="extract-content"
Mar 14 08:54:10 crc kubenswrapper[4886]: E0314 08:54:10.585182 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a54d38fd-6ef5-492e-b79d-3171c30d46e6" containerName="registry-server"
Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.585188 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a54d38fd-6ef5-492e-b79d-3171c30d46e6" containerName="registry-server"
Mar 14 08:54:10 crc kubenswrapper[4886]: E0314 08:54:10.585200 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1282d9-39e2-4cda-9431-a984056855f2" containerName="oc"
Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.585206 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1282d9-39e2-4cda-9431-a984056855f2" containerName="oc"
Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.585377 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f1282d9-39e2-4cda-9431-a984056855f2" containerName="oc"
Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.585398 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="a54d38fd-6ef5-492e-b79d-3171c30d46e6" containerName="registry-server"
Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.586390 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56f7ccd8f7-hk99n"
Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.615053 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56f7ccd8f7-hk99n"]
Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.685511 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc84b7fd-15aa-4477-9145-f97680b55a4b-ovsdbserver-sb\") pod \"dnsmasq-dns-56f7ccd8f7-hk99n\" (UID: \"dc84b7fd-15aa-4477-9145-f97680b55a4b\") " pod="openstack/dnsmasq-dns-56f7ccd8f7-hk99n"
Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.685597 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc84b7fd-15aa-4477-9145-f97680b55a4b-dns-svc\") pod \"dnsmasq-dns-56f7ccd8f7-hk99n\" (UID: \"dc84b7fd-15aa-4477-9145-f97680b55a4b\") " pod="openstack/dnsmasq-dns-56f7ccd8f7-hk99n"
Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.685623 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8m6t\" (UniqueName: \"kubernetes.io/projected/dc84b7fd-15aa-4477-9145-f97680b55a4b-kube-api-access-h8m6t\") pod \"dnsmasq-dns-56f7ccd8f7-hk99n\" (UID: \"dc84b7fd-15aa-4477-9145-f97680b55a4b\") " pod="openstack/dnsmasq-dns-56f7ccd8f7-hk99n"
Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.685650 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dc84b7fd-15aa-4477-9145-f97680b55a4b-openstack-edpm-ipam\") pod \"dnsmasq-dns-56f7ccd8f7-hk99n\" (UID: \"dc84b7fd-15aa-4477-9145-f97680b55a4b\") " pod="openstack/dnsmasq-dns-56f7ccd8f7-hk99n"
Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.685667 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc84b7fd-15aa-4477-9145-f97680b55a4b-config\") pod \"dnsmasq-dns-56f7ccd8f7-hk99n\" (UID: \"dc84b7fd-15aa-4477-9145-f97680b55a4b\") " pod="openstack/dnsmasq-dns-56f7ccd8f7-hk99n"
Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.685689 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc84b7fd-15aa-4477-9145-f97680b55a4b-ovsdbserver-nb\") pod \"dnsmasq-dns-56f7ccd8f7-hk99n\" (UID: \"dc84b7fd-15aa-4477-9145-f97680b55a4b\") " pod="openstack/dnsmasq-dns-56f7ccd8f7-hk99n"
Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.685709 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc84b7fd-15aa-4477-9145-f97680b55a4b-dns-swift-storage-0\") pod \"dnsmasq-dns-56f7ccd8f7-hk99n\" (UID: \"dc84b7fd-15aa-4477-9145-f97680b55a4b\") " pod="openstack/dnsmasq-dns-56f7ccd8f7-hk99n"
Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.789160 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc84b7fd-15aa-4477-9145-f97680b55a4b-ovsdbserver-sb\") pod \"dnsmasq-dns-56f7ccd8f7-hk99n\" (UID: \"dc84b7fd-15aa-4477-9145-f97680b55a4b\") " pod="openstack/dnsmasq-dns-56f7ccd8f7-hk99n"
Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.789253 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc84b7fd-15aa-4477-9145-f97680b55a4b-dns-svc\") pod \"dnsmasq-dns-56f7ccd8f7-hk99n\" (UID: \"dc84b7fd-15aa-4477-9145-f97680b55a4b\") " pod="openstack/dnsmasq-dns-56f7ccd8f7-hk99n"
Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.789290 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8m6t\" (UniqueName: \"kubernetes.io/projected/dc84b7fd-15aa-4477-9145-f97680b55a4b-kube-api-access-h8m6t\") pod \"dnsmasq-dns-56f7ccd8f7-hk99n\" (UID: \"dc84b7fd-15aa-4477-9145-f97680b55a4b\") " pod="openstack/dnsmasq-dns-56f7ccd8f7-hk99n"
Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.789322 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dc84b7fd-15aa-4477-9145-f97680b55a4b-openstack-edpm-ipam\") pod \"dnsmasq-dns-56f7ccd8f7-hk99n\" (UID: \"dc84b7fd-15aa-4477-9145-f97680b55a4b\") " pod="openstack/dnsmasq-dns-56f7ccd8f7-hk99n"
Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.789342 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc84b7fd-15aa-4477-9145-f97680b55a4b-config\") pod \"dnsmasq-dns-56f7ccd8f7-hk99n\" (UID: \"dc84b7fd-15aa-4477-9145-f97680b55a4b\") " pod="openstack/dnsmasq-dns-56f7ccd8f7-hk99n"
Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.789364 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc84b7fd-15aa-4477-9145-f97680b55a4b-ovsdbserver-nb\") pod \"dnsmasq-dns-56f7ccd8f7-hk99n\" (UID: \"dc84b7fd-15aa-4477-9145-f97680b55a4b\") " pod="openstack/dnsmasq-dns-56f7ccd8f7-hk99n"
Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.789382 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc84b7fd-15aa-4477-9145-f97680b55a4b-dns-swift-storage-0\") pod \"dnsmasq-dns-56f7ccd8f7-hk99n\" (UID: \"dc84b7fd-15aa-4477-9145-f97680b55a4b\") " pod="openstack/dnsmasq-dns-56f7ccd8f7-hk99n"
Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.790601 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for
volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc84b7fd-15aa-4477-9145-f97680b55a4b-dns-swift-storage-0\") pod \"dnsmasq-dns-56f7ccd8f7-hk99n\" (UID: \"dc84b7fd-15aa-4477-9145-f97680b55a4b\") " pod="openstack/dnsmasq-dns-56f7ccd8f7-hk99n" Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.791019 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc84b7fd-15aa-4477-9145-f97680b55a4b-ovsdbserver-sb\") pod \"dnsmasq-dns-56f7ccd8f7-hk99n\" (UID: \"dc84b7fd-15aa-4477-9145-f97680b55a4b\") " pod="openstack/dnsmasq-dns-56f7ccd8f7-hk99n" Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.791139 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dc84b7fd-15aa-4477-9145-f97680b55a4b-openstack-edpm-ipam\") pod \"dnsmasq-dns-56f7ccd8f7-hk99n\" (UID: \"dc84b7fd-15aa-4477-9145-f97680b55a4b\") " pod="openstack/dnsmasq-dns-56f7ccd8f7-hk99n" Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.794727 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc84b7fd-15aa-4477-9145-f97680b55a4b-config\") pod \"dnsmasq-dns-56f7ccd8f7-hk99n\" (UID: \"dc84b7fd-15aa-4477-9145-f97680b55a4b\") " pod="openstack/dnsmasq-dns-56f7ccd8f7-hk99n" Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.794897 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc84b7fd-15aa-4477-9145-f97680b55a4b-ovsdbserver-nb\") pod \"dnsmasq-dns-56f7ccd8f7-hk99n\" (UID: \"dc84b7fd-15aa-4477-9145-f97680b55a4b\") " pod="openstack/dnsmasq-dns-56f7ccd8f7-hk99n" Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.806550 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/dc84b7fd-15aa-4477-9145-f97680b55a4b-dns-svc\") pod \"dnsmasq-dns-56f7ccd8f7-hk99n\" (UID: \"dc84b7fd-15aa-4477-9145-f97680b55a4b\") " pod="openstack/dnsmasq-dns-56f7ccd8f7-hk99n" Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.864185 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8m6t\" (UniqueName: \"kubernetes.io/projected/dc84b7fd-15aa-4477-9145-f97680b55a4b-kube-api-access-h8m6t\") pod \"dnsmasq-dns-56f7ccd8f7-hk99n\" (UID: \"dc84b7fd-15aa-4477-9145-f97680b55a4b\") " pod="openstack/dnsmasq-dns-56f7ccd8f7-hk99n" Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.869261 4886 generic.go:334] "Generic (PLEG): container finished" podID="ec356daa-f053-45d9-8297-1df7fa8621a0" containerID="c5389940f167ee63e60741f69efef5ec57b5085d1e7db0fa8a7f377465cdf669" exitCode=0 Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.869299 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" event={"ID":"ec356daa-f053-45d9-8297-1df7fa8621a0","Type":"ContainerDied","Data":"c5389940f167ee63e60741f69efef5ec57b5085d1e7db0fa8a7f377465cdf669"} Mar 14 08:54:10 crc kubenswrapper[4886]: I0314 08:54:10.992410 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56f7ccd8f7-hk99n" Mar 14 08:54:11 crc kubenswrapper[4886]: I0314 08:54:11.160152 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" Mar 14 08:54:11 crc kubenswrapper[4886]: I0314 08:54:11.297284 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-dns-svc\") pod \"ec356daa-f053-45d9-8297-1df7fa8621a0\" (UID: \"ec356daa-f053-45d9-8297-1df7fa8621a0\") " Mar 14 08:54:11 crc kubenswrapper[4886]: I0314 08:54:11.297405 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-ovsdbserver-nb\") pod \"ec356daa-f053-45d9-8297-1df7fa8621a0\" (UID: \"ec356daa-f053-45d9-8297-1df7fa8621a0\") " Mar 14 08:54:11 crc kubenswrapper[4886]: I0314 08:54:11.297611 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-config\") pod \"ec356daa-f053-45d9-8297-1df7fa8621a0\" (UID: \"ec356daa-f053-45d9-8297-1df7fa8621a0\") " Mar 14 08:54:11 crc kubenswrapper[4886]: I0314 08:54:11.297666 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-dns-swift-storage-0\") pod \"ec356daa-f053-45d9-8297-1df7fa8621a0\" (UID: \"ec356daa-f053-45d9-8297-1df7fa8621a0\") " Mar 14 08:54:11 crc kubenswrapper[4886]: I0314 08:54:11.297701 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lh79\" (UniqueName: \"kubernetes.io/projected/ec356daa-f053-45d9-8297-1df7fa8621a0-kube-api-access-8lh79\") pod \"ec356daa-f053-45d9-8297-1df7fa8621a0\" (UID: \"ec356daa-f053-45d9-8297-1df7fa8621a0\") " Mar 14 08:54:11 crc kubenswrapper[4886]: I0314 08:54:11.297723 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-ovsdbserver-sb\") pod \"ec356daa-f053-45d9-8297-1df7fa8621a0\" (UID: \"ec356daa-f053-45d9-8297-1df7fa8621a0\") " Mar 14 08:54:11 crc kubenswrapper[4886]: I0314 08:54:11.306096 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec356daa-f053-45d9-8297-1df7fa8621a0-kube-api-access-8lh79" (OuterVolumeSpecName: "kube-api-access-8lh79") pod "ec356daa-f053-45d9-8297-1df7fa8621a0" (UID: "ec356daa-f053-45d9-8297-1df7fa8621a0"). InnerVolumeSpecName "kube-api-access-8lh79". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:54:11 crc kubenswrapper[4886]: I0314 08:54:11.357445 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ec356daa-f053-45d9-8297-1df7fa8621a0" (UID: "ec356daa-f053-45d9-8297-1df7fa8621a0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:54:11 crc kubenswrapper[4886]: I0314 08:54:11.362827 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-config" (OuterVolumeSpecName: "config") pod "ec356daa-f053-45d9-8297-1df7fa8621a0" (UID: "ec356daa-f053-45d9-8297-1df7fa8621a0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:54:11 crc kubenswrapper[4886]: I0314 08:54:11.368472 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ec356daa-f053-45d9-8297-1df7fa8621a0" (UID: "ec356daa-f053-45d9-8297-1df7fa8621a0"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:54:11 crc kubenswrapper[4886]: I0314 08:54:11.371433 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ec356daa-f053-45d9-8297-1df7fa8621a0" (UID: "ec356daa-f053-45d9-8297-1df7fa8621a0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:54:11 crc kubenswrapper[4886]: I0314 08:54:11.380583 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ec356daa-f053-45d9-8297-1df7fa8621a0" (UID: "ec356daa-f053-45d9-8297-1df7fa8621a0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:54:11 crc kubenswrapper[4886]: I0314 08:54:11.404992 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:11 crc kubenswrapper[4886]: I0314 08:54:11.405043 4886 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:11 crc kubenswrapper[4886]: I0314 08:54:11.405056 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lh79\" (UniqueName: \"kubernetes.io/projected/ec356daa-f053-45d9-8297-1df7fa8621a0-kube-api-access-8lh79\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:11 crc kubenswrapper[4886]: I0314 08:54:11.405067 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:11 crc kubenswrapper[4886]: I0314 08:54:11.405078 4886 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:11 crc kubenswrapper[4886]: I0314 08:54:11.405090 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec356daa-f053-45d9-8297-1df7fa8621a0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:11 crc kubenswrapper[4886]: I0314 08:54:11.511084 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56f7ccd8f7-hk99n"] Mar 14 08:54:11 crc kubenswrapper[4886]: I0314 08:54:11.886638 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" Mar 14 08:54:11 crc kubenswrapper[4886]: I0314 08:54:11.886592 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-55dtq" event={"ID":"ec356daa-f053-45d9-8297-1df7fa8621a0","Type":"ContainerDied","Data":"2173156f96691102c2fa4fbd0b0223d4a2638bb221e1b61c3491c9e89d4f3f79"} Mar 14 08:54:11 crc kubenswrapper[4886]: I0314 08:54:11.887211 4886 scope.go:117] "RemoveContainer" containerID="c5389940f167ee63e60741f69efef5ec57b5085d1e7db0fa8a7f377465cdf669" Mar 14 08:54:11 crc kubenswrapper[4886]: I0314 08:54:11.890435 4886 generic.go:334] "Generic (PLEG): container finished" podID="dc84b7fd-15aa-4477-9145-f97680b55a4b" containerID="76f2ef35f733e6cb9b126cf92ab05e0a3cc0aaf1b15ca6514d99e2b5d6444ab9" exitCode=0 Mar 14 08:54:11 crc kubenswrapper[4886]: I0314 08:54:11.890508 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56f7ccd8f7-hk99n" 
event={"ID":"dc84b7fd-15aa-4477-9145-f97680b55a4b","Type":"ContainerDied","Data":"76f2ef35f733e6cb9b126cf92ab05e0a3cc0aaf1b15ca6514d99e2b5d6444ab9"} Mar 14 08:54:11 crc kubenswrapper[4886]: I0314 08:54:11.890560 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56f7ccd8f7-hk99n" event={"ID":"dc84b7fd-15aa-4477-9145-f97680b55a4b","Type":"ContainerStarted","Data":"92c8f3ab7f3db59222241242c08c3062cdbaa0cd3f553ce86323a63745499c8d"} Mar 14 08:54:11 crc kubenswrapper[4886]: I0314 08:54:11.929305 4886 scope.go:117] "RemoveContainer" containerID="1efd48189dec685f2c073f2a412ef204b69509bd12949b87bac70fb1aaeae73b" Mar 14 08:54:11 crc kubenswrapper[4886]: I0314 08:54:11.930175 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-55dtq"] Mar 14 08:54:11 crc kubenswrapper[4886]: I0314 08:54:11.943097 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-55dtq"] Mar 14 08:54:12 crc kubenswrapper[4886]: I0314 08:54:12.901276 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56f7ccd8f7-hk99n" event={"ID":"dc84b7fd-15aa-4477-9145-f97680b55a4b","Type":"ContainerStarted","Data":"cd68c8ea7d71b4449a00720debc7decae788a64c2aa0e21c200a15ea19bd669d"} Mar 14 08:54:12 crc kubenswrapper[4886]: I0314 08:54:12.901817 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56f7ccd8f7-hk99n" Mar 14 08:54:12 crc kubenswrapper[4886]: I0314 08:54:12.927968 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56f7ccd8f7-hk99n" podStartSLOduration=2.927946695 podStartE2EDuration="2.927946695s" podCreationTimestamp="2026-03-14 08:54:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:54:12.921174453 +0000 UTC m=+1588.169626110" watchObservedRunningTime="2026-03-14 
08:54:12.927946695 +0000 UTC m=+1588.176398332" Mar 14 08:54:13 crc kubenswrapper[4886]: I0314 08:54:13.431632 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec356daa-f053-45d9-8297-1df7fa8621a0" path="/var/lib/kubelet/pods/ec356daa-f053-45d9-8297-1df7fa8621a0/volumes" Mar 14 08:54:20 crc kubenswrapper[4886]: I0314 08:54:20.995304 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56f7ccd8f7-hk99n" Mar 14 08:54:21 crc kubenswrapper[4886]: I0314 08:54:21.104771 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-xms44"] Mar 14 08:54:21 crc kubenswrapper[4886]: I0314 08:54:21.105003 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5576978c7c-xms44" podUID="ef9839b3-b43d-43d7-9441-8eb4ee763140" containerName="dnsmasq-dns" containerID="cri-o://b0977b11146607edc5ce4a1526e5eb13f90e053938833c2f7231f72d55caea55" gracePeriod=10 Mar 14 08:54:21 crc kubenswrapper[4886]: I0314 08:54:21.650055 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-xms44" Mar 14 08:54:21 crc kubenswrapper[4886]: I0314 08:54:21.737408 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-config\") pod \"ef9839b3-b43d-43d7-9441-8eb4ee763140\" (UID: \"ef9839b3-b43d-43d7-9441-8eb4ee763140\") " Mar 14 08:54:21 crc kubenswrapper[4886]: I0314 08:54:21.737519 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-ovsdbserver-nb\") pod \"ef9839b3-b43d-43d7-9441-8eb4ee763140\" (UID: \"ef9839b3-b43d-43d7-9441-8eb4ee763140\") " Mar 14 08:54:21 crc kubenswrapper[4886]: I0314 08:54:21.737627 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-dns-svc\") pod \"ef9839b3-b43d-43d7-9441-8eb4ee763140\" (UID: \"ef9839b3-b43d-43d7-9441-8eb4ee763140\") " Mar 14 08:54:21 crc kubenswrapper[4886]: I0314 08:54:21.737707 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-ovsdbserver-sb\") pod \"ef9839b3-b43d-43d7-9441-8eb4ee763140\" (UID: \"ef9839b3-b43d-43d7-9441-8eb4ee763140\") " Mar 14 08:54:21 crc kubenswrapper[4886]: I0314 08:54:21.737784 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9qlx\" (UniqueName: \"kubernetes.io/projected/ef9839b3-b43d-43d7-9441-8eb4ee763140-kube-api-access-q9qlx\") pod \"ef9839b3-b43d-43d7-9441-8eb4ee763140\" (UID: \"ef9839b3-b43d-43d7-9441-8eb4ee763140\") " Mar 14 08:54:21 crc kubenswrapper[4886]: I0314 08:54:21.737813 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-openstack-edpm-ipam\") pod \"ef9839b3-b43d-43d7-9441-8eb4ee763140\" (UID: \"ef9839b3-b43d-43d7-9441-8eb4ee763140\") " Mar 14 08:54:21 crc kubenswrapper[4886]: I0314 08:54:21.737843 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-dns-swift-storage-0\") pod \"ef9839b3-b43d-43d7-9441-8eb4ee763140\" (UID: \"ef9839b3-b43d-43d7-9441-8eb4ee763140\") " Mar 14 08:54:21 crc kubenswrapper[4886]: I0314 08:54:21.746409 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef9839b3-b43d-43d7-9441-8eb4ee763140-kube-api-access-q9qlx" (OuterVolumeSpecName: "kube-api-access-q9qlx") pod "ef9839b3-b43d-43d7-9441-8eb4ee763140" (UID: "ef9839b3-b43d-43d7-9441-8eb4ee763140"). InnerVolumeSpecName "kube-api-access-q9qlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:54:21 crc kubenswrapper[4886]: I0314 08:54:21.801917 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ef9839b3-b43d-43d7-9441-8eb4ee763140" (UID: "ef9839b3-b43d-43d7-9441-8eb4ee763140"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:54:21 crc kubenswrapper[4886]: I0314 08:54:21.802809 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "ef9839b3-b43d-43d7-9441-8eb4ee763140" (UID: "ef9839b3-b43d-43d7-9441-8eb4ee763140"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:54:21 crc kubenswrapper[4886]: I0314 08:54:21.807944 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-config" (OuterVolumeSpecName: "config") pod "ef9839b3-b43d-43d7-9441-8eb4ee763140" (UID: "ef9839b3-b43d-43d7-9441-8eb4ee763140"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:54:21 crc kubenswrapper[4886]: I0314 08:54:21.809691 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ef9839b3-b43d-43d7-9441-8eb4ee763140" (UID: "ef9839b3-b43d-43d7-9441-8eb4ee763140"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:54:21 crc kubenswrapper[4886]: I0314 08:54:21.812953 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ef9839b3-b43d-43d7-9441-8eb4ee763140" (UID: "ef9839b3-b43d-43d7-9441-8eb4ee763140"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:54:21 crc kubenswrapper[4886]: I0314 08:54:21.818335 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ef9839b3-b43d-43d7-9441-8eb4ee763140" (UID: "ef9839b3-b43d-43d7-9441-8eb4ee763140"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 08:54:21 crc kubenswrapper[4886]: I0314 08:54:21.840419 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9qlx\" (UniqueName: \"kubernetes.io/projected/ef9839b3-b43d-43d7-9441-8eb4ee763140-kube-api-access-q9qlx\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:21 crc kubenswrapper[4886]: I0314 08:54:21.840458 4886 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:21 crc kubenswrapper[4886]: I0314 08:54:21.840471 4886 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:21 crc kubenswrapper[4886]: I0314 08:54:21.840484 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-config\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:21 crc kubenswrapper[4886]: I0314 08:54:21.840497 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:21 crc kubenswrapper[4886]: I0314 08:54:21.840508 4886 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:21 crc kubenswrapper[4886]: I0314 08:54:21.840521 4886 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef9839b3-b43d-43d7-9441-8eb4ee763140-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 14 08:54:22 crc kubenswrapper[4886]: I0314 08:54:22.009714 
4886 generic.go:334] "Generic (PLEG): container finished" podID="ef9839b3-b43d-43d7-9441-8eb4ee763140" containerID="b0977b11146607edc5ce4a1526e5eb13f90e053938833c2f7231f72d55caea55" exitCode=0 Mar 14 08:54:22 crc kubenswrapper[4886]: I0314 08:54:22.009760 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-xms44" event={"ID":"ef9839b3-b43d-43d7-9441-8eb4ee763140","Type":"ContainerDied","Data":"b0977b11146607edc5ce4a1526e5eb13f90e053938833c2f7231f72d55caea55"} Mar 14 08:54:22 crc kubenswrapper[4886]: I0314 08:54:22.009786 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-xms44" event={"ID":"ef9839b3-b43d-43d7-9441-8eb4ee763140","Type":"ContainerDied","Data":"9cfd6279ddaee52ea87b8f3a0dbbc60027b6e42092fd3b4dec480493d861813a"} Mar 14 08:54:22 crc kubenswrapper[4886]: I0314 08:54:22.009804 4886 scope.go:117] "RemoveContainer" containerID="b0977b11146607edc5ce4a1526e5eb13f90e053938833c2f7231f72d55caea55" Mar 14 08:54:22 crc kubenswrapper[4886]: I0314 08:54:22.009856 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-xms44" Mar 14 08:54:22 crc kubenswrapper[4886]: I0314 08:54:22.034209 4886 scope.go:117] "RemoveContainer" containerID="423da2072a4cdff0c471de12c984cae65d65572d50f90e2dcf404918f013dff2" Mar 14 08:54:22 crc kubenswrapper[4886]: I0314 08:54:22.058655 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-xms44"] Mar 14 08:54:22 crc kubenswrapper[4886]: I0314 08:54:22.069521 4886 scope.go:117] "RemoveContainer" containerID="b0977b11146607edc5ce4a1526e5eb13f90e053938833c2f7231f72d55caea55" Mar 14 08:54:22 crc kubenswrapper[4886]: E0314 08:54:22.070040 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0977b11146607edc5ce4a1526e5eb13f90e053938833c2f7231f72d55caea55\": container with ID starting with b0977b11146607edc5ce4a1526e5eb13f90e053938833c2f7231f72d55caea55 not found: ID does not exist" containerID="b0977b11146607edc5ce4a1526e5eb13f90e053938833c2f7231f72d55caea55" Mar 14 08:54:22 crc kubenswrapper[4886]: I0314 08:54:22.070083 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0977b11146607edc5ce4a1526e5eb13f90e053938833c2f7231f72d55caea55"} err="failed to get container status \"b0977b11146607edc5ce4a1526e5eb13f90e053938833c2f7231f72d55caea55\": rpc error: code = NotFound desc = could not find container \"b0977b11146607edc5ce4a1526e5eb13f90e053938833c2f7231f72d55caea55\": container with ID starting with b0977b11146607edc5ce4a1526e5eb13f90e053938833c2f7231f72d55caea55 not found: ID does not exist" Mar 14 08:54:22 crc kubenswrapper[4886]: I0314 08:54:22.070112 4886 scope.go:117] "RemoveContainer" containerID="423da2072a4cdff0c471de12c984cae65d65572d50f90e2dcf404918f013dff2" Mar 14 08:54:22 crc kubenswrapper[4886]: E0314 08:54:22.070487 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"423da2072a4cdff0c471de12c984cae65d65572d50f90e2dcf404918f013dff2\": container with ID starting with 423da2072a4cdff0c471de12c984cae65d65572d50f90e2dcf404918f013dff2 not found: ID does not exist" containerID="423da2072a4cdff0c471de12c984cae65d65572d50f90e2dcf404918f013dff2" Mar 14 08:54:22 crc kubenswrapper[4886]: I0314 08:54:22.070546 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"423da2072a4cdff0c471de12c984cae65d65572d50f90e2dcf404918f013dff2"} err="failed to get container status \"423da2072a4cdff0c471de12c984cae65d65572d50f90e2dcf404918f013dff2\": rpc error: code = NotFound desc = could not find container \"423da2072a4cdff0c471de12c984cae65d65572d50f90e2dcf404918f013dff2\": container with ID starting with 423da2072a4cdff0c471de12c984cae65d65572d50f90e2dcf404918f013dff2 not found: ID does not exist" Mar 14 08:54:22 crc kubenswrapper[4886]: I0314 08:54:22.070651 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-xms44"] Mar 14 08:54:22 crc kubenswrapper[4886]: I0314 08:54:22.421131 4886 scope.go:117] "RemoveContainer" containerID="7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc" Mar 14 08:54:22 crc kubenswrapper[4886]: E0314 08:54:22.421756 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 08:54:23 crc kubenswrapper[4886]: I0314 08:54:23.430588 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef9839b3-b43d-43d7-9441-8eb4ee763140" path="/var/lib/kubelet/pods/ef9839b3-b43d-43d7-9441-8eb4ee763140/volumes" Mar 14 08:54:30 crc 
kubenswrapper[4886]: I0314 08:54:30.962040 4886 scope.go:117] "RemoveContainer" containerID="c9b740d8c7aa7c16457126801f3f65871f7a4b14a3002288645d97734857c477" Mar 14 08:54:33 crc kubenswrapper[4886]: I0314 08:54:33.127571 4886 generic.go:334] "Generic (PLEG): container finished" podID="4aa00f0b-8e91-4a74-88de-56f7ecf55ee5" containerID="70b56fc3d9d7c08afba6a021e33ea6805d2c8d84af82b55fbc496a134db4aeff" exitCode=0 Mar 14 08:54:33 crc kubenswrapper[4886]: I0314 08:54:33.127687 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5","Type":"ContainerDied","Data":"70b56fc3d9d7c08afba6a021e33ea6805d2c8d84af82b55fbc496a134db4aeff"} Mar 14 08:54:34 crc kubenswrapper[4886]: I0314 08:54:34.150303 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4aa00f0b-8e91-4a74-88de-56f7ecf55ee5","Type":"ContainerStarted","Data":"c3948564d0e0a78c54666bdacf125d41c2b133c34d4309f5c43a3ba6f7f6fb66"} Mar 14 08:54:34 crc kubenswrapper[4886]: I0314 08:54:34.151339 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 14 08:54:34 crc kubenswrapper[4886]: I0314 08:54:34.153377 4886 generic.go:334] "Generic (PLEG): container finished" podID="f866eae5-fb12-4734-8906-aa868da61dd5" containerID="0d3860232c3ba612d475bec468defbd113960303490ab88ff5b8a13dbf5119ff" exitCode=0 Mar 14 08:54:34 crc kubenswrapper[4886]: I0314 08:54:34.153426 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f866eae5-fb12-4734-8906-aa868da61dd5","Type":"ContainerDied","Data":"0d3860232c3ba612d475bec468defbd113960303490ab88ff5b8a13dbf5119ff"} Mar 14 08:54:34 crc kubenswrapper[4886]: I0314 08:54:34.204507 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.204476069 podStartE2EDuration="37.204476069s" 
podCreationTimestamp="2026-03-14 08:53:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:54:34.195939946 +0000 UTC m=+1609.444391663" watchObservedRunningTime="2026-03-14 08:54:34.204476069 +0000 UTC m=+1609.452927716" Mar 14 08:54:35 crc kubenswrapper[4886]: I0314 08:54:35.164081 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f866eae5-fb12-4734-8906-aa868da61dd5","Type":"ContainerStarted","Data":"686e9829de6254d9995d123bae09fe0a8da8d759b7776156fbd15b848f9d2aa5"} Mar 14 08:54:35 crc kubenswrapper[4886]: I0314 08:54:35.164689 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:54:35 crc kubenswrapper[4886]: I0314 08:54:35.193665 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.193645602 podStartE2EDuration="37.193645602s" podCreationTimestamp="2026-03-14 08:53:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 08:54:35.183908705 +0000 UTC m=+1610.432360352" watchObservedRunningTime="2026-03-14 08:54:35.193645602 +0000 UTC m=+1610.442097239" Mar 14 08:54:36 crc kubenswrapper[4886]: I0314 08:54:36.420431 4886 scope.go:117] "RemoveContainer" containerID="7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc" Mar 14 08:54:36 crc kubenswrapper[4886]: E0314 08:54:36.420746 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" 
podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 08:54:41 crc kubenswrapper[4886]: I0314 08:54:41.346108 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz"] Mar 14 08:54:41 crc kubenswrapper[4886]: E0314 08:54:41.347815 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef9839b3-b43d-43d7-9441-8eb4ee763140" containerName="init" Mar 14 08:54:41 crc kubenswrapper[4886]: I0314 08:54:41.347916 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef9839b3-b43d-43d7-9441-8eb4ee763140" containerName="init" Mar 14 08:54:41 crc kubenswrapper[4886]: E0314 08:54:41.347987 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec356daa-f053-45d9-8297-1df7fa8621a0" containerName="init" Mar 14 08:54:41 crc kubenswrapper[4886]: I0314 08:54:41.348044 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec356daa-f053-45d9-8297-1df7fa8621a0" containerName="init" Mar 14 08:54:41 crc kubenswrapper[4886]: E0314 08:54:41.348102 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec356daa-f053-45d9-8297-1df7fa8621a0" containerName="dnsmasq-dns" Mar 14 08:54:41 crc kubenswrapper[4886]: I0314 08:54:41.348173 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec356daa-f053-45d9-8297-1df7fa8621a0" containerName="dnsmasq-dns" Mar 14 08:54:41 crc kubenswrapper[4886]: E0314 08:54:41.348239 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef9839b3-b43d-43d7-9441-8eb4ee763140" containerName="dnsmasq-dns" Mar 14 08:54:41 crc kubenswrapper[4886]: I0314 08:54:41.348291 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef9839b3-b43d-43d7-9441-8eb4ee763140" containerName="dnsmasq-dns" Mar 14 08:54:41 crc kubenswrapper[4886]: I0314 08:54:41.348508 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec356daa-f053-45d9-8297-1df7fa8621a0" containerName="dnsmasq-dns" Mar 14 08:54:41 crc kubenswrapper[4886]: 
I0314 08:54:41.350757 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef9839b3-b43d-43d7-9441-8eb4ee763140" containerName="dnsmasq-dns" Mar 14 08:54:41 crc kubenswrapper[4886]: I0314 08:54:41.351528 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz" Mar 14 08:54:41 crc kubenswrapper[4886]: I0314 08:54:41.353262 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 08:54:41 crc kubenswrapper[4886]: I0314 08:54:41.357765 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 08:54:41 crc kubenswrapper[4886]: I0314 08:54:41.357835 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftkvj" Mar 14 08:54:41 crc kubenswrapper[4886]: I0314 08:54:41.367926 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 08:54:41 crc kubenswrapper[4886]: I0314 08:54:41.373856 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz"] Mar 14 08:54:41 crc kubenswrapper[4886]: I0314 08:54:41.527384 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a9632d51-2405-4118-a547-fc6a0e6e5c42-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz\" (UID: \"a9632d51-2405-4118-a547-fc6a0e6e5c42\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz" Mar 14 08:54:41 crc kubenswrapper[4886]: I0314 08:54:41.527568 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a9632d51-2405-4118-a547-fc6a0e6e5c42-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz\" (UID: \"a9632d51-2405-4118-a547-fc6a0e6e5c42\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz" Mar 14 08:54:41 crc kubenswrapper[4886]: I0314 08:54:41.527592 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9632d51-2405-4118-a547-fc6a0e6e5c42-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz\" (UID: \"a9632d51-2405-4118-a547-fc6a0e6e5c42\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz" Mar 14 08:54:41 crc kubenswrapper[4886]: I0314 08:54:41.527631 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbrbp\" (UniqueName: \"kubernetes.io/projected/a9632d51-2405-4118-a547-fc6a0e6e5c42-kube-api-access-rbrbp\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz\" (UID: \"a9632d51-2405-4118-a547-fc6a0e6e5c42\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz" Mar 14 08:54:41 crc kubenswrapper[4886]: I0314 08:54:41.631107 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbrbp\" (UniqueName: \"kubernetes.io/projected/a9632d51-2405-4118-a547-fc6a0e6e5c42-kube-api-access-rbrbp\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz\" (UID: \"a9632d51-2405-4118-a547-fc6a0e6e5c42\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz" Mar 14 08:54:41 crc kubenswrapper[4886]: I0314 08:54:41.631228 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a9632d51-2405-4118-a547-fc6a0e6e5c42-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz\" (UID: 
\"a9632d51-2405-4118-a547-fc6a0e6e5c42\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz" Mar 14 08:54:41 crc kubenswrapper[4886]: I0314 08:54:41.631330 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9632d51-2405-4118-a547-fc6a0e6e5c42-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz\" (UID: \"a9632d51-2405-4118-a547-fc6a0e6e5c42\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz" Mar 14 08:54:41 crc kubenswrapper[4886]: I0314 08:54:41.631362 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9632d51-2405-4118-a547-fc6a0e6e5c42-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz\" (UID: \"a9632d51-2405-4118-a547-fc6a0e6e5c42\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz" Mar 14 08:54:41 crc kubenswrapper[4886]: I0314 08:54:41.637355 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9632d51-2405-4118-a547-fc6a0e6e5c42-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz\" (UID: \"a9632d51-2405-4118-a547-fc6a0e6e5c42\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz" Mar 14 08:54:41 crc kubenswrapper[4886]: I0314 08:54:41.638597 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9632d51-2405-4118-a547-fc6a0e6e5c42-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz\" (UID: \"a9632d51-2405-4118-a547-fc6a0e6e5c42\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz" Mar 14 08:54:41 crc kubenswrapper[4886]: I0314 08:54:41.640104 4886 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a9632d51-2405-4118-a547-fc6a0e6e5c42-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz\" (UID: \"a9632d51-2405-4118-a547-fc6a0e6e5c42\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz" Mar 14 08:54:41 crc kubenswrapper[4886]: I0314 08:54:41.657450 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbrbp\" (UniqueName: \"kubernetes.io/projected/a9632d51-2405-4118-a547-fc6a0e6e5c42-kube-api-access-rbrbp\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz\" (UID: \"a9632d51-2405-4118-a547-fc6a0e6e5c42\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz" Mar 14 08:54:41 crc kubenswrapper[4886]: I0314 08:54:41.669252 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz" Mar 14 08:54:42 crc kubenswrapper[4886]: I0314 08:54:42.224436 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz"] Mar 14 08:54:43 crc kubenswrapper[4886]: I0314 08:54:43.240429 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz" event={"ID":"a9632d51-2405-4118-a547-fc6a0e6e5c42","Type":"ContainerStarted","Data":"661d05c92562e17084375c508cf0cb135ee669cfb07f45f04712b2dc96441e50"} Mar 14 08:54:47 crc kubenswrapper[4886]: I0314 08:54:47.421453 4886 scope.go:117] "RemoveContainer" containerID="7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc" Mar 14 08:54:47 crc kubenswrapper[4886]: E0314 08:54:47.422168 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 08:54:48 crc kubenswrapper[4886]: I0314 08:54:48.056295 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 14 08:54:49 crc kubenswrapper[4886]: I0314 08:54:49.090050 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 14 08:54:51 crc kubenswrapper[4886]: I0314 08:54:51.751354 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 08:54:52 crc kubenswrapper[4886]: I0314 08:54:52.331720 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz" event={"ID":"a9632d51-2405-4118-a547-fc6a0e6e5c42","Type":"ContainerStarted","Data":"163614a5306d6ab5c0569e211d1ef9f6cd9618070883a26d480217a0cbf645d4"} Mar 14 08:54:52 crc kubenswrapper[4886]: I0314 08:54:52.350934 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz" podStartSLOduration=1.8304564129999998 podStartE2EDuration="11.350916432s" podCreationTimestamp="2026-03-14 08:54:41 +0000 UTC" firstStartedPulling="2026-03-14 08:54:42.228594335 +0000 UTC m=+1617.477045982" lastFinishedPulling="2026-03-14 08:54:51.749054364 +0000 UTC m=+1626.997506001" observedRunningTime="2026-03-14 08:54:52.348406921 +0000 UTC m=+1627.596858578" watchObservedRunningTime="2026-03-14 08:54:52.350916432 +0000 UTC m=+1627.599368089" Mar 14 08:54:58 crc kubenswrapper[4886]: I0314 08:54:58.421037 4886 scope.go:117] "RemoveContainer" containerID="7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc" Mar 14 08:54:58 crc kubenswrapper[4886]: E0314 08:54:58.421778 4886 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 08:55:04 crc kubenswrapper[4886]: I0314 08:55:04.679028 4886 generic.go:334] "Generic (PLEG): container finished" podID="a9632d51-2405-4118-a547-fc6a0e6e5c42" containerID="163614a5306d6ab5c0569e211d1ef9f6cd9618070883a26d480217a0cbf645d4" exitCode=0 Mar 14 08:55:04 crc kubenswrapper[4886]: I0314 08:55:04.679291 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz" event={"ID":"a9632d51-2405-4118-a547-fc6a0e6e5c42","Type":"ContainerDied","Data":"163614a5306d6ab5c0569e211d1ef9f6cd9618070883a26d480217a0cbf645d4"} Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.125384 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz" Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.265938 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9632d51-2405-4118-a547-fc6a0e6e5c42-inventory\") pod \"a9632d51-2405-4118-a547-fc6a0e6e5c42\" (UID: \"a9632d51-2405-4118-a547-fc6a0e6e5c42\") " Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.266376 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbrbp\" (UniqueName: \"kubernetes.io/projected/a9632d51-2405-4118-a547-fc6a0e6e5c42-kube-api-access-rbrbp\") pod \"a9632d51-2405-4118-a547-fc6a0e6e5c42\" (UID: \"a9632d51-2405-4118-a547-fc6a0e6e5c42\") " Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.266463 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9632d51-2405-4118-a547-fc6a0e6e5c42-repo-setup-combined-ca-bundle\") pod \"a9632d51-2405-4118-a547-fc6a0e6e5c42\" (UID: \"a9632d51-2405-4118-a547-fc6a0e6e5c42\") " Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.266509 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a9632d51-2405-4118-a547-fc6a0e6e5c42-ssh-key-openstack-edpm-ipam\") pod \"a9632d51-2405-4118-a547-fc6a0e6e5c42\" (UID: \"a9632d51-2405-4118-a547-fc6a0e6e5c42\") " Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.272342 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9632d51-2405-4118-a547-fc6a0e6e5c42-kube-api-access-rbrbp" (OuterVolumeSpecName: "kube-api-access-rbrbp") pod "a9632d51-2405-4118-a547-fc6a0e6e5c42" (UID: "a9632d51-2405-4118-a547-fc6a0e6e5c42"). InnerVolumeSpecName "kube-api-access-rbrbp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.275025 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9632d51-2405-4118-a547-fc6a0e6e5c42-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "a9632d51-2405-4118-a547-fc6a0e6e5c42" (UID: "a9632d51-2405-4118-a547-fc6a0e6e5c42"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.294981 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9632d51-2405-4118-a547-fc6a0e6e5c42-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a9632d51-2405-4118-a547-fc6a0e6e5c42" (UID: "a9632d51-2405-4118-a547-fc6a0e6e5c42"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.315924 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9632d51-2405-4118-a547-fc6a0e6e5c42-inventory" (OuterVolumeSpecName: "inventory") pod "a9632d51-2405-4118-a547-fc6a0e6e5c42" (UID: "a9632d51-2405-4118-a547-fc6a0e6e5c42"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.369038 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbrbp\" (UniqueName: \"kubernetes.io/projected/a9632d51-2405-4118-a547-fc6a0e6e5c42-kube-api-access-rbrbp\") on node \"crc\" DevicePath \"\"" Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.369089 4886 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9632d51-2405-4118-a547-fc6a0e6e5c42-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.369105 4886 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a9632d51-2405-4118-a547-fc6a0e6e5c42-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.369136 4886 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9632d51-2405-4118-a547-fc6a0e6e5c42-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.700252 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz" event={"ID":"a9632d51-2405-4118-a547-fc6a0e6e5c42","Type":"ContainerDied","Data":"661d05c92562e17084375c508cf0cb135ee669cfb07f45f04712b2dc96441e50"} Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.700299 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="661d05c92562e17084375c508cf0cb135ee669cfb07f45f04712b2dc96441e50" Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.700342 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz" Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.820714 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-rwpq4"] Mar 14 08:55:06 crc kubenswrapper[4886]: E0314 08:55:06.821138 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9632d51-2405-4118-a547-fc6a0e6e5c42" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.821156 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9632d51-2405-4118-a547-fc6a0e6e5c42" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.821384 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9632d51-2405-4118-a547-fc6a0e6e5c42" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.822000 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rwpq4" Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.825363 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.825435 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftkvj" Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.828283 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.830097 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.840523 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-rwpq4"] Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.883204 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2rvw\" (UniqueName: \"kubernetes.io/projected/baedaad9-0945-4c50-9ca1-aa71c90e3298-kube-api-access-w2rvw\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-rwpq4\" (UID: \"baedaad9-0945-4c50-9ca1-aa71c90e3298\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rwpq4" Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.883291 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/baedaad9-0945-4c50-9ca1-aa71c90e3298-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-rwpq4\" (UID: \"baedaad9-0945-4c50-9ca1-aa71c90e3298\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rwpq4" Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.883483 4886 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/baedaad9-0945-4c50-9ca1-aa71c90e3298-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-rwpq4\" (UID: \"baedaad9-0945-4c50-9ca1-aa71c90e3298\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rwpq4" Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.985262 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2rvw\" (UniqueName: \"kubernetes.io/projected/baedaad9-0945-4c50-9ca1-aa71c90e3298-kube-api-access-w2rvw\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-rwpq4\" (UID: \"baedaad9-0945-4c50-9ca1-aa71c90e3298\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rwpq4" Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.985359 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/baedaad9-0945-4c50-9ca1-aa71c90e3298-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-rwpq4\" (UID: \"baedaad9-0945-4c50-9ca1-aa71c90e3298\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rwpq4" Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.985407 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/baedaad9-0945-4c50-9ca1-aa71c90e3298-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-rwpq4\" (UID: \"baedaad9-0945-4c50-9ca1-aa71c90e3298\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rwpq4" Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.988888 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/baedaad9-0945-4c50-9ca1-aa71c90e3298-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-rwpq4\" (UID: \"baedaad9-0945-4c50-9ca1-aa71c90e3298\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rwpq4" Mar 14 08:55:06 crc kubenswrapper[4886]: I0314 08:55:06.992787 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/baedaad9-0945-4c50-9ca1-aa71c90e3298-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-rwpq4\" (UID: \"baedaad9-0945-4c50-9ca1-aa71c90e3298\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rwpq4" Mar 14 08:55:07 crc kubenswrapper[4886]: I0314 08:55:07.008030 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2rvw\" (UniqueName: \"kubernetes.io/projected/baedaad9-0945-4c50-9ca1-aa71c90e3298-kube-api-access-w2rvw\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-rwpq4\" (UID: \"baedaad9-0945-4c50-9ca1-aa71c90e3298\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rwpq4" Mar 14 08:55:07 crc kubenswrapper[4886]: I0314 08:55:07.136248 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rwpq4" Mar 14 08:55:07 crc kubenswrapper[4886]: I0314 08:55:07.824437 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-rwpq4"] Mar 14 08:55:08 crc kubenswrapper[4886]: I0314 08:55:08.721073 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rwpq4" event={"ID":"baedaad9-0945-4c50-9ca1-aa71c90e3298","Type":"ContainerStarted","Data":"1285ae30f26a6c8e61caa19d9768454f4fb09408f2f9e2f8c89b3d20629bd22c"} Mar 14 08:55:08 crc kubenswrapper[4886]: I0314 08:55:08.721414 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rwpq4" event={"ID":"baedaad9-0945-4c50-9ca1-aa71c90e3298","Type":"ContainerStarted","Data":"ba48525014cb7aa86ed3ffb5e5993725a7f65d5d09115f2a7c9a813998236bb2"} Mar 14 08:55:08 crc kubenswrapper[4886]: I0314 08:55:08.742417 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rwpq4" podStartSLOduration=2.323381641 podStartE2EDuration="2.742397637s" podCreationTimestamp="2026-03-14 08:55:06 +0000 UTC" firstStartedPulling="2026-03-14 08:55:07.821979217 +0000 UTC m=+1643.070430874" lastFinishedPulling="2026-03-14 08:55:08.240995233 +0000 UTC m=+1643.489446870" observedRunningTime="2026-03-14 08:55:08.733536934 +0000 UTC m=+1643.981988581" watchObservedRunningTime="2026-03-14 08:55:08.742397637 +0000 UTC m=+1643.990849284" Mar 14 08:55:10 crc kubenswrapper[4886]: I0314 08:55:10.289690 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tvmrk"] Mar 14 08:55:10 crc kubenswrapper[4886]: I0314 08:55:10.291728 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tvmrk" Mar 14 08:55:10 crc kubenswrapper[4886]: I0314 08:55:10.308901 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tvmrk"] Mar 14 08:55:10 crc kubenswrapper[4886]: I0314 08:55:10.376155 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63a9079-d287-4e90-927b-388166d9f766-utilities\") pod \"redhat-marketplace-tvmrk\" (UID: \"f63a9079-d287-4e90-927b-388166d9f766\") " pod="openshift-marketplace/redhat-marketplace-tvmrk" Mar 14 08:55:10 crc kubenswrapper[4886]: I0314 08:55:10.376280 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6wcx\" (UniqueName: \"kubernetes.io/projected/f63a9079-d287-4e90-927b-388166d9f766-kube-api-access-d6wcx\") pod \"redhat-marketplace-tvmrk\" (UID: \"f63a9079-d287-4e90-927b-388166d9f766\") " pod="openshift-marketplace/redhat-marketplace-tvmrk" Mar 14 08:55:10 crc kubenswrapper[4886]: I0314 08:55:10.376318 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63a9079-d287-4e90-927b-388166d9f766-catalog-content\") pod \"redhat-marketplace-tvmrk\" (UID: \"f63a9079-d287-4e90-927b-388166d9f766\") " pod="openshift-marketplace/redhat-marketplace-tvmrk" Mar 14 08:55:10 crc kubenswrapper[4886]: I0314 08:55:10.477768 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63a9079-d287-4e90-927b-388166d9f766-utilities\") pod \"redhat-marketplace-tvmrk\" (UID: \"f63a9079-d287-4e90-927b-388166d9f766\") " pod="openshift-marketplace/redhat-marketplace-tvmrk" Mar 14 08:55:10 crc kubenswrapper[4886]: I0314 08:55:10.477849 4886 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-d6wcx\" (UniqueName: \"kubernetes.io/projected/f63a9079-d287-4e90-927b-388166d9f766-kube-api-access-d6wcx\") pod \"redhat-marketplace-tvmrk\" (UID: \"f63a9079-d287-4e90-927b-388166d9f766\") " pod="openshift-marketplace/redhat-marketplace-tvmrk" Mar 14 08:55:10 crc kubenswrapper[4886]: I0314 08:55:10.477870 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63a9079-d287-4e90-927b-388166d9f766-catalog-content\") pod \"redhat-marketplace-tvmrk\" (UID: \"f63a9079-d287-4e90-927b-388166d9f766\") " pod="openshift-marketplace/redhat-marketplace-tvmrk" Mar 14 08:55:10 crc kubenswrapper[4886]: I0314 08:55:10.478299 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63a9079-d287-4e90-927b-388166d9f766-utilities\") pod \"redhat-marketplace-tvmrk\" (UID: \"f63a9079-d287-4e90-927b-388166d9f766\") " pod="openshift-marketplace/redhat-marketplace-tvmrk" Mar 14 08:55:10 crc kubenswrapper[4886]: I0314 08:55:10.478345 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63a9079-d287-4e90-927b-388166d9f766-catalog-content\") pod \"redhat-marketplace-tvmrk\" (UID: \"f63a9079-d287-4e90-927b-388166d9f766\") " pod="openshift-marketplace/redhat-marketplace-tvmrk" Mar 14 08:55:10 crc kubenswrapper[4886]: I0314 08:55:10.506037 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6wcx\" (UniqueName: \"kubernetes.io/projected/f63a9079-d287-4e90-927b-388166d9f766-kube-api-access-d6wcx\") pod \"redhat-marketplace-tvmrk\" (UID: \"f63a9079-d287-4e90-927b-388166d9f766\") " pod="openshift-marketplace/redhat-marketplace-tvmrk" Mar 14 08:55:10 crc kubenswrapper[4886]: I0314 08:55:10.611738 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tvmrk" Mar 14 08:55:11 crc kubenswrapper[4886]: I0314 08:55:11.078859 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tvmrk"] Mar 14 08:55:11 crc kubenswrapper[4886]: W0314 08:55:11.087716 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf63a9079_d287_4e90_927b_388166d9f766.slice/crio-8817fed741f8f06e1e1c1bb9d8a1eb4bee1ce186edd54baf3221f6532d645c1d WatchSource:0}: Error finding container 8817fed741f8f06e1e1c1bb9d8a1eb4bee1ce186edd54baf3221f6532d645c1d: Status 404 returned error can't find the container with id 8817fed741f8f06e1e1c1bb9d8a1eb4bee1ce186edd54baf3221f6532d645c1d Mar 14 08:55:11 crc kubenswrapper[4886]: I0314 08:55:11.420702 4886 scope.go:117] "RemoveContainer" containerID="7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc" Mar 14 08:55:11 crc kubenswrapper[4886]: E0314 08:55:11.422402 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 08:55:11 crc kubenswrapper[4886]: I0314 08:55:11.765979 4886 generic.go:334] "Generic (PLEG): container finished" podID="f63a9079-d287-4e90-927b-388166d9f766" containerID="4d29f30aac4bc0b76a3ad705c2abaf6fb8c2bd8db24e698cb63bc940ae6b2369" exitCode=0 Mar 14 08:55:11 crc kubenswrapper[4886]: I0314 08:55:11.766062 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tvmrk" 
event={"ID":"f63a9079-d287-4e90-927b-388166d9f766","Type":"ContainerDied","Data":"4d29f30aac4bc0b76a3ad705c2abaf6fb8c2bd8db24e698cb63bc940ae6b2369"} Mar 14 08:55:11 crc kubenswrapper[4886]: I0314 08:55:11.766094 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tvmrk" event={"ID":"f63a9079-d287-4e90-927b-388166d9f766","Type":"ContainerStarted","Data":"8817fed741f8f06e1e1c1bb9d8a1eb4bee1ce186edd54baf3221f6532d645c1d"} Mar 14 08:55:11 crc kubenswrapper[4886]: I0314 08:55:11.767908 4886 generic.go:334] "Generic (PLEG): container finished" podID="baedaad9-0945-4c50-9ca1-aa71c90e3298" containerID="1285ae30f26a6c8e61caa19d9768454f4fb09408f2f9e2f8c89b3d20629bd22c" exitCode=0 Mar 14 08:55:11 crc kubenswrapper[4886]: I0314 08:55:11.767934 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rwpq4" event={"ID":"baedaad9-0945-4c50-9ca1-aa71c90e3298","Type":"ContainerDied","Data":"1285ae30f26a6c8e61caa19d9768454f4fb09408f2f9e2f8c89b3d20629bd22c"} Mar 14 08:55:12 crc kubenswrapper[4886]: I0314 08:55:12.781919 4886 generic.go:334] "Generic (PLEG): container finished" podID="f63a9079-d287-4e90-927b-388166d9f766" containerID="f026db5a9af2cb560358264bc89615ae7151a5bb2e4c9c62e7781aee3c0defc1" exitCode=0 Mar 14 08:55:12 crc kubenswrapper[4886]: I0314 08:55:12.781965 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tvmrk" event={"ID":"f63a9079-d287-4e90-927b-388166d9f766","Type":"ContainerDied","Data":"f026db5a9af2cb560358264bc89615ae7151a5bb2e4c9c62e7781aee3c0defc1"} Mar 14 08:55:13 crc kubenswrapper[4886]: I0314 08:55:13.177603 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rwpq4" Mar 14 08:55:13 crc kubenswrapper[4886]: I0314 08:55:13.233029 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/baedaad9-0945-4c50-9ca1-aa71c90e3298-inventory\") pod \"baedaad9-0945-4c50-9ca1-aa71c90e3298\" (UID: \"baedaad9-0945-4c50-9ca1-aa71c90e3298\") " Mar 14 08:55:13 crc kubenswrapper[4886]: I0314 08:55:13.233245 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/baedaad9-0945-4c50-9ca1-aa71c90e3298-ssh-key-openstack-edpm-ipam\") pod \"baedaad9-0945-4c50-9ca1-aa71c90e3298\" (UID: \"baedaad9-0945-4c50-9ca1-aa71c90e3298\") " Mar 14 08:55:13 crc kubenswrapper[4886]: I0314 08:55:13.233340 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2rvw\" (UniqueName: \"kubernetes.io/projected/baedaad9-0945-4c50-9ca1-aa71c90e3298-kube-api-access-w2rvw\") pod \"baedaad9-0945-4c50-9ca1-aa71c90e3298\" (UID: \"baedaad9-0945-4c50-9ca1-aa71c90e3298\") " Mar 14 08:55:13 crc kubenswrapper[4886]: I0314 08:55:13.238535 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baedaad9-0945-4c50-9ca1-aa71c90e3298-kube-api-access-w2rvw" (OuterVolumeSpecName: "kube-api-access-w2rvw") pod "baedaad9-0945-4c50-9ca1-aa71c90e3298" (UID: "baedaad9-0945-4c50-9ca1-aa71c90e3298"). InnerVolumeSpecName "kube-api-access-w2rvw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:55:13 crc kubenswrapper[4886]: I0314 08:55:13.266789 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baedaad9-0945-4c50-9ca1-aa71c90e3298-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "baedaad9-0945-4c50-9ca1-aa71c90e3298" (UID: "baedaad9-0945-4c50-9ca1-aa71c90e3298"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:55:13 crc kubenswrapper[4886]: I0314 08:55:13.283950 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baedaad9-0945-4c50-9ca1-aa71c90e3298-inventory" (OuterVolumeSpecName: "inventory") pod "baedaad9-0945-4c50-9ca1-aa71c90e3298" (UID: "baedaad9-0945-4c50-9ca1-aa71c90e3298"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:55:13 crc kubenswrapper[4886]: I0314 08:55:13.335795 4886 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/baedaad9-0945-4c50-9ca1-aa71c90e3298-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 08:55:13 crc kubenswrapper[4886]: I0314 08:55:13.335826 4886 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/baedaad9-0945-4c50-9ca1-aa71c90e3298-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 08:55:13 crc kubenswrapper[4886]: I0314 08:55:13.335837 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2rvw\" (UniqueName: \"kubernetes.io/projected/baedaad9-0945-4c50-9ca1-aa71c90e3298-kube-api-access-w2rvw\") on node \"crc\" DevicePath \"\"" Mar 14 08:55:13 crc kubenswrapper[4886]: I0314 08:55:13.811704 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rwpq4" Mar 14 08:55:13 crc kubenswrapper[4886]: I0314 08:55:13.812448 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-rwpq4" event={"ID":"baedaad9-0945-4c50-9ca1-aa71c90e3298","Type":"ContainerDied","Data":"ba48525014cb7aa86ed3ffb5e5993725a7f65d5d09115f2a7c9a813998236bb2"} Mar 14 08:55:13 crc kubenswrapper[4886]: I0314 08:55:13.812490 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba48525014cb7aa86ed3ffb5e5993725a7f65d5d09115f2a7c9a813998236bb2" Mar 14 08:55:13 crc kubenswrapper[4886]: I0314 08:55:13.820005 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tvmrk" event={"ID":"f63a9079-d287-4e90-927b-388166d9f766","Type":"ContainerStarted","Data":"790d6a2bd4f2b0e7a4b6b502de096ddf22d887a0f76e9efc91afbebf78a118fd"} Mar 14 08:55:13 crc kubenswrapper[4886]: I0314 08:55:13.857844 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tvmrk" podStartSLOduration=2.276016968 podStartE2EDuration="3.857824553s" podCreationTimestamp="2026-03-14 08:55:10 +0000 UTC" firstStartedPulling="2026-03-14 08:55:11.767628814 +0000 UTC m=+1647.016080451" lastFinishedPulling="2026-03-14 08:55:13.349436409 +0000 UTC m=+1648.597888036" observedRunningTime="2026-03-14 08:55:13.845376148 +0000 UTC m=+1649.093827785" watchObservedRunningTime="2026-03-14 08:55:13.857824553 +0000 UTC m=+1649.106276190" Mar 14 08:55:13 crc kubenswrapper[4886]: I0314 08:55:13.889402 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9"] Mar 14 08:55:13 crc kubenswrapper[4886]: E0314 08:55:13.889909 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baedaad9-0945-4c50-9ca1-aa71c90e3298" 
containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 14 08:55:13 crc kubenswrapper[4886]: I0314 08:55:13.889934 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="baedaad9-0945-4c50-9ca1-aa71c90e3298" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 14 08:55:13 crc kubenswrapper[4886]: I0314 08:55:13.890253 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="baedaad9-0945-4c50-9ca1-aa71c90e3298" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 14 08:55:13 crc kubenswrapper[4886]: I0314 08:55:13.891006 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9" Mar 14 08:55:13 crc kubenswrapper[4886]: I0314 08:55:13.893362 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 08:55:13 crc kubenswrapper[4886]: I0314 08:55:13.893574 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftkvj" Mar 14 08:55:13 crc kubenswrapper[4886]: I0314 08:55:13.893848 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 08:55:13 crc kubenswrapper[4886]: I0314 08:55:13.893949 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 08:55:13 crc kubenswrapper[4886]: I0314 08:55:13.900551 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9"] Mar 14 08:55:13 crc kubenswrapper[4886]: I0314 08:55:13.947194 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbd0a941-8eab-4742-9002-b42381f0d326-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9\" (UID: 
\"bbd0a941-8eab-4742-9002-b42381f0d326\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9" Mar 14 08:55:13 crc kubenswrapper[4886]: I0314 08:55:13.947291 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd0a941-8eab-4742-9002-b42381f0d326-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9\" (UID: \"bbd0a941-8eab-4742-9002-b42381f0d326\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9" Mar 14 08:55:13 crc kubenswrapper[4886]: I0314 08:55:13.947711 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrzrh\" (UniqueName: \"kubernetes.io/projected/bbd0a941-8eab-4742-9002-b42381f0d326-kube-api-access-wrzrh\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9\" (UID: \"bbd0a941-8eab-4742-9002-b42381f0d326\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9" Mar 14 08:55:13 crc kubenswrapper[4886]: I0314 08:55:13.947860 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbd0a941-8eab-4742-9002-b42381f0d326-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9\" (UID: \"bbd0a941-8eab-4742-9002-b42381f0d326\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9" Mar 14 08:55:14 crc kubenswrapper[4886]: I0314 08:55:14.049450 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd0a941-8eab-4742-9002-b42381f0d326-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9\" (UID: \"bbd0a941-8eab-4742-9002-b42381f0d326\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9" Mar 14 08:55:14 crc 
kubenswrapper[4886]: I0314 08:55:14.049575 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrzrh\" (UniqueName: \"kubernetes.io/projected/bbd0a941-8eab-4742-9002-b42381f0d326-kube-api-access-wrzrh\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9\" (UID: \"bbd0a941-8eab-4742-9002-b42381f0d326\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9" Mar 14 08:55:14 crc kubenswrapper[4886]: I0314 08:55:14.049616 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbd0a941-8eab-4742-9002-b42381f0d326-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9\" (UID: \"bbd0a941-8eab-4742-9002-b42381f0d326\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9" Mar 14 08:55:14 crc kubenswrapper[4886]: I0314 08:55:14.049666 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbd0a941-8eab-4742-9002-b42381f0d326-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9\" (UID: \"bbd0a941-8eab-4742-9002-b42381f0d326\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9" Mar 14 08:55:14 crc kubenswrapper[4886]: I0314 08:55:14.056946 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd0a941-8eab-4742-9002-b42381f0d326-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9\" (UID: \"bbd0a941-8eab-4742-9002-b42381f0d326\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9" Mar 14 08:55:14 crc kubenswrapper[4886]: I0314 08:55:14.058713 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbd0a941-8eab-4742-9002-b42381f0d326-inventory\") 
pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9\" (UID: \"bbd0a941-8eab-4742-9002-b42381f0d326\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9" Mar 14 08:55:14 crc kubenswrapper[4886]: I0314 08:55:14.059945 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbd0a941-8eab-4742-9002-b42381f0d326-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9\" (UID: \"bbd0a941-8eab-4742-9002-b42381f0d326\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9" Mar 14 08:55:14 crc kubenswrapper[4886]: I0314 08:55:14.069173 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrzrh\" (UniqueName: \"kubernetes.io/projected/bbd0a941-8eab-4742-9002-b42381f0d326-kube-api-access-wrzrh\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9\" (UID: \"bbd0a941-8eab-4742-9002-b42381f0d326\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9" Mar 14 08:55:14 crc kubenswrapper[4886]: I0314 08:55:14.224598 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9" Mar 14 08:55:14 crc kubenswrapper[4886]: I0314 08:55:14.782872 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9"] Mar 14 08:55:14 crc kubenswrapper[4886]: I0314 08:55:14.844674 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9" event={"ID":"bbd0a941-8eab-4742-9002-b42381f0d326","Type":"ContainerStarted","Data":"52d98d3a40aba6d3f69246c24370d050534b5e02f40968c40d5ded941ecc8eb3"} Mar 14 08:55:15 crc kubenswrapper[4886]: I0314 08:55:15.855639 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9" event={"ID":"bbd0a941-8eab-4742-9002-b42381f0d326","Type":"ContainerStarted","Data":"ff8101a27028647d1b7c984002ee34141e266d66c5e2cb11c5a5a73eaf21a674"} Mar 14 08:55:15 crc kubenswrapper[4886]: I0314 08:55:15.880668 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9" podStartSLOduration=2.442005915 podStartE2EDuration="2.880650721s" podCreationTimestamp="2026-03-14 08:55:13 +0000 UTC" firstStartedPulling="2026-03-14 08:55:14.79884031 +0000 UTC m=+1650.047291947" lastFinishedPulling="2026-03-14 08:55:15.237485116 +0000 UTC m=+1650.485936753" observedRunningTime="2026-03-14 08:55:15.872080977 +0000 UTC m=+1651.120532624" watchObservedRunningTime="2026-03-14 08:55:15.880650721 +0000 UTC m=+1651.129102358" Mar 14 08:55:20 crc kubenswrapper[4886]: I0314 08:55:20.612711 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tvmrk" Mar 14 08:55:20 crc kubenswrapper[4886]: I0314 08:55:20.613322 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tvmrk" Mar 14 08:55:20 
crc kubenswrapper[4886]: I0314 08:55:20.663410 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tvmrk" Mar 14 08:55:20 crc kubenswrapper[4886]: I0314 08:55:20.953173 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tvmrk" Mar 14 08:55:21 crc kubenswrapper[4886]: I0314 08:55:21.000199 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tvmrk"] Mar 14 08:55:22 crc kubenswrapper[4886]: I0314 08:55:22.420296 4886 scope.go:117] "RemoveContainer" containerID="7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc" Mar 14 08:55:22 crc kubenswrapper[4886]: E0314 08:55:22.420597 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 08:55:22 crc kubenswrapper[4886]: I0314 08:55:22.919837 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tvmrk" podUID="f63a9079-d287-4e90-927b-388166d9f766" containerName="registry-server" containerID="cri-o://790d6a2bd4f2b0e7a4b6b502de096ddf22d887a0f76e9efc91afbebf78a118fd" gracePeriod=2 Mar 14 08:55:23 crc kubenswrapper[4886]: I0314 08:55:23.926102 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tvmrk" Mar 14 08:55:23 crc kubenswrapper[4886]: I0314 08:55:23.934553 4886 generic.go:334] "Generic (PLEG): container finished" podID="f63a9079-d287-4e90-927b-388166d9f766" containerID="790d6a2bd4f2b0e7a4b6b502de096ddf22d887a0f76e9efc91afbebf78a118fd" exitCode=0 Mar 14 08:55:23 crc kubenswrapper[4886]: I0314 08:55:23.934603 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tvmrk" event={"ID":"f63a9079-d287-4e90-927b-388166d9f766","Type":"ContainerDied","Data":"790d6a2bd4f2b0e7a4b6b502de096ddf22d887a0f76e9efc91afbebf78a118fd"} Mar 14 08:55:23 crc kubenswrapper[4886]: I0314 08:55:23.934624 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tvmrk" Mar 14 08:55:23 crc kubenswrapper[4886]: I0314 08:55:23.934645 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tvmrk" event={"ID":"f63a9079-d287-4e90-927b-388166d9f766","Type":"ContainerDied","Data":"8817fed741f8f06e1e1c1bb9d8a1eb4bee1ce186edd54baf3221f6532d645c1d"} Mar 14 08:55:23 crc kubenswrapper[4886]: I0314 08:55:23.934668 4886 scope.go:117] "RemoveContainer" containerID="790d6a2bd4f2b0e7a4b6b502de096ddf22d887a0f76e9efc91afbebf78a118fd" Mar 14 08:55:23 crc kubenswrapper[4886]: I0314 08:55:23.965584 4886 scope.go:117] "RemoveContainer" containerID="f026db5a9af2cb560358264bc89615ae7151a5bb2e4c9c62e7781aee3c0defc1" Mar 14 08:55:23 crc kubenswrapper[4886]: I0314 08:55:23.987703 4886 scope.go:117] "RemoveContainer" containerID="4d29f30aac4bc0b76a3ad705c2abaf6fb8c2bd8db24e698cb63bc940ae6b2369" Mar 14 08:55:24 crc kubenswrapper[4886]: I0314 08:55:24.040721 4886 scope.go:117] "RemoveContainer" containerID="790d6a2bd4f2b0e7a4b6b502de096ddf22d887a0f76e9efc91afbebf78a118fd" Mar 14 08:55:24 crc kubenswrapper[4886]: E0314 08:55:24.041256 4886 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"790d6a2bd4f2b0e7a4b6b502de096ddf22d887a0f76e9efc91afbebf78a118fd\": container with ID starting with 790d6a2bd4f2b0e7a4b6b502de096ddf22d887a0f76e9efc91afbebf78a118fd not found: ID does not exist" containerID="790d6a2bd4f2b0e7a4b6b502de096ddf22d887a0f76e9efc91afbebf78a118fd" Mar 14 08:55:24 crc kubenswrapper[4886]: I0314 08:55:24.041400 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"790d6a2bd4f2b0e7a4b6b502de096ddf22d887a0f76e9efc91afbebf78a118fd"} err="failed to get container status \"790d6a2bd4f2b0e7a4b6b502de096ddf22d887a0f76e9efc91afbebf78a118fd\": rpc error: code = NotFound desc = could not find container \"790d6a2bd4f2b0e7a4b6b502de096ddf22d887a0f76e9efc91afbebf78a118fd\": container with ID starting with 790d6a2bd4f2b0e7a4b6b502de096ddf22d887a0f76e9efc91afbebf78a118fd not found: ID does not exist" Mar 14 08:55:24 crc kubenswrapper[4886]: I0314 08:55:24.041491 4886 scope.go:117] "RemoveContainer" containerID="f026db5a9af2cb560358264bc89615ae7151a5bb2e4c9c62e7781aee3c0defc1" Mar 14 08:55:24 crc kubenswrapper[4886]: E0314 08:55:24.042005 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f026db5a9af2cb560358264bc89615ae7151a5bb2e4c9c62e7781aee3c0defc1\": container with ID starting with f026db5a9af2cb560358264bc89615ae7151a5bb2e4c9c62e7781aee3c0defc1 not found: ID does not exist" containerID="f026db5a9af2cb560358264bc89615ae7151a5bb2e4c9c62e7781aee3c0defc1" Mar 14 08:55:24 crc kubenswrapper[4886]: I0314 08:55:24.042112 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f026db5a9af2cb560358264bc89615ae7151a5bb2e4c9c62e7781aee3c0defc1"} err="failed to get container status \"f026db5a9af2cb560358264bc89615ae7151a5bb2e4c9c62e7781aee3c0defc1\": rpc error: code = NotFound desc = could not find container 
\"f026db5a9af2cb560358264bc89615ae7151a5bb2e4c9c62e7781aee3c0defc1\": container with ID starting with f026db5a9af2cb560358264bc89615ae7151a5bb2e4c9c62e7781aee3c0defc1 not found: ID does not exist" Mar 14 08:55:24 crc kubenswrapper[4886]: I0314 08:55:24.042218 4886 scope.go:117] "RemoveContainer" containerID="4d29f30aac4bc0b76a3ad705c2abaf6fb8c2bd8db24e698cb63bc940ae6b2369" Mar 14 08:55:24 crc kubenswrapper[4886]: E0314 08:55:24.042638 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d29f30aac4bc0b76a3ad705c2abaf6fb8c2bd8db24e698cb63bc940ae6b2369\": container with ID starting with 4d29f30aac4bc0b76a3ad705c2abaf6fb8c2bd8db24e698cb63bc940ae6b2369 not found: ID does not exist" containerID="4d29f30aac4bc0b76a3ad705c2abaf6fb8c2bd8db24e698cb63bc940ae6b2369" Mar 14 08:55:24 crc kubenswrapper[4886]: I0314 08:55:24.042681 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d29f30aac4bc0b76a3ad705c2abaf6fb8c2bd8db24e698cb63bc940ae6b2369"} err="failed to get container status \"4d29f30aac4bc0b76a3ad705c2abaf6fb8c2bd8db24e698cb63bc940ae6b2369\": rpc error: code = NotFound desc = could not find container \"4d29f30aac4bc0b76a3ad705c2abaf6fb8c2bd8db24e698cb63bc940ae6b2369\": container with ID starting with 4d29f30aac4bc0b76a3ad705c2abaf6fb8c2bd8db24e698cb63bc940ae6b2369 not found: ID does not exist" Mar 14 08:55:24 crc kubenswrapper[4886]: I0314 08:55:24.050727 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6wcx\" (UniqueName: \"kubernetes.io/projected/f63a9079-d287-4e90-927b-388166d9f766-kube-api-access-d6wcx\") pod \"f63a9079-d287-4e90-927b-388166d9f766\" (UID: \"f63a9079-d287-4e90-927b-388166d9f766\") " Mar 14 08:55:24 crc kubenswrapper[4886]: I0314 08:55:24.050798 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f63a9079-d287-4e90-927b-388166d9f766-utilities\") pod \"f63a9079-d287-4e90-927b-388166d9f766\" (UID: \"f63a9079-d287-4e90-927b-388166d9f766\") " Mar 14 08:55:24 crc kubenswrapper[4886]: I0314 08:55:24.050876 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63a9079-d287-4e90-927b-388166d9f766-catalog-content\") pod \"f63a9079-d287-4e90-927b-388166d9f766\" (UID: \"f63a9079-d287-4e90-927b-388166d9f766\") " Mar 14 08:55:24 crc kubenswrapper[4886]: I0314 08:55:24.051796 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f63a9079-d287-4e90-927b-388166d9f766-utilities" (OuterVolumeSpecName: "utilities") pod "f63a9079-d287-4e90-927b-388166d9f766" (UID: "f63a9079-d287-4e90-927b-388166d9f766"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:55:24 crc kubenswrapper[4886]: I0314 08:55:24.057108 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f63a9079-d287-4e90-927b-388166d9f766-kube-api-access-d6wcx" (OuterVolumeSpecName: "kube-api-access-d6wcx") pod "f63a9079-d287-4e90-927b-388166d9f766" (UID: "f63a9079-d287-4e90-927b-388166d9f766"). InnerVolumeSpecName "kube-api-access-d6wcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:55:24 crc kubenswrapper[4886]: I0314 08:55:24.075586 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f63a9079-d287-4e90-927b-388166d9f766-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f63a9079-d287-4e90-927b-388166d9f766" (UID: "f63a9079-d287-4e90-927b-388166d9f766"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 08:55:24 crc kubenswrapper[4886]: I0314 08:55:24.153826 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6wcx\" (UniqueName: \"kubernetes.io/projected/f63a9079-d287-4e90-927b-388166d9f766-kube-api-access-d6wcx\") on node \"crc\" DevicePath \"\"" Mar 14 08:55:24 crc kubenswrapper[4886]: I0314 08:55:24.154029 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63a9079-d287-4e90-927b-388166d9f766-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 08:55:24 crc kubenswrapper[4886]: I0314 08:55:24.154088 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63a9079-d287-4e90-927b-388166d9f766-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 08:55:24 crc kubenswrapper[4886]: I0314 08:55:24.282437 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tvmrk"] Mar 14 08:55:24 crc kubenswrapper[4886]: I0314 08:55:24.296058 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tvmrk"] Mar 14 08:55:25 crc kubenswrapper[4886]: I0314 08:55:25.436982 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f63a9079-d287-4e90-927b-388166d9f766" path="/var/lib/kubelet/pods/f63a9079-d287-4e90-927b-388166d9f766/volumes" Mar 14 08:55:31 crc kubenswrapper[4886]: I0314 08:55:31.099870 4886 scope.go:117] "RemoveContainer" containerID="d7ff59454da068f6047f5940d664fe653fd467819905375506175e119ee424df" Mar 14 08:55:31 crc kubenswrapper[4886]: I0314 08:55:31.125998 4886 scope.go:117] "RemoveContainer" containerID="1b86dd8377d7bbc4a602b82fc6e615137da41ed100266616b558bb02b1da6146" Mar 14 08:55:31 crc kubenswrapper[4886]: I0314 08:55:31.189308 4886 scope.go:117] "RemoveContainer" containerID="c557d68565c355725dc833f4c34233122fe9b9c234cd250baae8028c09843185" 
Mar 14 08:55:31 crc kubenswrapper[4886]: I0314 08:55:31.228499 4886 scope.go:117] "RemoveContainer" containerID="5d4f33cc733c4c780428838b0c1aa93986d723691a5fb2a33eb68f42aa298bee"
Mar 14 08:55:31 crc kubenswrapper[4886]: I0314 08:55:31.329149 4886 scope.go:117] "RemoveContainer" containerID="00926642a717f74835da4263cf7f52ff53875af39363c75fb092c888a5b6727b"
Mar 14 08:55:31 crc kubenswrapper[4886]: I0314 08:55:31.367367 4886 scope.go:117] "RemoveContainer" containerID="5e8eda8bba47b1698063a394f01c4cc5e37841d35ab22fe71b1c947e4fc8e0f4"
Mar 14 08:55:31 crc kubenswrapper[4886]: I0314 08:55:31.412655 4886 scope.go:117] "RemoveContainer" containerID="e4e3d8542c0ff153c71f271b40046075bbe9c3157333be6695b54597867dafee"
Mar 14 08:55:31 crc kubenswrapper[4886]: I0314 08:55:31.478586 4886 scope.go:117] "RemoveContainer" containerID="dabf869885aa506944203f0c77528af1e2ee5750948366f2981c803617bca2e0"
Mar 14 08:55:37 crc kubenswrapper[4886]: I0314 08:55:37.420523 4886 scope.go:117] "RemoveContainer" containerID="7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc"
Mar 14 08:55:37 crc kubenswrapper[4886]: E0314 08:55:37.421033 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79"
Mar 14 08:55:49 crc kubenswrapper[4886]: I0314 08:55:49.422733 4886 scope.go:117] "RemoveContainer" containerID="7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc"
Mar 14 08:55:49 crc kubenswrapper[4886]: E0314 08:55:49.423833 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79"
Mar 14 08:56:00 crc kubenswrapper[4886]: I0314 08:56:00.159701 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557976-lxpv6"]
Mar 14 08:56:00 crc kubenswrapper[4886]: E0314 08:56:00.160983 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f63a9079-d287-4e90-927b-388166d9f766" containerName="extract-utilities"
Mar 14 08:56:00 crc kubenswrapper[4886]: I0314 08:56:00.160997 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63a9079-d287-4e90-927b-388166d9f766" containerName="extract-utilities"
Mar 14 08:56:00 crc kubenswrapper[4886]: E0314 08:56:00.161024 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f63a9079-d287-4e90-927b-388166d9f766" containerName="extract-content"
Mar 14 08:56:00 crc kubenswrapper[4886]: I0314 08:56:00.161030 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63a9079-d287-4e90-927b-388166d9f766" containerName="extract-content"
Mar 14 08:56:00 crc kubenswrapper[4886]: E0314 08:56:00.161089 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f63a9079-d287-4e90-927b-388166d9f766" containerName="registry-server"
Mar 14 08:56:00 crc kubenswrapper[4886]: I0314 08:56:00.161095 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63a9079-d287-4e90-927b-388166d9f766" containerName="registry-server"
Mar 14 08:56:00 crc kubenswrapper[4886]: I0314 08:56:00.161489 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f63a9079-d287-4e90-927b-388166d9f766" containerName="registry-server"
Mar 14 08:56:00 crc kubenswrapper[4886]: I0314 08:56:00.162399 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557976-lxpv6"
Mar 14 08:56:00 crc kubenswrapper[4886]: I0314 08:56:00.171832 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp"
Mar 14 08:56:00 crc kubenswrapper[4886]: I0314 08:56:00.172136 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 08:56:00 crc kubenswrapper[4886]: I0314 08:56:00.173357 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557976-lxpv6"]
Mar 14 08:56:00 crc kubenswrapper[4886]: I0314 08:56:00.191578 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 08:56:00 crc kubenswrapper[4886]: I0314 08:56:00.258825 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4w6l\" (UniqueName: \"kubernetes.io/projected/ea72f5a8-5ec3-4dfc-908e-a93f3cf6eae2-kube-api-access-r4w6l\") pod \"auto-csr-approver-29557976-lxpv6\" (UID: \"ea72f5a8-5ec3-4dfc-908e-a93f3cf6eae2\") " pod="openshift-infra/auto-csr-approver-29557976-lxpv6"
Mar 14 08:56:00 crc kubenswrapper[4886]: I0314 08:56:00.361994 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4w6l\" (UniqueName: \"kubernetes.io/projected/ea72f5a8-5ec3-4dfc-908e-a93f3cf6eae2-kube-api-access-r4w6l\") pod \"auto-csr-approver-29557976-lxpv6\" (UID: \"ea72f5a8-5ec3-4dfc-908e-a93f3cf6eae2\") " pod="openshift-infra/auto-csr-approver-29557976-lxpv6"
Mar 14 08:56:00 crc kubenswrapper[4886]: I0314 08:56:00.380322 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4w6l\" (UniqueName: \"kubernetes.io/projected/ea72f5a8-5ec3-4dfc-908e-a93f3cf6eae2-kube-api-access-r4w6l\") pod \"auto-csr-approver-29557976-lxpv6\" (UID: \"ea72f5a8-5ec3-4dfc-908e-a93f3cf6eae2\") " pod="openshift-infra/auto-csr-approver-29557976-lxpv6"
Mar 14 08:56:00 crc kubenswrapper[4886]: I0314 08:56:00.511980 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557976-lxpv6"
Mar 14 08:56:00 crc kubenswrapper[4886]: I0314 08:56:00.979022 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557976-lxpv6"]
Mar 14 08:56:01 crc kubenswrapper[4886]: I0314 08:56:01.420332 4886 scope.go:117] "RemoveContainer" containerID="7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc"
Mar 14 08:56:01 crc kubenswrapper[4886]: E0314 08:56:01.420724 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79"
Mar 14 08:56:01 crc kubenswrapper[4886]: I0314 08:56:01.768089 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557976-lxpv6" event={"ID":"ea72f5a8-5ec3-4dfc-908e-a93f3cf6eae2","Type":"ContainerStarted","Data":"f4ff11f5102139324ea17fef919b516b923c52d0fd5e89d83ef6e3a594210b15"}
Mar 14 08:56:03 crc kubenswrapper[4886]: I0314 08:56:03.786740 4886 generic.go:334] "Generic (PLEG): container finished" podID="ea72f5a8-5ec3-4dfc-908e-a93f3cf6eae2" containerID="f111a63cf28cd315c5dadb37d0e511707539687b98e00cdd9df82cf9bb223ea3" exitCode=0
Mar 14 08:56:03 crc kubenswrapper[4886]: I0314 08:56:03.787005 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557976-lxpv6" event={"ID":"ea72f5a8-5ec3-4dfc-908e-a93f3cf6eae2","Type":"ContainerDied","Data":"f111a63cf28cd315c5dadb37d0e511707539687b98e00cdd9df82cf9bb223ea3"}
Mar 14 08:56:05 crc kubenswrapper[4886]: I0314 08:56:05.206336 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557976-lxpv6"
Mar 14 08:56:05 crc kubenswrapper[4886]: I0314 08:56:05.362799 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4w6l\" (UniqueName: \"kubernetes.io/projected/ea72f5a8-5ec3-4dfc-908e-a93f3cf6eae2-kube-api-access-r4w6l\") pod \"ea72f5a8-5ec3-4dfc-908e-a93f3cf6eae2\" (UID: \"ea72f5a8-5ec3-4dfc-908e-a93f3cf6eae2\") "
Mar 14 08:56:05 crc kubenswrapper[4886]: I0314 08:56:05.370585 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea72f5a8-5ec3-4dfc-908e-a93f3cf6eae2-kube-api-access-r4w6l" (OuterVolumeSpecName: "kube-api-access-r4w6l") pod "ea72f5a8-5ec3-4dfc-908e-a93f3cf6eae2" (UID: "ea72f5a8-5ec3-4dfc-908e-a93f3cf6eae2"). InnerVolumeSpecName "kube-api-access-r4w6l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:56:05 crc kubenswrapper[4886]: I0314 08:56:05.465485 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4w6l\" (UniqueName: \"kubernetes.io/projected/ea72f5a8-5ec3-4dfc-908e-a93f3cf6eae2-kube-api-access-r4w6l\") on node \"crc\" DevicePath \"\""
Mar 14 08:56:05 crc kubenswrapper[4886]: I0314 08:56:05.808380 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557976-lxpv6" event={"ID":"ea72f5a8-5ec3-4dfc-908e-a93f3cf6eae2","Type":"ContainerDied","Data":"f4ff11f5102139324ea17fef919b516b923c52d0fd5e89d83ef6e3a594210b15"}
Mar 14 08:56:05 crc kubenswrapper[4886]: I0314 08:56:05.808900 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4ff11f5102139324ea17fef919b516b923c52d0fd5e89d83ef6e3a594210b15"
Mar 14 08:56:05 crc kubenswrapper[4886]: I0314 08:56:05.809034 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557976-lxpv6"
Mar 14 08:56:06 crc kubenswrapper[4886]: I0314 08:56:06.287042 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557970-k52mf"]
Mar 14 08:56:06 crc kubenswrapper[4886]: I0314 08:56:06.300781 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557970-k52mf"]
Mar 14 08:56:07 crc kubenswrapper[4886]: I0314 08:56:07.431182 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bba36a0-936a-4da3-a23b-79068d0c437c" path="/var/lib/kubelet/pods/8bba36a0-936a-4da3-a23b-79068d0c437c/volumes"
Mar 14 08:56:14 crc kubenswrapper[4886]: I0314 08:56:14.421177 4886 scope.go:117] "RemoveContainer" containerID="7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc"
Mar 14 08:56:14 crc kubenswrapper[4886]: E0314 08:56:14.421981 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79"
Mar 14 08:56:24 crc kubenswrapper[4886]: I0314 08:56:24.342729 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n9k8f"]
Mar 14 08:56:24 crc kubenswrapper[4886]: E0314 08:56:24.343828 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea72f5a8-5ec3-4dfc-908e-a93f3cf6eae2" containerName="oc"
Mar 14 08:56:24 crc kubenswrapper[4886]: I0314 08:56:24.343846 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea72f5a8-5ec3-4dfc-908e-a93f3cf6eae2" containerName="oc"
Mar 14 08:56:24 crc kubenswrapper[4886]: I0314 08:56:24.344049 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea72f5a8-5ec3-4dfc-908e-a93f3cf6eae2" containerName="oc"
Mar 14 08:56:24 crc kubenswrapper[4886]: I0314 08:56:24.345466 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n9k8f"
Mar 14 08:56:24 crc kubenswrapper[4886]: I0314 08:56:24.405721 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n9k8f"]
Mar 14 08:56:24 crc kubenswrapper[4886]: I0314 08:56:24.450033 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7e0334f-a6c4-478f-ae80-bc4cb79c1f08-utilities\") pod \"redhat-operators-n9k8f\" (UID: \"a7e0334f-a6c4-478f-ae80-bc4cb79c1f08\") " pod="openshift-marketplace/redhat-operators-n9k8f"
Mar 14 08:56:24 crc kubenswrapper[4886]: I0314 08:56:24.450093 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzzw8\" (UniqueName: \"kubernetes.io/projected/a7e0334f-a6c4-478f-ae80-bc4cb79c1f08-kube-api-access-dzzw8\") pod \"redhat-operators-n9k8f\" (UID: \"a7e0334f-a6c4-478f-ae80-bc4cb79c1f08\") " pod="openshift-marketplace/redhat-operators-n9k8f"
Mar 14 08:56:24 crc kubenswrapper[4886]: I0314 08:56:24.450293 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7e0334f-a6c4-478f-ae80-bc4cb79c1f08-catalog-content\") pod \"redhat-operators-n9k8f\" (UID: \"a7e0334f-a6c4-478f-ae80-bc4cb79c1f08\") " pod="openshift-marketplace/redhat-operators-n9k8f"
Mar 14 08:56:24 crc kubenswrapper[4886]: I0314 08:56:24.551663 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7e0334f-a6c4-478f-ae80-bc4cb79c1f08-catalog-content\") pod \"redhat-operators-n9k8f\" (UID: \"a7e0334f-a6c4-478f-ae80-bc4cb79c1f08\") " pod="openshift-marketplace/redhat-operators-n9k8f"
Mar 14 08:56:24 crc kubenswrapper[4886]: I0314 08:56:24.551813 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7e0334f-a6c4-478f-ae80-bc4cb79c1f08-utilities\") pod \"redhat-operators-n9k8f\" (UID: \"a7e0334f-a6c4-478f-ae80-bc4cb79c1f08\") " pod="openshift-marketplace/redhat-operators-n9k8f"
Mar 14 08:56:24 crc kubenswrapper[4886]: I0314 08:56:24.551843 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzzw8\" (UniqueName: \"kubernetes.io/projected/a7e0334f-a6c4-478f-ae80-bc4cb79c1f08-kube-api-access-dzzw8\") pod \"redhat-operators-n9k8f\" (UID: \"a7e0334f-a6c4-478f-ae80-bc4cb79c1f08\") " pod="openshift-marketplace/redhat-operators-n9k8f"
Mar 14 08:56:24 crc kubenswrapper[4886]: I0314 08:56:24.552195 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7e0334f-a6c4-478f-ae80-bc4cb79c1f08-catalog-content\") pod \"redhat-operators-n9k8f\" (UID: \"a7e0334f-a6c4-478f-ae80-bc4cb79c1f08\") " pod="openshift-marketplace/redhat-operators-n9k8f"
Mar 14 08:56:24 crc kubenswrapper[4886]: I0314 08:56:24.552264 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7e0334f-a6c4-478f-ae80-bc4cb79c1f08-utilities\") pod \"redhat-operators-n9k8f\" (UID: \"a7e0334f-a6c4-478f-ae80-bc4cb79c1f08\") " pod="openshift-marketplace/redhat-operators-n9k8f"
Mar 14 08:56:24 crc kubenswrapper[4886]: I0314 08:56:24.569977 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzzw8\" (UniqueName: \"kubernetes.io/projected/a7e0334f-a6c4-478f-ae80-bc4cb79c1f08-kube-api-access-dzzw8\") pod \"redhat-operators-n9k8f\" (UID: \"a7e0334f-a6c4-478f-ae80-bc4cb79c1f08\") " pod="openshift-marketplace/redhat-operators-n9k8f"
Mar 14 08:56:24 crc kubenswrapper[4886]: I0314 08:56:24.678953 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n9k8f"
Mar 14 08:56:25 crc kubenswrapper[4886]: I0314 08:56:25.160611 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n9k8f"]
Mar 14 08:56:26 crc kubenswrapper[4886]: I0314 08:56:26.035988 4886 generic.go:334] "Generic (PLEG): container finished" podID="a7e0334f-a6c4-478f-ae80-bc4cb79c1f08" containerID="a6072cb782c6c61d91e1ad77d63c4c5ba2a1e95dfcc5be4f2caf4157fefc3bad" exitCode=0
Mar 14 08:56:26 crc kubenswrapper[4886]: I0314 08:56:26.036031 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9k8f" event={"ID":"a7e0334f-a6c4-478f-ae80-bc4cb79c1f08","Type":"ContainerDied","Data":"a6072cb782c6c61d91e1ad77d63c4c5ba2a1e95dfcc5be4f2caf4157fefc3bad"}
Mar 14 08:56:26 crc kubenswrapper[4886]: I0314 08:56:26.036056 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9k8f" event={"ID":"a7e0334f-a6c4-478f-ae80-bc4cb79c1f08","Type":"ContainerStarted","Data":"d7b5e30c744db6be70b4efbc7960f256ac6b55a56bcbce336bfeb75bdef64733"}
Mar 14 08:56:27 crc kubenswrapper[4886]: I0314 08:56:27.047367 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9k8f" event={"ID":"a7e0334f-a6c4-478f-ae80-bc4cb79c1f08","Type":"ContainerStarted","Data":"a32af4e91e1ff4cc2a3f112731a5254fecef07a65dd8ffa938ad5edb29a9a271"}
Mar 14 08:56:29 crc kubenswrapper[4886]: I0314 08:56:29.421568 4886 scope.go:117] "RemoveContainer" containerID="7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc"
Mar 14 08:56:29 crc kubenswrapper[4886]: E0314 08:56:29.422226 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79"
Mar 14 08:56:30 crc kubenswrapper[4886]: I0314 08:56:30.076259 4886 generic.go:334] "Generic (PLEG): container finished" podID="a7e0334f-a6c4-478f-ae80-bc4cb79c1f08" containerID="a32af4e91e1ff4cc2a3f112731a5254fecef07a65dd8ffa938ad5edb29a9a271" exitCode=0
Mar 14 08:56:30 crc kubenswrapper[4886]: I0314 08:56:30.076313 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9k8f" event={"ID":"a7e0334f-a6c4-478f-ae80-bc4cb79c1f08","Type":"ContainerDied","Data":"a32af4e91e1ff4cc2a3f112731a5254fecef07a65dd8ffa938ad5edb29a9a271"}
Mar 14 08:56:31 crc kubenswrapper[4886]: I0314 08:56:31.088543 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9k8f" event={"ID":"a7e0334f-a6c4-478f-ae80-bc4cb79c1f08","Type":"ContainerStarted","Data":"df6c23f1a9d89fd9f11cf1d8d65de67ae8ebda164657bd151577b3a5a9584a2d"}
Mar 14 08:56:31 crc kubenswrapper[4886]: I0314 08:56:31.113673 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n9k8f" podStartSLOduration=2.544990237 podStartE2EDuration="7.113646785s" podCreationTimestamp="2026-03-14 08:56:24 +0000 UTC" firstStartedPulling="2026-03-14 08:56:26.037709186 +0000 UTC m=+1721.286160813" lastFinishedPulling="2026-03-14 08:56:30.606365724 +0000 UTC m=+1725.854817361" observedRunningTime="2026-03-14 08:56:31.106494251 +0000 UTC m=+1726.354945888" watchObservedRunningTime="2026-03-14 08:56:31.113646785 +0000 UTC m=+1726.362098422"
Mar 14 08:56:31 crc kubenswrapper[4886]: I0314 08:56:31.604398 4886 scope.go:117] "RemoveContainer" containerID="b7605021f0f98c16c58d134e47b74276c8417dbb6ed58cc20dce2d6d34263450"
Mar 14 08:56:31 crc kubenswrapper[4886]: I0314 08:56:31.639604 4886 scope.go:117] "RemoveContainer" containerID="c7daa5ec8881475125e070d27c16827991310384862a54e501aab33e4edb2588"
Mar 14 08:56:31 crc kubenswrapper[4886]: I0314 08:56:31.662334 4886 scope.go:117] "RemoveContainer" containerID="7549d59a1101a59e8fc3fd0f7cc18a420bea35c7279ccb049806789c71220744"
Mar 14 08:56:34 crc kubenswrapper[4886]: I0314 08:56:34.680397 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n9k8f"
Mar 14 08:56:34 crc kubenswrapper[4886]: I0314 08:56:34.680923 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n9k8f"
Mar 14 08:56:35 crc kubenswrapper[4886]: I0314 08:56:35.730010 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n9k8f" podUID="a7e0334f-a6c4-478f-ae80-bc4cb79c1f08" containerName="registry-server" probeResult="failure" output=<
Mar 14 08:56:35 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s
Mar 14 08:56:35 crc kubenswrapper[4886]: >
Mar 14 08:56:44 crc kubenswrapper[4886]: I0314 08:56:44.420354 4886 scope.go:117] "RemoveContainer" containerID="7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc"
Mar 14 08:56:44 crc kubenswrapper[4886]: E0314 08:56:44.420977 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79"
Mar 14 08:56:44 crc kubenswrapper[4886]: I0314 08:56:44.780272 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n9k8f"
Mar 14 08:56:44 crc kubenswrapper[4886]: I0314 08:56:44.852053 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n9k8f"
Mar 14 08:56:45 crc kubenswrapper[4886]: I0314 08:56:45.029061 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n9k8f"]
Mar 14 08:56:46 crc kubenswrapper[4886]: I0314 08:56:46.244714 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n9k8f" podUID="a7e0334f-a6c4-478f-ae80-bc4cb79c1f08" containerName="registry-server" containerID="cri-o://df6c23f1a9d89fd9f11cf1d8d65de67ae8ebda164657bd151577b3a5a9584a2d" gracePeriod=2
Mar 14 08:56:46 crc kubenswrapper[4886]: I0314 08:56:46.715557 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n9k8f"
Mar 14 08:56:46 crc kubenswrapper[4886]: I0314 08:56:46.865942 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzzw8\" (UniqueName: \"kubernetes.io/projected/a7e0334f-a6c4-478f-ae80-bc4cb79c1f08-kube-api-access-dzzw8\") pod \"a7e0334f-a6c4-478f-ae80-bc4cb79c1f08\" (UID: \"a7e0334f-a6c4-478f-ae80-bc4cb79c1f08\") "
Mar 14 08:56:46 crc kubenswrapper[4886]: I0314 08:56:46.866034 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7e0334f-a6c4-478f-ae80-bc4cb79c1f08-utilities\") pod \"a7e0334f-a6c4-478f-ae80-bc4cb79c1f08\" (UID: \"a7e0334f-a6c4-478f-ae80-bc4cb79c1f08\") "
Mar 14 08:56:46 crc kubenswrapper[4886]: I0314 08:56:46.866328 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7e0334f-a6c4-478f-ae80-bc4cb79c1f08-catalog-content\") pod \"a7e0334f-a6c4-478f-ae80-bc4cb79c1f08\" (UID: \"a7e0334f-a6c4-478f-ae80-bc4cb79c1f08\") "
Mar 14 08:56:46 crc kubenswrapper[4886]: I0314 08:56:46.866740 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7e0334f-a6c4-478f-ae80-bc4cb79c1f08-utilities" (OuterVolumeSpecName: "utilities") pod "a7e0334f-a6c4-478f-ae80-bc4cb79c1f08" (UID: "a7e0334f-a6c4-478f-ae80-bc4cb79c1f08"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 08:56:46 crc kubenswrapper[4886]: I0314 08:56:46.867008 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7e0334f-a6c4-478f-ae80-bc4cb79c1f08-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 08:56:46 crc kubenswrapper[4886]: I0314 08:56:46.873912 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7e0334f-a6c4-478f-ae80-bc4cb79c1f08-kube-api-access-dzzw8" (OuterVolumeSpecName: "kube-api-access-dzzw8") pod "a7e0334f-a6c4-478f-ae80-bc4cb79c1f08" (UID: "a7e0334f-a6c4-478f-ae80-bc4cb79c1f08"). InnerVolumeSpecName "kube-api-access-dzzw8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 08:56:46 crc kubenswrapper[4886]: I0314 08:56:46.969823 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzzw8\" (UniqueName: \"kubernetes.io/projected/a7e0334f-a6c4-478f-ae80-bc4cb79c1f08-kube-api-access-dzzw8\") on node \"crc\" DevicePath \"\""
Mar 14 08:56:46 crc kubenswrapper[4886]: I0314 08:56:46.992787 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7e0334f-a6c4-478f-ae80-bc4cb79c1f08-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a7e0334f-a6c4-478f-ae80-bc4cb79c1f08" (UID: "a7e0334f-a6c4-478f-ae80-bc4cb79c1f08"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 08:56:47 crc kubenswrapper[4886]: I0314 08:56:47.071719 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7e0334f-a6c4-478f-ae80-bc4cb79c1f08-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 08:56:47 crc kubenswrapper[4886]: I0314 08:56:47.257794 4886 generic.go:334] "Generic (PLEG): container finished" podID="a7e0334f-a6c4-478f-ae80-bc4cb79c1f08" containerID="df6c23f1a9d89fd9f11cf1d8d65de67ae8ebda164657bd151577b3a5a9584a2d" exitCode=0
Mar 14 08:56:47 crc kubenswrapper[4886]: I0314 08:56:47.257850 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9k8f" event={"ID":"a7e0334f-a6c4-478f-ae80-bc4cb79c1f08","Type":"ContainerDied","Data":"df6c23f1a9d89fd9f11cf1d8d65de67ae8ebda164657bd151577b3a5a9584a2d"}
Mar 14 08:56:47 crc kubenswrapper[4886]: I0314 08:56:47.257883 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n9k8f"
Mar 14 08:56:47 crc kubenswrapper[4886]: I0314 08:56:47.257917 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9k8f" event={"ID":"a7e0334f-a6c4-478f-ae80-bc4cb79c1f08","Type":"ContainerDied","Data":"d7b5e30c744db6be70b4efbc7960f256ac6b55a56bcbce336bfeb75bdef64733"}
Mar 14 08:56:47 crc kubenswrapper[4886]: I0314 08:56:47.257950 4886 scope.go:117] "RemoveContainer" containerID="df6c23f1a9d89fd9f11cf1d8d65de67ae8ebda164657bd151577b3a5a9584a2d"
Mar 14 08:56:47 crc kubenswrapper[4886]: I0314 08:56:47.320844 4886 scope.go:117] "RemoveContainer" containerID="a32af4e91e1ff4cc2a3f112731a5254fecef07a65dd8ffa938ad5edb29a9a271"
Mar 14 08:56:47 crc kubenswrapper[4886]: I0314 08:56:47.326968 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n9k8f"]
Mar 14 08:56:47 crc kubenswrapper[4886]: I0314 08:56:47.337792 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n9k8f"]
Mar 14 08:56:47 crc kubenswrapper[4886]: I0314 08:56:47.355091 4886 scope.go:117] "RemoveContainer" containerID="a6072cb782c6c61d91e1ad77d63c4c5ba2a1e95dfcc5be4f2caf4157fefc3bad"
Mar 14 08:56:47 crc kubenswrapper[4886]: I0314 08:56:47.393996 4886 scope.go:117] "RemoveContainer" containerID="df6c23f1a9d89fd9f11cf1d8d65de67ae8ebda164657bd151577b3a5a9584a2d"
Mar 14 08:56:47 crc kubenswrapper[4886]: E0314 08:56:47.394374 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df6c23f1a9d89fd9f11cf1d8d65de67ae8ebda164657bd151577b3a5a9584a2d\": container with ID starting with df6c23f1a9d89fd9f11cf1d8d65de67ae8ebda164657bd151577b3a5a9584a2d not found: ID does not exist" containerID="df6c23f1a9d89fd9f11cf1d8d65de67ae8ebda164657bd151577b3a5a9584a2d"
Mar 14 08:56:47 crc kubenswrapper[4886]: I0314 08:56:47.394406 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df6c23f1a9d89fd9f11cf1d8d65de67ae8ebda164657bd151577b3a5a9584a2d"} err="failed to get container status \"df6c23f1a9d89fd9f11cf1d8d65de67ae8ebda164657bd151577b3a5a9584a2d\": rpc error: code = NotFound desc = could not find container \"df6c23f1a9d89fd9f11cf1d8d65de67ae8ebda164657bd151577b3a5a9584a2d\": container with ID starting with df6c23f1a9d89fd9f11cf1d8d65de67ae8ebda164657bd151577b3a5a9584a2d not found: ID does not exist"
Mar 14 08:56:47 crc kubenswrapper[4886]: I0314 08:56:47.394429 4886 scope.go:117] "RemoveContainer" containerID="a32af4e91e1ff4cc2a3f112731a5254fecef07a65dd8ffa938ad5edb29a9a271"
Mar 14 08:56:47 crc kubenswrapper[4886]: E0314 08:56:47.394637 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a32af4e91e1ff4cc2a3f112731a5254fecef07a65dd8ffa938ad5edb29a9a271\": container with ID starting with a32af4e91e1ff4cc2a3f112731a5254fecef07a65dd8ffa938ad5edb29a9a271 not found: ID does not exist" containerID="a32af4e91e1ff4cc2a3f112731a5254fecef07a65dd8ffa938ad5edb29a9a271"
Mar 14 08:56:47 crc kubenswrapper[4886]: I0314 08:56:47.394659 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a32af4e91e1ff4cc2a3f112731a5254fecef07a65dd8ffa938ad5edb29a9a271"} err="failed to get container status \"a32af4e91e1ff4cc2a3f112731a5254fecef07a65dd8ffa938ad5edb29a9a271\": rpc error: code = NotFound desc = could not find container \"a32af4e91e1ff4cc2a3f112731a5254fecef07a65dd8ffa938ad5edb29a9a271\": container with ID starting with a32af4e91e1ff4cc2a3f112731a5254fecef07a65dd8ffa938ad5edb29a9a271 not found: ID does not exist"
Mar 14 08:56:47 crc kubenswrapper[4886]: I0314 08:56:47.394674 4886 scope.go:117] "RemoveContainer" containerID="a6072cb782c6c61d91e1ad77d63c4c5ba2a1e95dfcc5be4f2caf4157fefc3bad"
Mar 14 08:56:47 crc kubenswrapper[4886]: E0314 08:56:47.394833 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6072cb782c6c61d91e1ad77d63c4c5ba2a1e95dfcc5be4f2caf4157fefc3bad\": container with ID starting with a6072cb782c6c61d91e1ad77d63c4c5ba2a1e95dfcc5be4f2caf4157fefc3bad not found: ID does not exist" containerID="a6072cb782c6c61d91e1ad77d63c4c5ba2a1e95dfcc5be4f2caf4157fefc3bad"
Mar 14 08:56:47 crc kubenswrapper[4886]: I0314 08:56:47.394852 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6072cb782c6c61d91e1ad77d63c4c5ba2a1e95dfcc5be4f2caf4157fefc3bad"} err="failed to get container status \"a6072cb782c6c61d91e1ad77d63c4c5ba2a1e95dfcc5be4f2caf4157fefc3bad\": rpc error: code = NotFound desc = could not find container \"a6072cb782c6c61d91e1ad77d63c4c5ba2a1e95dfcc5be4f2caf4157fefc3bad\": container with ID starting with a6072cb782c6c61d91e1ad77d63c4c5ba2a1e95dfcc5be4f2caf4157fefc3bad not found: ID does not exist"
Mar 14 08:56:47 crc kubenswrapper[4886]: I0314 08:56:47.431901 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7e0334f-a6c4-478f-ae80-bc4cb79c1f08" path="/var/lib/kubelet/pods/a7e0334f-a6c4-478f-ae80-bc4cb79c1f08/volumes"
Mar 14 08:56:59 crc kubenswrapper[4886]: I0314 08:56:59.422196 4886 scope.go:117] "RemoveContainer" containerID="7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc"
Mar 14 08:56:59 crc kubenswrapper[4886]: E0314 08:56:59.423085 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79"
Mar 14 08:57:14 crc kubenswrapper[4886]: I0314 08:57:14.421527 4886 scope.go:117] "RemoveContainer" containerID="7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc"
Mar 14 08:57:14 crc kubenswrapper[4886]: E0314 08:57:14.422317 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79"
Mar 14 08:57:28 crc kubenswrapper[4886]: I0314 08:57:28.565023 4886 scope.go:117] "RemoveContainer" containerID="7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc"
Mar 14 08:57:28 crc kubenswrapper[4886]: E0314 08:57:28.565675 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79"
Mar 14 08:57:32 crc kubenswrapper[4886]: I0314 08:57:32.202393 4886 scope.go:117] "RemoveContainer" containerID="0b271cd3bc2037d7c929d5b5e6bd1566c4ecec0be5a0094010de6895e43e0c7f"
Mar 14 08:57:32 crc kubenswrapper[4886]: I0314 08:57:32.234887 4886 scope.go:117] "RemoveContainer" containerID="392c2eb03365881be657992e4a96125dcbebff3b67dcc23ccd2f69b843290d50"
Mar 14 08:57:32 crc kubenswrapper[4886]: I0314 08:57:32.254448 4886 scope.go:117] "RemoveContainer" containerID="0c84c5c78f9fc6834ff674960e4a9e69eca599144056d8d5edf7fce36760a11e"
Mar 14 08:57:32 crc kubenswrapper[4886]: I0314 08:57:32.280509 4886 scope.go:117] "RemoveContainer" containerID="51699cabba0b86ef2fdbc6c70cfaf95d0bf6d915e40fa430ce1757a991a1b072"
Mar 14 08:57:40 crc kubenswrapper[4886]: I0314 08:57:40.367503 4886 patch_prober.go:28] interesting pod/controller-manager-66cf58567-vgs2d container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.77:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 14 08:57:40 crc kubenswrapper[4886]: I0314 08:57:40.370789 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-66cf58567-vgs2d" podUID="1ff77274-c0cf-4ef7-8a62-5ca93de936ac" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.77:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 14 08:57:40 crc kubenswrapper[4886]: I0314 08:57:40.384242 4886 scope.go:117] "RemoveContainer" containerID="7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc"
Mar 14 08:57:40 crc kubenswrapper[4886]: E0314 08:57:40.384498 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79"
Mar 14 08:57:55 crc kubenswrapper[4886]: I0314 08:57:55.427240 4886 scope.go:117] "RemoveContainer" containerID="7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc"
Mar 14 08:57:55 crc kubenswrapper[4886]: E0314 08:57:55.427943 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79"
Mar 14 08:58:00 crc kubenswrapper[4886]: I0314 08:58:00.163473 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557978-7sb9n"]
Mar 14 08:58:00 crc kubenswrapper[4886]: E0314 08:58:00.164403 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e0334f-a6c4-478f-ae80-bc4cb79c1f08" containerName="registry-server"
Mar 14 08:58:00 crc kubenswrapper[4886]: I0314 08:58:00.164424 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e0334f-a6c4-478f-ae80-bc4cb79c1f08" containerName="registry-server"
Mar 14 08:58:00 crc kubenswrapper[4886]: E0314 08:58:00.164445 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e0334f-a6c4-478f-ae80-bc4cb79c1f08" containerName="extract-content"
Mar 14 08:58:00 crc kubenswrapper[4886]: I0314 08:58:00.164455 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e0334f-a6c4-478f-ae80-bc4cb79c1f08" containerName="extract-content"
Mar 14 08:58:00 crc kubenswrapper[4886]: E0314 08:58:00.164511 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e0334f-a6c4-478f-ae80-bc4cb79c1f08" containerName="extract-utilities"
Mar 14 08:58:00 crc kubenswrapper[4886]: I0314 08:58:00.164528 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e0334f-a6c4-478f-ae80-bc4cb79c1f08" containerName="extract-utilities"
Mar 14 08:58:00 crc kubenswrapper[4886]: I0314 08:58:00.164819 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7e0334f-a6c4-478f-ae80-bc4cb79c1f08" containerName="registry-server"
Mar 14 08:58:00 crc kubenswrapper[4886]: I0314 08:58:00.165816 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557978-7sb9n"
Mar 14 08:58:00 crc kubenswrapper[4886]: I0314 08:58:00.171460 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp"
Mar 14 08:58:00 crc kubenswrapper[4886]: I0314 08:58:00.172275 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 08:58:00 crc kubenswrapper[4886]: I0314 08:58:00.172413 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 08:58:00 crc kubenswrapper[4886]: I0314 08:58:00.193022 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557978-7sb9n"]
Mar 14 08:58:00 crc kubenswrapper[4886]: I0314 08:58:00.294046 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqdsg\" (UniqueName: \"kubernetes.io/projected/d3c73e07-5b8d-47b4-8c11-2f37825df8fe-kube-api-access-kqdsg\") pod \"auto-csr-approver-29557978-7sb9n\" (UID: \"d3c73e07-5b8d-47b4-8c11-2f37825df8fe\") " pod="openshift-infra/auto-csr-approver-29557978-7sb9n"
Mar 14 08:58:00 crc kubenswrapper[4886]: I0314 08:58:00.396963 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqdsg\" (UniqueName: \"kubernetes.io/projected/d3c73e07-5b8d-47b4-8c11-2f37825df8fe-kube-api-access-kqdsg\") pod \"auto-csr-approver-29557978-7sb9n\" (UID: \"d3c73e07-5b8d-47b4-8c11-2f37825df8fe\") " pod="openshift-infra/auto-csr-approver-29557978-7sb9n"
Mar 14 08:58:00 crc kubenswrapper[4886]: I0314 08:58:00.420818 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqdsg\" (UniqueName: \"kubernetes.io/projected/d3c73e07-5b8d-47b4-8c11-2f37825df8fe-kube-api-access-kqdsg\") pod \"auto-csr-approver-29557978-7sb9n\" (UID: \"d3c73e07-5b8d-47b4-8c11-2f37825df8fe\") " 
pod="openshift-infra/auto-csr-approver-29557978-7sb9n" Mar 14 08:58:00 crc kubenswrapper[4886]: I0314 08:58:00.492314 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557978-7sb9n" Mar 14 08:58:00 crc kubenswrapper[4886]: I0314 08:58:00.939106 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557978-7sb9n"] Mar 14 08:58:01 crc kubenswrapper[4886]: I0314 08:58:01.667750 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557978-7sb9n" event={"ID":"d3c73e07-5b8d-47b4-8c11-2f37825df8fe","Type":"ContainerStarted","Data":"c00c5827cdfb76e334a43916163627d8b69ad46757fcb45250deb022145c81dd"} Mar 14 08:58:02 crc kubenswrapper[4886]: I0314 08:58:02.681620 4886 generic.go:334] "Generic (PLEG): container finished" podID="d3c73e07-5b8d-47b4-8c11-2f37825df8fe" containerID="8364b1b519be961628d3418d52599a0032331ed0b646fd02bd6b8b299c30b401" exitCode=0 Mar 14 08:58:02 crc kubenswrapper[4886]: I0314 08:58:02.681683 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557978-7sb9n" event={"ID":"d3c73e07-5b8d-47b4-8c11-2f37825df8fe","Type":"ContainerDied","Data":"8364b1b519be961628d3418d52599a0032331ed0b646fd02bd6b8b299c30b401"} Mar 14 08:58:04 crc kubenswrapper[4886]: I0314 08:58:04.044503 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557978-7sb9n" Mar 14 08:58:04 crc kubenswrapper[4886]: I0314 08:58:04.182736 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqdsg\" (UniqueName: \"kubernetes.io/projected/d3c73e07-5b8d-47b4-8c11-2f37825df8fe-kube-api-access-kqdsg\") pod \"d3c73e07-5b8d-47b4-8c11-2f37825df8fe\" (UID: \"d3c73e07-5b8d-47b4-8c11-2f37825df8fe\") " Mar 14 08:58:04 crc kubenswrapper[4886]: I0314 08:58:04.188481 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3c73e07-5b8d-47b4-8c11-2f37825df8fe-kube-api-access-kqdsg" (OuterVolumeSpecName: "kube-api-access-kqdsg") pod "d3c73e07-5b8d-47b4-8c11-2f37825df8fe" (UID: "d3c73e07-5b8d-47b4-8c11-2f37825df8fe"). InnerVolumeSpecName "kube-api-access-kqdsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:58:04 crc kubenswrapper[4886]: I0314 08:58:04.284849 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqdsg\" (UniqueName: \"kubernetes.io/projected/d3c73e07-5b8d-47b4-8c11-2f37825df8fe-kube-api-access-kqdsg\") on node \"crc\" DevicePath \"\"" Mar 14 08:58:04 crc kubenswrapper[4886]: I0314 08:58:04.702085 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557978-7sb9n" event={"ID":"d3c73e07-5b8d-47b4-8c11-2f37825df8fe","Type":"ContainerDied","Data":"c00c5827cdfb76e334a43916163627d8b69ad46757fcb45250deb022145c81dd"} Mar 14 08:58:04 crc kubenswrapper[4886]: I0314 08:58:04.702142 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c00c5827cdfb76e334a43916163627d8b69ad46757fcb45250deb022145c81dd" Mar 14 08:58:04 crc kubenswrapper[4886]: I0314 08:58:04.702172 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557978-7sb9n" Mar 14 08:58:05 crc kubenswrapper[4886]: I0314 08:58:05.117675 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557972-zfqj5"] Mar 14 08:58:05 crc kubenswrapper[4886]: I0314 08:58:05.128570 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557972-zfqj5"] Mar 14 08:58:05 crc kubenswrapper[4886]: I0314 08:58:05.434647 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed23514f-ddc6-4359-ada9-147ca9d19bf9" path="/var/lib/kubelet/pods/ed23514f-ddc6-4359-ada9-147ca9d19bf9/volumes" Mar 14 08:58:07 crc kubenswrapper[4886]: I0314 08:58:07.420832 4886 scope.go:117] "RemoveContainer" containerID="7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc" Mar 14 08:58:07 crc kubenswrapper[4886]: E0314 08:58:07.421471 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 08:58:21 crc kubenswrapper[4886]: I0314 08:58:21.421271 4886 scope.go:117] "RemoveContainer" containerID="7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc" Mar 14 08:58:21 crc kubenswrapper[4886]: E0314 08:58:21.421866 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" 
podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 08:58:22 crc kubenswrapper[4886]: I0314 08:58:22.879506 4886 generic.go:334] "Generic (PLEG): container finished" podID="bbd0a941-8eab-4742-9002-b42381f0d326" containerID="ff8101a27028647d1b7c984002ee34141e266d66c5e2cb11c5a5a73eaf21a674" exitCode=0 Mar 14 08:58:22 crc kubenswrapper[4886]: I0314 08:58:22.879600 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9" event={"ID":"bbd0a941-8eab-4742-9002-b42381f0d326","Type":"ContainerDied","Data":"ff8101a27028647d1b7c984002ee34141e266d66c5e2cb11c5a5a73eaf21a674"} Mar 14 08:58:24 crc kubenswrapper[4886]: I0314 08:58:24.311540 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9" Mar 14 08:58:24 crc kubenswrapper[4886]: I0314 08:58:24.384154 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbd0a941-8eab-4742-9002-b42381f0d326-inventory\") pod \"bbd0a941-8eab-4742-9002-b42381f0d326\" (UID: \"bbd0a941-8eab-4742-9002-b42381f0d326\") " Mar 14 08:58:24 crc kubenswrapper[4886]: I0314 08:58:24.384205 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrzrh\" (UniqueName: \"kubernetes.io/projected/bbd0a941-8eab-4742-9002-b42381f0d326-kube-api-access-wrzrh\") pod \"bbd0a941-8eab-4742-9002-b42381f0d326\" (UID: \"bbd0a941-8eab-4742-9002-b42381f0d326\") " Mar 14 08:58:24 crc kubenswrapper[4886]: I0314 08:58:24.384497 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd0a941-8eab-4742-9002-b42381f0d326-bootstrap-combined-ca-bundle\") pod \"bbd0a941-8eab-4742-9002-b42381f0d326\" (UID: \"bbd0a941-8eab-4742-9002-b42381f0d326\") " Mar 14 08:58:24 crc kubenswrapper[4886]: I0314 
08:58:24.384570 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbd0a941-8eab-4742-9002-b42381f0d326-ssh-key-openstack-edpm-ipam\") pod \"bbd0a941-8eab-4742-9002-b42381f0d326\" (UID: \"bbd0a941-8eab-4742-9002-b42381f0d326\") " Mar 14 08:58:24 crc kubenswrapper[4886]: I0314 08:58:24.389647 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbd0a941-8eab-4742-9002-b42381f0d326-kube-api-access-wrzrh" (OuterVolumeSpecName: "kube-api-access-wrzrh") pod "bbd0a941-8eab-4742-9002-b42381f0d326" (UID: "bbd0a941-8eab-4742-9002-b42381f0d326"). InnerVolumeSpecName "kube-api-access-wrzrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 08:58:24 crc kubenswrapper[4886]: I0314 08:58:24.390844 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbd0a941-8eab-4742-9002-b42381f0d326-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "bbd0a941-8eab-4742-9002-b42381f0d326" (UID: "bbd0a941-8eab-4742-9002-b42381f0d326"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:58:24 crc kubenswrapper[4886]: I0314 08:58:24.413681 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbd0a941-8eab-4742-9002-b42381f0d326-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bbd0a941-8eab-4742-9002-b42381f0d326" (UID: "bbd0a941-8eab-4742-9002-b42381f0d326"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:58:24 crc kubenswrapper[4886]: I0314 08:58:24.415304 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbd0a941-8eab-4742-9002-b42381f0d326-inventory" (OuterVolumeSpecName: "inventory") pod "bbd0a941-8eab-4742-9002-b42381f0d326" (UID: "bbd0a941-8eab-4742-9002-b42381f0d326"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 08:58:24 crc kubenswrapper[4886]: I0314 08:58:24.486653 4886 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd0a941-8eab-4742-9002-b42381f0d326-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 08:58:24 crc kubenswrapper[4886]: I0314 08:58:24.486684 4886 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbd0a941-8eab-4742-9002-b42381f0d326-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 08:58:24 crc kubenswrapper[4886]: I0314 08:58:24.486703 4886 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbd0a941-8eab-4742-9002-b42381f0d326-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 08:58:24 crc kubenswrapper[4886]: I0314 08:58:24.486738 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrzrh\" (UniqueName: \"kubernetes.io/projected/bbd0a941-8eab-4742-9002-b42381f0d326-kube-api-access-wrzrh\") on node \"crc\" DevicePath \"\"" Mar 14 08:58:24 crc kubenswrapper[4886]: I0314 08:58:24.901849 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9" event={"ID":"bbd0a941-8eab-4742-9002-b42381f0d326","Type":"ContainerDied","Data":"52d98d3a40aba6d3f69246c24370d050534b5e02f40968c40d5ded941ecc8eb3"} Mar 14 08:58:24 crc kubenswrapper[4886]: I0314 08:58:24.902088 4886 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52d98d3a40aba6d3f69246c24370d050534b5e02f40968c40d5ded941ecc8eb3" Mar 14 08:58:24 crc kubenswrapper[4886]: I0314 08:58:24.901913 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9" Mar 14 08:58:24 crc kubenswrapper[4886]: I0314 08:58:24.993954 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv"] Mar 14 08:58:24 crc kubenswrapper[4886]: E0314 08:58:24.994495 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3c73e07-5b8d-47b4-8c11-2f37825df8fe" containerName="oc" Mar 14 08:58:24 crc kubenswrapper[4886]: I0314 08:58:24.994516 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3c73e07-5b8d-47b4-8c11-2f37825df8fe" containerName="oc" Mar 14 08:58:24 crc kubenswrapper[4886]: E0314 08:58:24.994551 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbd0a941-8eab-4742-9002-b42381f0d326" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 14 08:58:24 crc kubenswrapper[4886]: I0314 08:58:24.994562 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbd0a941-8eab-4742-9002-b42381f0d326" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 14 08:58:24 crc kubenswrapper[4886]: I0314 08:58:24.994804 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbd0a941-8eab-4742-9002-b42381f0d326" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 14 08:58:24 crc kubenswrapper[4886]: I0314 08:58:24.994839 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3c73e07-5b8d-47b4-8c11-2f37825df8fe" containerName="oc" Mar 14 08:58:24 crc kubenswrapper[4886]: I0314 08:58:24.995685 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv" Mar 14 08:58:24 crc kubenswrapper[4886]: I0314 08:58:24.997536 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 08:58:24 crc kubenswrapper[4886]: I0314 08:58:24.998619 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 08:58:24 crc kubenswrapper[4886]: I0314 08:58:24.998720 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 08:58:24 crc kubenswrapper[4886]: I0314 08:58:24.998803 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftkvj" Mar 14 08:58:25 crc kubenswrapper[4886]: I0314 08:58:25.005229 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv"] Mar 14 08:58:25 crc kubenswrapper[4886]: I0314 08:58:25.096436 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d6cbe588-9aee-4554-b985-c809186e86d9-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv\" (UID: \"d6cbe588-9aee-4554-b985-c809186e86d9\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv" Mar 14 08:58:25 crc kubenswrapper[4886]: I0314 08:58:25.096505 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbtfk\" (UniqueName: \"kubernetes.io/projected/d6cbe588-9aee-4554-b985-c809186e86d9-kube-api-access-hbtfk\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv\" (UID: \"d6cbe588-9aee-4554-b985-c809186e86d9\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv" Mar 14 08:58:25 crc 
kubenswrapper[4886]: I0314 08:58:25.096659 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6cbe588-9aee-4554-b985-c809186e86d9-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv\" (UID: \"d6cbe588-9aee-4554-b985-c809186e86d9\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv" Mar 14 08:58:25 crc kubenswrapper[4886]: I0314 08:58:25.198581 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbtfk\" (UniqueName: \"kubernetes.io/projected/d6cbe588-9aee-4554-b985-c809186e86d9-kube-api-access-hbtfk\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv\" (UID: \"d6cbe588-9aee-4554-b985-c809186e86d9\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv" Mar 14 08:58:25 crc kubenswrapper[4886]: I0314 08:58:25.198824 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6cbe588-9aee-4554-b985-c809186e86d9-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv\" (UID: \"d6cbe588-9aee-4554-b985-c809186e86d9\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv" Mar 14 08:58:25 crc kubenswrapper[4886]: I0314 08:58:25.198900 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d6cbe588-9aee-4554-b985-c809186e86d9-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv\" (UID: \"d6cbe588-9aee-4554-b985-c809186e86d9\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv" Mar 14 08:58:25 crc kubenswrapper[4886]: I0314 08:58:25.205058 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/d6cbe588-9aee-4554-b985-c809186e86d9-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv\" (UID: \"d6cbe588-9aee-4554-b985-c809186e86d9\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv" Mar 14 08:58:25 crc kubenswrapper[4886]: I0314 08:58:25.207436 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6cbe588-9aee-4554-b985-c809186e86d9-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv\" (UID: \"d6cbe588-9aee-4554-b985-c809186e86d9\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv" Mar 14 08:58:25 crc kubenswrapper[4886]: I0314 08:58:25.215049 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbtfk\" (UniqueName: \"kubernetes.io/projected/d6cbe588-9aee-4554-b985-c809186e86d9-kube-api-access-hbtfk\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv\" (UID: \"d6cbe588-9aee-4554-b985-c809186e86d9\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv" Mar 14 08:58:25 crc kubenswrapper[4886]: I0314 08:58:25.315401 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv" Mar 14 08:58:25 crc kubenswrapper[4886]: I0314 08:58:25.832773 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv"] Mar 14 08:58:25 crc kubenswrapper[4886]: I0314 08:58:25.838105 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 08:58:25 crc kubenswrapper[4886]: I0314 08:58:25.911369 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv" event={"ID":"d6cbe588-9aee-4554-b985-c809186e86d9","Type":"ContainerStarted","Data":"b2b0c29c4b3a2df4497e4b16e941ae6b0b03a0d268073e6202438c5688a9e87a"} Mar 14 08:58:26 crc kubenswrapper[4886]: I0314 08:58:26.922641 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv" event={"ID":"d6cbe588-9aee-4554-b985-c809186e86d9","Type":"ContainerStarted","Data":"5d647916a372ae8d5ef8aad1fe292a39f539ce7f5e3e3eee10bf2735f979df8b"} Mar 14 08:58:26 crc kubenswrapper[4886]: I0314 08:58:26.944812 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv" podStartSLOduration=2.1750662370000002 podStartE2EDuration="2.944789841s" podCreationTimestamp="2026-03-14 08:58:24 +0000 UTC" firstStartedPulling="2026-03-14 08:58:25.837861474 +0000 UTC m=+1841.086313111" lastFinishedPulling="2026-03-14 08:58:26.607585038 +0000 UTC m=+1841.856036715" observedRunningTime="2026-03-14 08:58:26.936787253 +0000 UTC m=+1842.185238890" watchObservedRunningTime="2026-03-14 08:58:26.944789841 +0000 UTC m=+1842.193241478" Mar 14 08:58:32 crc kubenswrapper[4886]: I0314 08:58:32.347443 4886 scope.go:117] "RemoveContainer" containerID="fdf93410e70fb0ee72eaa42914cc4ae6b6ea838f68db3ddab0e12a55e1214619" Mar 14 08:58:32 crc 
kubenswrapper[4886]: I0314 08:58:32.398395 4886 scope.go:117] "RemoveContainer" containerID="e0167f4101121823b1f0f5c81ed6c66a3164b73927b29003799d7f631c6248be" Mar 14 08:58:34 crc kubenswrapper[4886]: I0314 08:58:34.422042 4886 scope.go:117] "RemoveContainer" containerID="7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc" Mar 14 08:58:34 crc kubenswrapper[4886]: E0314 08:58:34.422954 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 08:58:41 crc kubenswrapper[4886]: I0314 08:58:41.043982 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-hb7vx"] Mar 14 08:58:41 crc kubenswrapper[4886]: I0314 08:58:41.054375 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-hb7vx"] Mar 14 08:58:41 crc kubenswrapper[4886]: I0314 08:58:41.437351 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd659431-d053-4d3b-a7a8-7a9b20438242" path="/var/lib/kubelet/pods/fd659431-d053-4d3b-a7a8-7a9b20438242/volumes" Mar 14 08:58:42 crc kubenswrapper[4886]: I0314 08:58:42.032977 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-2ed1-account-create-update-bfzn5"] Mar 14 08:58:42 crc kubenswrapper[4886]: I0314 08:58:42.073043 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-5lbzf"] Mar 14 08:58:42 crc kubenswrapper[4886]: I0314 08:58:42.085404 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-2ed1-account-create-update-bfzn5"] Mar 14 08:58:42 crc kubenswrapper[4886]: I0314 08:58:42.095472 4886 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-b1fc-account-create-update-87rmn"] Mar 14 08:58:42 crc kubenswrapper[4886]: I0314 08:58:42.105298 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-5lbzf"] Mar 14 08:58:42 crc kubenswrapper[4886]: I0314 08:58:42.114210 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-b1fc-account-create-update-87rmn"] Mar 14 08:58:43 crc kubenswrapper[4886]: I0314 08:58:43.430465 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba" path="/var/lib/kubelet/pods/18e0a8ff-b15a-49d2-a5ba-8e970dc7c3ba/volumes" Mar 14 08:58:43 crc kubenswrapper[4886]: I0314 08:58:43.431326 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67508ffc-b9b5-4172-b4fc-f6870f5f210a" path="/var/lib/kubelet/pods/67508ffc-b9b5-4172-b4fc-f6870f5f210a/volumes" Mar 14 08:58:43 crc kubenswrapper[4886]: I0314 08:58:43.431885 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74857ae3-5a84-4d38-9962-1836e71789da" path="/var/lib/kubelet/pods/74857ae3-5a84-4d38-9962-1836e71789da/volumes" Mar 14 08:58:47 crc kubenswrapper[4886]: I0314 08:58:47.040506 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-lqwf5"] Mar 14 08:58:47 crc kubenswrapper[4886]: I0314 08:58:47.057218 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-lqwf5"] Mar 14 08:58:47 crc kubenswrapper[4886]: I0314 08:58:47.440602 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c05b39b-9236-4a8a-ab68-c424153678a6" path="/var/lib/kubelet/pods/4c05b39b-9236-4a8a-ab68-c424153678a6/volumes" Mar 14 08:58:48 crc kubenswrapper[4886]: I0314 08:58:48.041416 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-d883-account-create-update-98qk2"] Mar 14 08:58:48 crc kubenswrapper[4886]: I0314 08:58:48.054659 
4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-d883-account-create-update-98qk2"] Mar 14 08:58:49 crc kubenswrapper[4886]: I0314 08:58:49.421091 4886 scope.go:117] "RemoveContainer" containerID="7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc" Mar 14 08:58:49 crc kubenswrapper[4886]: E0314 08:58:49.421818 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 08:58:49 crc kubenswrapper[4886]: I0314 08:58:49.433086 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bd30816-7ba6-49db-8bfa-52b31cbf4de5" path="/var/lib/kubelet/pods/1bd30816-7ba6-49db-8bfa-52b31cbf4de5/volumes" Mar 14 08:58:51 crc kubenswrapper[4886]: I0314 08:58:51.045957 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-ad06-account-create-update-mpb4t"] Mar 14 08:58:51 crc kubenswrapper[4886]: I0314 08:58:51.058943 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-fjk6q"] Mar 14 08:58:51 crc kubenswrapper[4886]: I0314 08:58:51.070758 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-ad06-account-create-update-mpb4t"] Mar 14 08:58:51 crc kubenswrapper[4886]: I0314 08:58:51.083733 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-fjk6q"] Mar 14 08:58:51 crc kubenswrapper[4886]: I0314 08:58:51.438090 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11a9c950-955d-4afd-88c9-9c705ef619b6" path="/var/lib/kubelet/pods/11a9c950-955d-4afd-88c9-9c705ef619b6/volumes" Mar 14 08:58:51 crc kubenswrapper[4886]: 
I0314 08:58:51.439039 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7c9025b-c8dc-498d-a0ea-b8770e22af45" path="/var/lib/kubelet/pods/d7c9025b-c8dc-498d-a0ea-b8770e22af45/volumes" Mar 14 08:58:55 crc kubenswrapper[4886]: I0314 08:58:55.035888 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-jfvv5"] Mar 14 08:58:55 crc kubenswrapper[4886]: I0314 08:58:55.044187 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-jfvv5"] Mar 14 08:58:55 crc kubenswrapper[4886]: I0314 08:58:55.442614 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672" path="/var/lib/kubelet/pods/0a59e6f3-fd54-43d4-8d4b-bdf8dbcf4672/volumes" Mar 14 08:59:00 crc kubenswrapper[4886]: I0314 08:59:00.421718 4886 scope.go:117] "RemoveContainer" containerID="7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc" Mar 14 08:59:01 crc kubenswrapper[4886]: I0314 08:59:01.341891 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerStarted","Data":"8d76383c767d4d7cb6a175fd55c92a6a7210c3d14dfdddef7efb953dd6c3ec5b"} Mar 14 08:59:20 crc kubenswrapper[4886]: I0314 08:59:20.037704 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-shtsx"] Mar 14 08:59:20 crc kubenswrapper[4886]: I0314 08:59:20.058322 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-ddvrp"] Mar 14 08:59:20 crc kubenswrapper[4886]: I0314 08:59:20.069185 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-shtsx"] Mar 14 08:59:20 crc kubenswrapper[4886]: I0314 08:59:20.078881 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-ddvrp"] Mar 14 08:59:20 crc kubenswrapper[4886]: I0314 
08:59:20.087140 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-daf3-account-create-update-zh48j"] Mar 14 08:59:20 crc kubenswrapper[4886]: I0314 08:59:20.099323 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-daf3-account-create-update-zh48j"] Mar 14 08:59:20 crc kubenswrapper[4886]: I0314 08:59:20.110812 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-27cd-account-create-update-hpqtv"] Mar 14 08:59:20 crc kubenswrapper[4886]: I0314 08:59:20.119475 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-27cd-account-create-update-hpqtv"] Mar 14 08:59:20 crc kubenswrapper[4886]: I0314 08:59:20.127323 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-2a02-account-create-update-ltf9w"] Mar 14 08:59:20 crc kubenswrapper[4886]: I0314 08:59:20.137039 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-2a02-account-create-update-ltf9w"] Mar 14 08:59:21 crc kubenswrapper[4886]: I0314 08:59:21.040300 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-r8f78"] Mar 14 08:59:21 crc kubenswrapper[4886]: I0314 08:59:21.050084 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-r8f78"] Mar 14 08:59:21 crc kubenswrapper[4886]: I0314 08:59:21.458498 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34853a4a-c21e-4d80-ad6a-b2af27041d14" path="/var/lib/kubelet/pods/34853a4a-c21e-4d80-ad6a-b2af27041d14/volumes" Mar 14 08:59:21 crc kubenswrapper[4886]: I0314 08:59:21.460767 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5" path="/var/lib/kubelet/pods/3cb4df3e-9c36-4fbd-9e29-079f3e1bddb5/volumes" Mar 14 08:59:21 crc kubenswrapper[4886]: I0314 08:59:21.461741 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="542f6299-441b-4884-8ffb-2ea8b3c89e73" path="/var/lib/kubelet/pods/542f6299-441b-4884-8ffb-2ea8b3c89e73/volumes" Mar 14 08:59:21 crc kubenswrapper[4886]: I0314 08:59:21.462561 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99c6cc9d-015a-4e33-8ded-c912cb52dde2" path="/var/lib/kubelet/pods/99c6cc9d-015a-4e33-8ded-c912cb52dde2/volumes" Mar 14 08:59:21 crc kubenswrapper[4886]: I0314 08:59:21.464253 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eda99471-604e-402e-b068-82d6c2269f2b" path="/var/lib/kubelet/pods/eda99471-604e-402e-b068-82d6c2269f2b/volumes" Mar 14 08:59:21 crc kubenswrapper[4886]: I0314 08:59:21.465082 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f59672dc-f49b-4bd3-aa93-f0161ac73cfd" path="/var/lib/kubelet/pods/f59672dc-f49b-4bd3-aa93-f0161ac73cfd/volumes" Mar 14 08:59:22 crc kubenswrapper[4886]: I0314 08:59:22.034217 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-w889t"] Mar 14 08:59:22 crc kubenswrapper[4886]: I0314 08:59:22.044991 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-w889t"] Mar 14 08:59:23 crc kubenswrapper[4886]: I0314 08:59:23.433407 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2be4cce3-ae51-4d07-a9d9-ccc6152774b5" path="/var/lib/kubelet/pods/2be4cce3-ae51-4d07-a9d9-ccc6152774b5/volumes" Mar 14 08:59:27 crc kubenswrapper[4886]: I0314 08:59:27.045433 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-jgb85"] Mar 14 08:59:27 crc kubenswrapper[4886]: I0314 08:59:27.057580 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-rc2gb"] Mar 14 08:59:27 crc kubenswrapper[4886]: I0314 08:59:27.069574 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-jgb85"] Mar 14 08:59:27 crc kubenswrapper[4886]: I0314 08:59:27.080025 4886 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/keystone-db-sync-rc2gb"] Mar 14 08:59:27 crc kubenswrapper[4886]: I0314 08:59:27.431734 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9083ef8e-f321-4442-871b-c82f908bd073" path="/var/lib/kubelet/pods/9083ef8e-f321-4442-871b-c82f908bd073/volumes" Mar 14 08:59:27 crc kubenswrapper[4886]: I0314 08:59:27.432674 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb94c19d-031e-44b6-bdaa-39141d037b36" path="/var/lib/kubelet/pods/fb94c19d-031e-44b6-bdaa-39141d037b36/volumes" Mar 14 08:59:32 crc kubenswrapper[4886]: I0314 08:59:32.481824 4886 scope.go:117] "RemoveContainer" containerID="b822ed51869e5d780121065f8994b9261798e74731ec04f6b70a39cea9109454" Mar 14 08:59:32 crc kubenswrapper[4886]: I0314 08:59:32.512583 4886 scope.go:117] "RemoveContainer" containerID="fc19ed93114fc4349e21a148501e031903bc6d0724eb4aebbd13c07f631f53d9" Mar 14 08:59:32 crc kubenswrapper[4886]: I0314 08:59:32.552396 4886 scope.go:117] "RemoveContainer" containerID="b02ab9aa84b747b709acf832297faf86b3fe7352405d64c6d17b0e2993f27111" Mar 14 08:59:32 crc kubenswrapper[4886]: I0314 08:59:32.594111 4886 scope.go:117] "RemoveContainer" containerID="ff4d8d568b7cf3b962d01a45193d39fa29d6735102596fcc22da158ec0ffdd03" Mar 14 08:59:32 crc kubenswrapper[4886]: I0314 08:59:32.641893 4886 scope.go:117] "RemoveContainer" containerID="b30e8e2922a2d7dd59a8f2f469ac02d13785fdbe0a4dee671faf9451e96fdd08" Mar 14 08:59:32 crc kubenswrapper[4886]: I0314 08:59:32.700266 4886 scope.go:117] "RemoveContainer" containerID="1e47481845c53f199577f7274a38447001a233e5b7cc38957657c9fb9acfa05a" Mar 14 08:59:32 crc kubenswrapper[4886]: I0314 08:59:32.754557 4886 scope.go:117] "RemoveContainer" containerID="dd729342a3a7ad6f9543afe850b665d6ec185d8c478f4e2acbf0738c9d477c7c" Mar 14 08:59:32 crc kubenswrapper[4886]: I0314 08:59:32.775711 4886 scope.go:117] "RemoveContainer" containerID="d32413d0105f04819935ec902c232e262b47516499fbec9509d64b3f38ea0ff8" Mar 
14 08:59:32 crc kubenswrapper[4886]: I0314 08:59:32.791878 4886 scope.go:117] "RemoveContainer" containerID="5eb01f83f18094ce0660e54bf4941eb3644f84e36a881c2d3515923b80734bef" Mar 14 08:59:32 crc kubenswrapper[4886]: I0314 08:59:32.817829 4886 scope.go:117] "RemoveContainer" containerID="fcd41b14fe79d4b69db16593dc87b657f2dc4f2358744e06441c38f9605dc34b" Mar 14 08:59:32 crc kubenswrapper[4886]: I0314 08:59:32.839212 4886 scope.go:117] "RemoveContainer" containerID="361605668fc2a6824295f18cabd88f91abd723bb12a6c95affd9b0cf92f83360" Mar 14 08:59:32 crc kubenswrapper[4886]: I0314 08:59:32.857738 4886 scope.go:117] "RemoveContainer" containerID="74e82094bd3d4b22c426fe08c8f80e64ea8c87015831dcc39602593851b4e016" Mar 14 08:59:32 crc kubenswrapper[4886]: I0314 08:59:32.880638 4886 scope.go:117] "RemoveContainer" containerID="f3c48b74802dceb2f9e91de28bc034fc5889de66fc2c84c993c2d754440871f0" Mar 14 08:59:32 crc kubenswrapper[4886]: I0314 08:59:32.899868 4886 scope.go:117] "RemoveContainer" containerID="5559e862a0e46514ce1527a1f0373e06468fa73279022e23a727ac1ba07c5de3" Mar 14 08:59:32 crc kubenswrapper[4886]: I0314 08:59:32.919723 4886 scope.go:117] "RemoveContainer" containerID="0456786fbe734adab210e7496efcdba51547157da386d8517e5fb3fede2a8e24" Mar 14 08:59:32 crc kubenswrapper[4886]: I0314 08:59:32.938405 4886 scope.go:117] "RemoveContainer" containerID="299e8007ec0ea2617cb03f2e9e5ec10edbfe3ca69ac94163cc4da94555aac5a6" Mar 14 08:59:32 crc kubenswrapper[4886]: I0314 08:59:32.954067 4886 scope.go:117] "RemoveContainer" containerID="3ba7363b5c1dda9754d6898345afb2bb662e9ecad34ee2f614d53bddb412141e" Mar 14 08:59:32 crc kubenswrapper[4886]: I0314 08:59:32.989568 4886 scope.go:117] "RemoveContainer" containerID="519858d0438c006aa8bb843bd113095c569f5380af74f59082ef80b372084f76" Mar 14 08:59:33 crc kubenswrapper[4886]: I0314 08:59:33.008660 4886 scope.go:117] "RemoveContainer" containerID="b33664b6d7ee1992fd148aa80d13d98502abcdea85b1431c2b1ef2a2fea883a7" Mar 14 08:59:33 crc 
kubenswrapper[4886]: I0314 08:59:33.026808 4886 scope.go:117] "RemoveContainer" containerID="0a4cfdd70492f8086340720be8ee6581d4a9cf019f2bdb84200efe061c2280a2" Mar 14 08:59:33 crc kubenswrapper[4886]: I0314 08:59:33.048013 4886 scope.go:117] "RemoveContainer" containerID="3a8dbbf717b472cf7fbeafd77663058b0413e2d4a420e4b1b2049a6a67ccbd8d" Mar 14 08:59:53 crc kubenswrapper[4886]: I0314 08:59:53.523431 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7ht2z"] Mar 14 08:59:53 crc kubenswrapper[4886]: I0314 08:59:53.525801 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7ht2z" Mar 14 08:59:53 crc kubenswrapper[4886]: I0314 08:59:53.553778 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7ht2z"] Mar 14 08:59:53 crc kubenswrapper[4886]: I0314 08:59:53.597396 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8k4r\" (UniqueName: \"kubernetes.io/projected/bde3a6aa-b617-4daf-b088-de68d5ec9c23-kube-api-access-g8k4r\") pod \"community-operators-7ht2z\" (UID: \"bde3a6aa-b617-4daf-b088-de68d5ec9c23\") " pod="openshift-marketplace/community-operators-7ht2z" Mar 14 08:59:53 crc kubenswrapper[4886]: I0314 08:59:53.597582 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde3a6aa-b617-4daf-b088-de68d5ec9c23-catalog-content\") pod \"community-operators-7ht2z\" (UID: \"bde3a6aa-b617-4daf-b088-de68d5ec9c23\") " pod="openshift-marketplace/community-operators-7ht2z" Mar 14 08:59:53 crc kubenswrapper[4886]: I0314 08:59:53.597607 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde3a6aa-b617-4daf-b088-de68d5ec9c23-utilities\") pod 
\"community-operators-7ht2z\" (UID: \"bde3a6aa-b617-4daf-b088-de68d5ec9c23\") " pod="openshift-marketplace/community-operators-7ht2z" Mar 14 08:59:53 crc kubenswrapper[4886]: I0314 08:59:53.699329 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde3a6aa-b617-4daf-b088-de68d5ec9c23-catalog-content\") pod \"community-operators-7ht2z\" (UID: \"bde3a6aa-b617-4daf-b088-de68d5ec9c23\") " pod="openshift-marketplace/community-operators-7ht2z" Mar 14 08:59:53 crc kubenswrapper[4886]: I0314 08:59:53.699380 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde3a6aa-b617-4daf-b088-de68d5ec9c23-utilities\") pod \"community-operators-7ht2z\" (UID: \"bde3a6aa-b617-4daf-b088-de68d5ec9c23\") " pod="openshift-marketplace/community-operators-7ht2z" Mar 14 08:59:53 crc kubenswrapper[4886]: I0314 08:59:53.699503 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8k4r\" (UniqueName: \"kubernetes.io/projected/bde3a6aa-b617-4daf-b088-de68d5ec9c23-kube-api-access-g8k4r\") pod \"community-operators-7ht2z\" (UID: \"bde3a6aa-b617-4daf-b088-de68d5ec9c23\") " pod="openshift-marketplace/community-operators-7ht2z" Mar 14 08:59:53 crc kubenswrapper[4886]: I0314 08:59:53.699996 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde3a6aa-b617-4daf-b088-de68d5ec9c23-catalog-content\") pod \"community-operators-7ht2z\" (UID: \"bde3a6aa-b617-4daf-b088-de68d5ec9c23\") " pod="openshift-marketplace/community-operators-7ht2z" Mar 14 08:59:53 crc kubenswrapper[4886]: I0314 08:59:53.700035 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde3a6aa-b617-4daf-b088-de68d5ec9c23-utilities\") pod \"community-operators-7ht2z\" (UID: 
\"bde3a6aa-b617-4daf-b088-de68d5ec9c23\") " pod="openshift-marketplace/community-operators-7ht2z" Mar 14 08:59:53 crc kubenswrapper[4886]: I0314 08:59:53.719205 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8k4r\" (UniqueName: \"kubernetes.io/projected/bde3a6aa-b617-4daf-b088-de68d5ec9c23-kube-api-access-g8k4r\") pod \"community-operators-7ht2z\" (UID: \"bde3a6aa-b617-4daf-b088-de68d5ec9c23\") " pod="openshift-marketplace/community-operators-7ht2z" Mar 14 08:59:53 crc kubenswrapper[4886]: I0314 08:59:53.847193 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7ht2z" Mar 14 08:59:54 crc kubenswrapper[4886]: I0314 08:59:54.436464 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7ht2z"] Mar 14 08:59:54 crc kubenswrapper[4886]: I0314 08:59:54.898388 4886 generic.go:334] "Generic (PLEG): container finished" podID="bde3a6aa-b617-4daf-b088-de68d5ec9c23" containerID="d067f9108600ac3ce29d1ed4a15ebed8d205ae14ae8854421db53e075835d0e9" exitCode=0 Mar 14 08:59:54 crc kubenswrapper[4886]: I0314 08:59:54.898492 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7ht2z" event={"ID":"bde3a6aa-b617-4daf-b088-de68d5ec9c23","Type":"ContainerDied","Data":"d067f9108600ac3ce29d1ed4a15ebed8d205ae14ae8854421db53e075835d0e9"} Mar 14 08:59:54 crc kubenswrapper[4886]: I0314 08:59:54.898711 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7ht2z" event={"ID":"bde3a6aa-b617-4daf-b088-de68d5ec9c23","Type":"ContainerStarted","Data":"70683696017f0f9fd9c558ecb1f2d5249876b1478c9017059bf252337939d513"} Mar 14 08:59:56 crc kubenswrapper[4886]: I0314 08:59:56.921836 4886 generic.go:334] "Generic (PLEG): container finished" podID="bde3a6aa-b617-4daf-b088-de68d5ec9c23" 
containerID="abd8c5d6de9385c027c46d59c48a1b02c03caa1740b5dfefb71efa035c89ed79" exitCode=0 Mar 14 08:59:56 crc kubenswrapper[4886]: I0314 08:59:56.921903 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7ht2z" event={"ID":"bde3a6aa-b617-4daf-b088-de68d5ec9c23","Type":"ContainerDied","Data":"abd8c5d6de9385c027c46d59c48a1b02c03caa1740b5dfefb71efa035c89ed79"} Mar 14 08:59:57 crc kubenswrapper[4886]: I0314 08:59:57.934598 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7ht2z" event={"ID":"bde3a6aa-b617-4daf-b088-de68d5ec9c23","Type":"ContainerStarted","Data":"990716f4663328d952536e11ff51778b49a28bf04250f35641eb70dd4d3305ab"} Mar 14 08:59:57 crc kubenswrapper[4886]: I0314 08:59:57.952624 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7ht2z" podStartSLOduration=2.527308609 podStartE2EDuration="4.952606558s" podCreationTimestamp="2026-03-14 08:59:53 +0000 UTC" firstStartedPulling="2026-03-14 08:59:54.900105016 +0000 UTC m=+1930.148556653" lastFinishedPulling="2026-03-14 08:59:57.325402965 +0000 UTC m=+1932.573854602" observedRunningTime="2026-03-14 08:59:57.950399875 +0000 UTC m=+1933.198851522" watchObservedRunningTime="2026-03-14 08:59:57.952606558 +0000 UTC m=+1933.201058195" Mar 14 08:59:58 crc kubenswrapper[4886]: I0314 08:59:58.951171 4886 generic.go:334] "Generic (PLEG): container finished" podID="d6cbe588-9aee-4554-b985-c809186e86d9" containerID="5d647916a372ae8d5ef8aad1fe292a39f539ce7f5e3e3eee10bf2735f979df8b" exitCode=0 Mar 14 08:59:58 crc kubenswrapper[4886]: I0314 08:59:58.951247 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv" event={"ID":"d6cbe588-9aee-4554-b985-c809186e86d9","Type":"ContainerDied","Data":"5d647916a372ae8d5ef8aad1fe292a39f539ce7f5e3e3eee10bf2735f979df8b"} Mar 14 08:59:59 crc 
kubenswrapper[4886]: I0314 08:59:59.043256 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-sbjqr"] Mar 14 08:59:59 crc kubenswrapper[4886]: I0314 08:59:59.051612 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-sbjqr"] Mar 14 08:59:59 crc kubenswrapper[4886]: I0314 08:59:59.431387 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="386f7c41-cb62-4ff1-bef7-11e4e8b14707" path="/var/lib/kubelet/pods/386f7c41-cb62-4ff1-bef7-11e4e8b14707/volumes" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.156393 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557980-6cddc"] Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.159048 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-6cddc" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.162626 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.169283 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.179859 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557980-4tvv2"] Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.182250 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557980-4tvv2" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.191265 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.192631 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.193290 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.200434 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557980-4tvv2"] Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.226806 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557980-6cddc"] Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.250898 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3c9a265-82e9-4726-bda9-f6c6111dc1dc-config-volume\") pod \"collect-profiles-29557980-6cddc\" (UID: \"e3c9a265-82e9-4726-bda9-f6c6111dc1dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-6cddc" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.251303 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfsgq\" (UniqueName: \"kubernetes.io/projected/e154ca81-98b6-4a74-b011-41b224074571-kube-api-access-dfsgq\") pod \"auto-csr-approver-29557980-4tvv2\" (UID: \"e154ca81-98b6-4a74-b011-41b224074571\") " pod="openshift-infra/auto-csr-approver-29557980-4tvv2" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.251807 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3c9a265-82e9-4726-bda9-f6c6111dc1dc-secret-volume\") pod \"collect-profiles-29557980-6cddc\" (UID: \"e3c9a265-82e9-4726-bda9-f6c6111dc1dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-6cddc" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.252040 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zhqf\" (UniqueName: \"kubernetes.io/projected/e3c9a265-82e9-4726-bda9-f6c6111dc1dc-kube-api-access-2zhqf\") pod \"collect-profiles-29557980-6cddc\" (UID: \"e3c9a265-82e9-4726-bda9-f6c6111dc1dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-6cddc" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.354269 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3c9a265-82e9-4726-bda9-f6c6111dc1dc-config-volume\") pod \"collect-profiles-29557980-6cddc\" (UID: \"e3c9a265-82e9-4726-bda9-f6c6111dc1dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-6cddc" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.354331 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfsgq\" (UniqueName: \"kubernetes.io/projected/e154ca81-98b6-4a74-b011-41b224074571-kube-api-access-dfsgq\") pod \"auto-csr-approver-29557980-4tvv2\" (UID: \"e154ca81-98b6-4a74-b011-41b224074571\") " pod="openshift-infra/auto-csr-approver-29557980-4tvv2" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.354461 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3c9a265-82e9-4726-bda9-f6c6111dc1dc-secret-volume\") pod \"collect-profiles-29557980-6cddc\" (UID: \"e3c9a265-82e9-4726-bda9-f6c6111dc1dc\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-6cddc" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.354510 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zhqf\" (UniqueName: \"kubernetes.io/projected/e3c9a265-82e9-4726-bda9-f6c6111dc1dc-kube-api-access-2zhqf\") pod \"collect-profiles-29557980-6cddc\" (UID: \"e3c9a265-82e9-4726-bda9-f6c6111dc1dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-6cddc" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.356399 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3c9a265-82e9-4726-bda9-f6c6111dc1dc-config-volume\") pod \"collect-profiles-29557980-6cddc\" (UID: \"e3c9a265-82e9-4726-bda9-f6c6111dc1dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-6cddc" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.369029 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3c9a265-82e9-4726-bda9-f6c6111dc1dc-secret-volume\") pod \"collect-profiles-29557980-6cddc\" (UID: \"e3c9a265-82e9-4726-bda9-f6c6111dc1dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-6cddc" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.374141 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfsgq\" (UniqueName: \"kubernetes.io/projected/e154ca81-98b6-4a74-b011-41b224074571-kube-api-access-dfsgq\") pod \"auto-csr-approver-29557980-4tvv2\" (UID: \"e154ca81-98b6-4a74-b011-41b224074571\") " pod="openshift-infra/auto-csr-approver-29557980-4tvv2" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.377961 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zhqf\" (UniqueName: 
\"kubernetes.io/projected/e3c9a265-82e9-4726-bda9-f6c6111dc1dc-kube-api-access-2zhqf\") pod \"collect-profiles-29557980-6cddc\" (UID: \"e3c9a265-82e9-4726-bda9-f6c6111dc1dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-6cddc" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.465856 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.495744 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-6cddc" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.536780 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557980-4tvv2" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.561266 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6cbe588-9aee-4554-b985-c809186e86d9-inventory\") pod \"d6cbe588-9aee-4554-b985-c809186e86d9\" (UID: \"d6cbe588-9aee-4554-b985-c809186e86d9\") " Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.565340 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbtfk\" (UniqueName: \"kubernetes.io/projected/d6cbe588-9aee-4554-b985-c809186e86d9-kube-api-access-hbtfk\") pod \"d6cbe588-9aee-4554-b985-c809186e86d9\" (UID: \"d6cbe588-9aee-4554-b985-c809186e86d9\") " Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.565405 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d6cbe588-9aee-4554-b985-c809186e86d9-ssh-key-openstack-edpm-ipam\") pod \"d6cbe588-9aee-4554-b985-c809186e86d9\" (UID: \"d6cbe588-9aee-4554-b985-c809186e86d9\") " Mar 14 09:00:00 crc 
kubenswrapper[4886]: I0314 09:00:00.574417 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6cbe588-9aee-4554-b985-c809186e86d9-kube-api-access-hbtfk" (OuterVolumeSpecName: "kube-api-access-hbtfk") pod "d6cbe588-9aee-4554-b985-c809186e86d9" (UID: "d6cbe588-9aee-4554-b985-c809186e86d9"). InnerVolumeSpecName "kube-api-access-hbtfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.597692 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6cbe588-9aee-4554-b985-c809186e86d9-inventory" (OuterVolumeSpecName: "inventory") pod "d6cbe588-9aee-4554-b985-c809186e86d9" (UID: "d6cbe588-9aee-4554-b985-c809186e86d9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.599778 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6cbe588-9aee-4554-b985-c809186e86d9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d6cbe588-9aee-4554-b985-c809186e86d9" (UID: "d6cbe588-9aee-4554-b985-c809186e86d9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.669244 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbtfk\" (UniqueName: \"kubernetes.io/projected/d6cbe588-9aee-4554-b985-c809186e86d9-kube-api-access-hbtfk\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.669301 4886 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d6cbe588-9aee-4554-b985-c809186e86d9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.669323 4886 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6cbe588-9aee-4554-b985-c809186e86d9-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.976795 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv" event={"ID":"d6cbe588-9aee-4554-b985-c809186e86d9","Type":"ContainerDied","Data":"b2b0c29c4b3a2df4497e4b16e941ae6b0b03a0d268073e6202438c5688a9e87a"} Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.976847 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2b0c29c4b3a2df4497e4b16e941ae6b0b03a0d268073e6202438c5688a9e87a" Mar 14 09:00:00 crc kubenswrapper[4886]: I0314 09:00:00.976985 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv" Mar 14 09:00:01 crc kubenswrapper[4886]: I0314 09:00:01.045222 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557980-6cddc"] Mar 14 09:00:01 crc kubenswrapper[4886]: W0314 09:00:01.049687 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode154ca81_98b6_4a74_b011_41b224074571.slice/crio-a5458a0657a357ca30f590eac2c887d818745ef5d503979703b1701c1570f7f2 WatchSource:0}: Error finding container a5458a0657a357ca30f590eac2c887d818745ef5d503979703b1701c1570f7f2: Status 404 returned error can't find the container with id a5458a0657a357ca30f590eac2c887d818745ef5d503979703b1701c1570f7f2 Mar 14 09:00:01 crc kubenswrapper[4886]: I0314 09:00:01.056480 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557980-4tvv2"] Mar 14 09:00:01 crc kubenswrapper[4886]: I0314 09:00:01.093738 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4"] Mar 14 09:00:01 crc kubenswrapper[4886]: E0314 09:00:01.094382 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6cbe588-9aee-4554-b985-c809186e86d9" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 14 09:00:01 crc kubenswrapper[4886]: I0314 09:00:01.094406 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6cbe588-9aee-4554-b985-c809186e86d9" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 14 09:00:01 crc kubenswrapper[4886]: I0314 09:00:01.094744 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6cbe588-9aee-4554-b985-c809186e86d9" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 14 09:00:01 crc kubenswrapper[4886]: I0314 09:00:01.095824 4886 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4" Mar 14 09:00:01 crc kubenswrapper[4886]: I0314 09:00:01.099761 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftkvj" Mar 14 09:00:01 crc kubenswrapper[4886]: I0314 09:00:01.099996 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 09:00:01 crc kubenswrapper[4886]: I0314 09:00:01.100360 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 09:00:01 crc kubenswrapper[4886]: I0314 09:00:01.100564 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 09:00:01 crc kubenswrapper[4886]: I0314 09:00:01.121080 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4"] Mar 14 09:00:01 crc kubenswrapper[4886]: I0314 09:00:01.179066 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e26f4f17-b548-45cf-8781-058e7b1787d0-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4\" (UID: \"e26f4f17-b548-45cf-8781-058e7b1787d0\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4" Mar 14 09:00:01 crc kubenswrapper[4886]: I0314 09:00:01.179532 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5srwc\" (UniqueName: \"kubernetes.io/projected/e26f4f17-b548-45cf-8781-058e7b1787d0-kube-api-access-5srwc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4\" (UID: \"e26f4f17-b548-45cf-8781-058e7b1787d0\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4" Mar 14 09:00:01 crc 
kubenswrapper[4886]: I0314 09:00:01.179598 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e26f4f17-b548-45cf-8781-058e7b1787d0-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4\" (UID: \"e26f4f17-b548-45cf-8781-058e7b1787d0\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4" Mar 14 09:00:01 crc kubenswrapper[4886]: I0314 09:00:01.284991 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e26f4f17-b548-45cf-8781-058e7b1787d0-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4\" (UID: \"e26f4f17-b548-45cf-8781-058e7b1787d0\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4" Mar 14 09:00:01 crc kubenswrapper[4886]: I0314 09:00:01.285190 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5srwc\" (UniqueName: \"kubernetes.io/projected/e26f4f17-b548-45cf-8781-058e7b1787d0-kube-api-access-5srwc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4\" (UID: \"e26f4f17-b548-45cf-8781-058e7b1787d0\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4" Mar 14 09:00:01 crc kubenswrapper[4886]: I0314 09:00:01.285275 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e26f4f17-b548-45cf-8781-058e7b1787d0-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4\" (UID: \"e26f4f17-b548-45cf-8781-058e7b1787d0\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4" Mar 14 09:00:01 crc kubenswrapper[4886]: I0314 09:00:01.293955 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e26f4f17-b548-45cf-8781-058e7b1787d0-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4\" (UID: \"e26f4f17-b548-45cf-8781-058e7b1787d0\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4" Mar 14 09:00:01 crc kubenswrapper[4886]: I0314 09:00:01.294268 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e26f4f17-b548-45cf-8781-058e7b1787d0-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4\" (UID: \"e26f4f17-b548-45cf-8781-058e7b1787d0\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4" Mar 14 09:00:01 crc kubenswrapper[4886]: I0314 09:00:01.305155 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5srwc\" (UniqueName: \"kubernetes.io/projected/e26f4f17-b548-45cf-8781-058e7b1787d0-kube-api-access-5srwc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4\" (UID: \"e26f4f17-b548-45cf-8781-058e7b1787d0\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4" Mar 14 09:00:01 crc kubenswrapper[4886]: I0314 09:00:01.434769 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4" Mar 14 09:00:01 crc kubenswrapper[4886]: I0314 09:00:01.987302 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557980-4tvv2" event={"ID":"e154ca81-98b6-4a74-b011-41b224074571","Type":"ContainerStarted","Data":"a5458a0657a357ca30f590eac2c887d818745ef5d503979703b1701c1570f7f2"} Mar 14 09:00:01 crc kubenswrapper[4886]: I0314 09:00:01.989029 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-6cddc" event={"ID":"e3c9a265-82e9-4726-bda9-f6c6111dc1dc","Type":"ContainerStarted","Data":"973350fda5424a10b0f4c4bb8ac5973c1b6c73e577bfc8465011f28d3db4f4f6"} Mar 14 09:00:01 crc kubenswrapper[4886]: I0314 09:00:01.989202 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-6cddc" event={"ID":"e3c9a265-82e9-4726-bda9-f6c6111dc1dc","Type":"ContainerStarted","Data":"8374d3261a14f0e64978c8f56d1d74dfbc25715396af66c83c221b2967f0afd2"} Mar 14 09:00:02 crc kubenswrapper[4886]: I0314 09:00:02.012953 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-6cddc" podStartSLOduration=2.012922242 podStartE2EDuration="2.012922242s" podCreationTimestamp="2026-03-14 09:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:00:02.011410418 +0000 UTC m=+1937.259862055" watchObservedRunningTime="2026-03-14 09:00:02.012922242 +0000 UTC m=+1937.261373899" Mar 14 09:00:02 crc kubenswrapper[4886]: I0314 09:00:02.218283 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4"] Mar 14 09:00:02 crc kubenswrapper[4886]: I0314 09:00:02.999138 4886 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4" event={"ID":"e26f4f17-b548-45cf-8781-058e7b1787d0","Type":"ContainerStarted","Data":"a2a0587301a70b68877ba59a7f8cd424ebe553ffaaf9497ed8b809f61cb5dee5"} Mar 14 09:00:03 crc kubenswrapper[4886]: I0314 09:00:03.001030 4886 generic.go:334] "Generic (PLEG): container finished" podID="e3c9a265-82e9-4726-bda9-f6c6111dc1dc" containerID="973350fda5424a10b0f4c4bb8ac5973c1b6c73e577bfc8465011f28d3db4f4f6" exitCode=0 Mar 14 09:00:03 crc kubenswrapper[4886]: I0314 09:00:03.001061 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-6cddc" event={"ID":"e3c9a265-82e9-4726-bda9-f6c6111dc1dc","Type":"ContainerDied","Data":"973350fda5424a10b0f4c4bb8ac5973c1b6c73e577bfc8465011f28d3db4f4f6"} Mar 14 09:00:03 crc kubenswrapper[4886]: I0314 09:00:03.847844 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7ht2z" Mar 14 09:00:03 crc kubenswrapper[4886]: I0314 09:00:03.848298 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7ht2z" Mar 14 09:00:03 crc kubenswrapper[4886]: I0314 09:00:03.898608 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7ht2z" Mar 14 09:00:04 crc kubenswrapper[4886]: I0314 09:00:04.018445 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4" event={"ID":"e26f4f17-b548-45cf-8781-058e7b1787d0","Type":"ContainerStarted","Data":"f2e56cd0b8a2a4e4f17441d13c7f57b200a34f0877049ee4b55d77b3ec8dce75"} Mar 14 09:00:04 crc kubenswrapper[4886]: I0314 09:00:04.047799 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4" 
podStartSLOduration=2.435926713 podStartE2EDuration="3.047780108s" podCreationTimestamp="2026-03-14 09:00:01 +0000 UTC" firstStartedPulling="2026-03-14 09:00:02.224684099 +0000 UTC m=+1937.473135736" lastFinishedPulling="2026-03-14 09:00:02.836537494 +0000 UTC m=+1938.084989131" observedRunningTime="2026-03-14 09:00:04.036423595 +0000 UTC m=+1939.284875232" watchObservedRunningTime="2026-03-14 09:00:04.047780108 +0000 UTC m=+1939.296231745" Mar 14 09:00:04 crc kubenswrapper[4886]: I0314 09:00:04.074801 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7ht2z" Mar 14 09:00:04 crc kubenswrapper[4886]: I0314 09:00:04.140639 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7ht2z"] Mar 14 09:00:04 crc kubenswrapper[4886]: I0314 09:00:04.383986 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-6cddc" Mar 14 09:00:04 crc kubenswrapper[4886]: I0314 09:00:04.473669 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zhqf\" (UniqueName: \"kubernetes.io/projected/e3c9a265-82e9-4726-bda9-f6c6111dc1dc-kube-api-access-2zhqf\") pod \"e3c9a265-82e9-4726-bda9-f6c6111dc1dc\" (UID: \"e3c9a265-82e9-4726-bda9-f6c6111dc1dc\") " Mar 14 09:00:04 crc kubenswrapper[4886]: I0314 09:00:04.473880 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3c9a265-82e9-4726-bda9-f6c6111dc1dc-secret-volume\") pod \"e3c9a265-82e9-4726-bda9-f6c6111dc1dc\" (UID: \"e3c9a265-82e9-4726-bda9-f6c6111dc1dc\") " Mar 14 09:00:04 crc kubenswrapper[4886]: I0314 09:00:04.474025 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/e3c9a265-82e9-4726-bda9-f6c6111dc1dc-config-volume\") pod \"e3c9a265-82e9-4726-bda9-f6c6111dc1dc\" (UID: \"e3c9a265-82e9-4726-bda9-f6c6111dc1dc\") " Mar 14 09:00:04 crc kubenswrapper[4886]: I0314 09:00:04.475638 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3c9a265-82e9-4726-bda9-f6c6111dc1dc-config-volume" (OuterVolumeSpecName: "config-volume") pod "e3c9a265-82e9-4726-bda9-f6c6111dc1dc" (UID: "e3c9a265-82e9-4726-bda9-f6c6111dc1dc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:00:04 crc kubenswrapper[4886]: I0314 09:00:04.481534 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3c9a265-82e9-4726-bda9-f6c6111dc1dc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e3c9a265-82e9-4726-bda9-f6c6111dc1dc" (UID: "e3c9a265-82e9-4726-bda9-f6c6111dc1dc"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:00:04 crc kubenswrapper[4886]: I0314 09:00:04.483453 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3c9a265-82e9-4726-bda9-f6c6111dc1dc-kube-api-access-2zhqf" (OuterVolumeSpecName: "kube-api-access-2zhqf") pod "e3c9a265-82e9-4726-bda9-f6c6111dc1dc" (UID: "e3c9a265-82e9-4726-bda9-f6c6111dc1dc"). InnerVolumeSpecName "kube-api-access-2zhqf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:00:04 crc kubenswrapper[4886]: I0314 09:00:04.576856 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zhqf\" (UniqueName: \"kubernetes.io/projected/e3c9a265-82e9-4726-bda9-f6c6111dc1dc-kube-api-access-2zhqf\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:04 crc kubenswrapper[4886]: I0314 09:00:04.576899 4886 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3c9a265-82e9-4726-bda9-f6c6111dc1dc-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:04 crc kubenswrapper[4886]: I0314 09:00:04.576913 4886 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3c9a265-82e9-4726-bda9-f6c6111dc1dc-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:05 crc kubenswrapper[4886]: I0314 09:00:05.028611 4886 generic.go:334] "Generic (PLEG): container finished" podID="e154ca81-98b6-4a74-b011-41b224074571" containerID="f50949f54ab90687395fb8da3a725d15ae54c40be266f774fb0925981e9b53a6" exitCode=0 Mar 14 09:00:05 crc kubenswrapper[4886]: I0314 09:00:05.028669 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557980-4tvv2" event={"ID":"e154ca81-98b6-4a74-b011-41b224074571","Type":"ContainerDied","Data":"f50949f54ab90687395fb8da3a725d15ae54c40be266f774fb0925981e9b53a6"} Mar 14 09:00:05 crc kubenswrapper[4886]: I0314 09:00:05.031767 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-6cddc" event={"ID":"e3c9a265-82e9-4726-bda9-f6c6111dc1dc","Type":"ContainerDied","Data":"8374d3261a14f0e64978c8f56d1d74dfbc25715396af66c83c221b2967f0afd2"} Mar 14 09:00:05 crc kubenswrapper[4886]: I0314 09:00:05.031799 4886 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="8374d3261a14f0e64978c8f56d1d74dfbc25715396af66c83c221b2967f0afd2" Mar 14 09:00:05 crc kubenswrapper[4886]: I0314 09:00:05.031852 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557980-6cddc" Mar 14 09:00:06 crc kubenswrapper[4886]: I0314 09:00:06.048882 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7ht2z" podUID="bde3a6aa-b617-4daf-b088-de68d5ec9c23" containerName="registry-server" containerID="cri-o://990716f4663328d952536e11ff51778b49a28bf04250f35641eb70dd4d3305ab" gracePeriod=2 Mar 14 09:00:06 crc kubenswrapper[4886]: E0314 09:00:06.296286 4886 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbde3a6aa_b617_4daf_b088_de68d5ec9c23.slice/crio-conmon-990716f4663328d952536e11ff51778b49a28bf04250f35641eb70dd4d3305ab.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbde3a6aa_b617_4daf_b088_de68d5ec9c23.slice/crio-990716f4663328d952536e11ff51778b49a28bf04250f35641eb70dd4d3305ab.scope\": RecentStats: unable to find data in memory cache]" Mar 14 09:00:06 crc kubenswrapper[4886]: I0314 09:00:06.431733 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557980-4tvv2" Mar 14 09:00:06 crc kubenswrapper[4886]: I0314 09:00:06.521140 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfsgq\" (UniqueName: \"kubernetes.io/projected/e154ca81-98b6-4a74-b011-41b224074571-kube-api-access-dfsgq\") pod \"e154ca81-98b6-4a74-b011-41b224074571\" (UID: \"e154ca81-98b6-4a74-b011-41b224074571\") " Mar 14 09:00:06 crc kubenswrapper[4886]: I0314 09:00:06.579347 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e154ca81-98b6-4a74-b011-41b224074571-kube-api-access-dfsgq" (OuterVolumeSpecName: "kube-api-access-dfsgq") pod "e154ca81-98b6-4a74-b011-41b224074571" (UID: "e154ca81-98b6-4a74-b011-41b224074571"). InnerVolumeSpecName "kube-api-access-dfsgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:00:06 crc kubenswrapper[4886]: I0314 09:00:06.624334 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfsgq\" (UniqueName: \"kubernetes.io/projected/e154ca81-98b6-4a74-b011-41b224074571-kube-api-access-dfsgq\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:06 crc kubenswrapper[4886]: I0314 09:00:06.683849 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7ht2z" Mar 14 09:00:06 crc kubenswrapper[4886]: I0314 09:00:06.726259 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8k4r\" (UniqueName: \"kubernetes.io/projected/bde3a6aa-b617-4daf-b088-de68d5ec9c23-kube-api-access-g8k4r\") pod \"bde3a6aa-b617-4daf-b088-de68d5ec9c23\" (UID: \"bde3a6aa-b617-4daf-b088-de68d5ec9c23\") " Mar 14 09:00:06 crc kubenswrapper[4886]: I0314 09:00:06.726297 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde3a6aa-b617-4daf-b088-de68d5ec9c23-catalog-content\") pod \"bde3a6aa-b617-4daf-b088-de68d5ec9c23\" (UID: \"bde3a6aa-b617-4daf-b088-de68d5ec9c23\") " Mar 14 09:00:06 crc kubenswrapper[4886]: I0314 09:00:06.726349 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde3a6aa-b617-4daf-b088-de68d5ec9c23-utilities\") pod \"bde3a6aa-b617-4daf-b088-de68d5ec9c23\" (UID: \"bde3a6aa-b617-4daf-b088-de68d5ec9c23\") " Mar 14 09:00:06 crc kubenswrapper[4886]: I0314 09:00:06.727876 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde3a6aa-b617-4daf-b088-de68d5ec9c23-utilities" (OuterVolumeSpecName: "utilities") pod "bde3a6aa-b617-4daf-b088-de68d5ec9c23" (UID: "bde3a6aa-b617-4daf-b088-de68d5ec9c23"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:00:06 crc kubenswrapper[4886]: I0314 09:00:06.735392 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bde3a6aa-b617-4daf-b088-de68d5ec9c23-kube-api-access-g8k4r" (OuterVolumeSpecName: "kube-api-access-g8k4r") pod "bde3a6aa-b617-4daf-b088-de68d5ec9c23" (UID: "bde3a6aa-b617-4daf-b088-de68d5ec9c23"). InnerVolumeSpecName "kube-api-access-g8k4r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:00:06 crc kubenswrapper[4886]: I0314 09:00:06.784339 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde3a6aa-b617-4daf-b088-de68d5ec9c23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bde3a6aa-b617-4daf-b088-de68d5ec9c23" (UID: "bde3a6aa-b617-4daf-b088-de68d5ec9c23"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:00:06 crc kubenswrapper[4886]: I0314 09:00:06.829056 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8k4r\" (UniqueName: \"kubernetes.io/projected/bde3a6aa-b617-4daf-b088-de68d5ec9c23-kube-api-access-g8k4r\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:06 crc kubenswrapper[4886]: I0314 09:00:06.829091 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde3a6aa-b617-4daf-b088-de68d5ec9c23-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:06 crc kubenswrapper[4886]: I0314 09:00:06.829103 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde3a6aa-b617-4daf-b088-de68d5ec9c23-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:00:07 crc kubenswrapper[4886]: I0314 09:00:07.058291 4886 generic.go:334] "Generic (PLEG): container finished" podID="bde3a6aa-b617-4daf-b088-de68d5ec9c23" containerID="990716f4663328d952536e11ff51778b49a28bf04250f35641eb70dd4d3305ab" exitCode=0 Mar 14 09:00:07 crc kubenswrapper[4886]: I0314 09:00:07.058343 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7ht2z" event={"ID":"bde3a6aa-b617-4daf-b088-de68d5ec9c23","Type":"ContainerDied","Data":"990716f4663328d952536e11ff51778b49a28bf04250f35641eb70dd4d3305ab"} Mar 14 09:00:07 crc kubenswrapper[4886]: I0314 09:00:07.058377 4886 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-7ht2z" Mar 14 09:00:07 crc kubenswrapper[4886]: I0314 09:00:07.058406 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7ht2z" event={"ID":"bde3a6aa-b617-4daf-b088-de68d5ec9c23","Type":"ContainerDied","Data":"70683696017f0f9fd9c558ecb1f2d5249876b1478c9017059bf252337939d513"} Mar 14 09:00:07 crc kubenswrapper[4886]: I0314 09:00:07.058428 4886 scope.go:117] "RemoveContainer" containerID="990716f4663328d952536e11ff51778b49a28bf04250f35641eb70dd4d3305ab" Mar 14 09:00:07 crc kubenswrapper[4886]: I0314 09:00:07.059789 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557980-4tvv2" event={"ID":"e154ca81-98b6-4a74-b011-41b224074571","Type":"ContainerDied","Data":"a5458a0657a357ca30f590eac2c887d818745ef5d503979703b1701c1570f7f2"} Mar 14 09:00:07 crc kubenswrapper[4886]: I0314 09:00:07.059809 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5458a0657a357ca30f590eac2c887d818745ef5d503979703b1701c1570f7f2" Mar 14 09:00:07 crc kubenswrapper[4886]: I0314 09:00:07.059890 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557980-4tvv2" Mar 14 09:00:07 crc kubenswrapper[4886]: I0314 09:00:07.077880 4886 scope.go:117] "RemoveContainer" containerID="abd8c5d6de9385c027c46d59c48a1b02c03caa1740b5dfefb71efa035c89ed79" Mar 14 09:00:07 crc kubenswrapper[4886]: I0314 09:00:07.097174 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7ht2z"] Mar 14 09:00:07 crc kubenswrapper[4886]: I0314 09:00:07.104737 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7ht2z"] Mar 14 09:00:07 crc kubenswrapper[4886]: I0314 09:00:07.118695 4886 scope.go:117] "RemoveContainer" containerID="d067f9108600ac3ce29d1ed4a15ebed8d205ae14ae8854421db53e075835d0e9" Mar 14 09:00:07 crc kubenswrapper[4886]: I0314 09:00:07.138073 4886 scope.go:117] "RemoveContainer" containerID="990716f4663328d952536e11ff51778b49a28bf04250f35641eb70dd4d3305ab" Mar 14 09:00:07 crc kubenswrapper[4886]: E0314 09:00:07.138499 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"990716f4663328d952536e11ff51778b49a28bf04250f35641eb70dd4d3305ab\": container with ID starting with 990716f4663328d952536e11ff51778b49a28bf04250f35641eb70dd4d3305ab not found: ID does not exist" containerID="990716f4663328d952536e11ff51778b49a28bf04250f35641eb70dd4d3305ab" Mar 14 09:00:07 crc kubenswrapper[4886]: I0314 09:00:07.138530 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"990716f4663328d952536e11ff51778b49a28bf04250f35641eb70dd4d3305ab"} err="failed to get container status \"990716f4663328d952536e11ff51778b49a28bf04250f35641eb70dd4d3305ab\": rpc error: code = NotFound desc = could not find container \"990716f4663328d952536e11ff51778b49a28bf04250f35641eb70dd4d3305ab\": container with ID starting with 990716f4663328d952536e11ff51778b49a28bf04250f35641eb70dd4d3305ab not 
found: ID does not exist" Mar 14 09:00:07 crc kubenswrapper[4886]: I0314 09:00:07.138552 4886 scope.go:117] "RemoveContainer" containerID="abd8c5d6de9385c027c46d59c48a1b02c03caa1740b5dfefb71efa035c89ed79" Mar 14 09:00:07 crc kubenswrapper[4886]: E0314 09:00:07.138819 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abd8c5d6de9385c027c46d59c48a1b02c03caa1740b5dfefb71efa035c89ed79\": container with ID starting with abd8c5d6de9385c027c46d59c48a1b02c03caa1740b5dfefb71efa035c89ed79 not found: ID does not exist" containerID="abd8c5d6de9385c027c46d59c48a1b02c03caa1740b5dfefb71efa035c89ed79" Mar 14 09:00:07 crc kubenswrapper[4886]: I0314 09:00:07.138839 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abd8c5d6de9385c027c46d59c48a1b02c03caa1740b5dfefb71efa035c89ed79"} err="failed to get container status \"abd8c5d6de9385c027c46d59c48a1b02c03caa1740b5dfefb71efa035c89ed79\": rpc error: code = NotFound desc = could not find container \"abd8c5d6de9385c027c46d59c48a1b02c03caa1740b5dfefb71efa035c89ed79\": container with ID starting with abd8c5d6de9385c027c46d59c48a1b02c03caa1740b5dfefb71efa035c89ed79 not found: ID does not exist" Mar 14 09:00:07 crc kubenswrapper[4886]: I0314 09:00:07.138851 4886 scope.go:117] "RemoveContainer" containerID="d067f9108600ac3ce29d1ed4a15ebed8d205ae14ae8854421db53e075835d0e9" Mar 14 09:00:07 crc kubenswrapper[4886]: E0314 09:00:07.139267 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d067f9108600ac3ce29d1ed4a15ebed8d205ae14ae8854421db53e075835d0e9\": container with ID starting with d067f9108600ac3ce29d1ed4a15ebed8d205ae14ae8854421db53e075835d0e9 not found: ID does not exist" containerID="d067f9108600ac3ce29d1ed4a15ebed8d205ae14ae8854421db53e075835d0e9" Mar 14 09:00:07 crc kubenswrapper[4886]: I0314 09:00:07.139318 4886 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d067f9108600ac3ce29d1ed4a15ebed8d205ae14ae8854421db53e075835d0e9"} err="failed to get container status \"d067f9108600ac3ce29d1ed4a15ebed8d205ae14ae8854421db53e075835d0e9\": rpc error: code = NotFound desc = could not find container \"d067f9108600ac3ce29d1ed4a15ebed8d205ae14ae8854421db53e075835d0e9\": container with ID starting with d067f9108600ac3ce29d1ed4a15ebed8d205ae14ae8854421db53e075835d0e9 not found: ID does not exist" Mar 14 09:00:07 crc kubenswrapper[4886]: I0314 09:00:07.432775 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bde3a6aa-b617-4daf-b088-de68d5ec9c23" path="/var/lib/kubelet/pods/bde3a6aa-b617-4daf-b088-de68d5ec9c23/volumes" Mar 14 09:00:07 crc kubenswrapper[4886]: I0314 09:00:07.523944 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557974-947z8"] Mar 14 09:00:07 crc kubenswrapper[4886]: I0314 09:00:07.536823 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557974-947z8"] Mar 14 09:00:09 crc kubenswrapper[4886]: I0314 09:00:09.432586 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f1282d9-39e2-4cda-9431-a984056855f2" path="/var/lib/kubelet/pods/6f1282d9-39e2-4cda-9431-a984056855f2/volumes" Mar 14 09:00:14 crc kubenswrapper[4886]: I0314 09:00:14.037081 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-ctl2l"] Mar 14 09:00:14 crc kubenswrapper[4886]: I0314 09:00:14.045869 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-vwjh5"] Mar 14 09:00:14 crc kubenswrapper[4886]: I0314 09:00:14.054349 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-8wtzc"] Mar 14 09:00:14 crc kubenswrapper[4886]: I0314 09:00:14.062970 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-bootstrap-vwjh5"] Mar 14 09:00:14 crc kubenswrapper[4886]: I0314 09:00:14.070819 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-ctl2l"] Mar 14 09:00:14 crc kubenswrapper[4886]: I0314 09:00:14.078886 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-8wtzc"] Mar 14 09:00:15 crc kubenswrapper[4886]: I0314 09:00:15.430357 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27d82e0e-40f0-45e0-b92f-df553a24bc5b" path="/var/lib/kubelet/pods/27d82e0e-40f0-45e0-b92f-df553a24bc5b/volumes" Mar 14 09:00:15 crc kubenswrapper[4886]: I0314 09:00:15.431287 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29f258e4-6012-4807-95a6-cce9ee5af3d8" path="/var/lib/kubelet/pods/29f258e4-6012-4807-95a6-cce9ee5af3d8/volumes" Mar 14 09:00:15 crc kubenswrapper[4886]: I0314 09:00:15.431979 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e521aeb3-adb2-4042-ad11-33d749d5506b" path="/var/lib/kubelet/pods/e521aeb3-adb2-4042-ad11-33d749d5506b/volumes" Mar 14 09:00:30 crc kubenswrapper[4886]: I0314 09:00:30.039726 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-6frh2"] Mar 14 09:00:30 crc kubenswrapper[4886]: I0314 09:00:30.048013 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-6frh2"] Mar 14 09:00:31 crc kubenswrapper[4886]: I0314 09:00:31.437939 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14718224-eaad-4caf-b13b-a60a9c2a9460" path="/var/lib/kubelet/pods/14718224-eaad-4caf-b13b-a60a9c2a9460/volumes" Mar 14 09:00:33 crc kubenswrapper[4886]: I0314 09:00:33.333727 4886 scope.go:117] "RemoveContainer" containerID="8e54e288669ae4cdf21748f2471236e2b5b8a8da57d9ac2835880115c4de79e7" Mar 14 09:00:33 crc kubenswrapper[4886]: I0314 09:00:33.373274 4886 scope.go:117] "RemoveContainer" 
containerID="91110068c4fd7023df65e5b9e10f4ea94405c81cb5051853db689bc531c52836" Mar 14 09:00:33 crc kubenswrapper[4886]: I0314 09:00:33.435762 4886 scope.go:117] "RemoveContainer" containerID="5bbd87714301a05e3600057a773994a2e440bb56d6be08fed15027ac487ffe88" Mar 14 09:00:33 crc kubenswrapper[4886]: I0314 09:00:33.472183 4886 scope.go:117] "RemoveContainer" containerID="954f68258749ef53aae9d9c9abb0b484f61d0fb7c2d1497060257441212f0217" Mar 14 09:00:33 crc kubenswrapper[4886]: I0314 09:00:33.547931 4886 scope.go:117] "RemoveContainer" containerID="10d580258be851ff0edd037a92b42e6ce275f0593a0f37fac83cb417b50e058f" Mar 14 09:00:33 crc kubenswrapper[4886]: I0314 09:00:33.578086 4886 scope.go:117] "RemoveContainer" containerID="f09c8a9767fae0dfb3082349002eaeac96b929963e09bcf10f2f0d8effd2be84" Mar 14 09:01:00 crc kubenswrapper[4886]: I0314 09:01:00.150730 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29557981-w64rr"] Mar 14 09:01:00 crc kubenswrapper[4886]: E0314 09:01:00.151736 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c9a265-82e9-4726-bda9-f6c6111dc1dc" containerName="collect-profiles" Mar 14 09:01:00 crc kubenswrapper[4886]: I0314 09:01:00.151751 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c9a265-82e9-4726-bda9-f6c6111dc1dc" containerName="collect-profiles" Mar 14 09:01:00 crc kubenswrapper[4886]: E0314 09:01:00.151767 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde3a6aa-b617-4daf-b088-de68d5ec9c23" containerName="extract-content" Mar 14 09:01:00 crc kubenswrapper[4886]: I0314 09:01:00.151774 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde3a6aa-b617-4daf-b088-de68d5ec9c23" containerName="extract-content" Mar 14 09:01:00 crc kubenswrapper[4886]: E0314 09:01:00.151794 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde3a6aa-b617-4daf-b088-de68d5ec9c23" containerName="extract-utilities" Mar 14 09:01:00 crc kubenswrapper[4886]: 
I0314 09:01:00.151804 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde3a6aa-b617-4daf-b088-de68d5ec9c23" containerName="extract-utilities" Mar 14 09:01:00 crc kubenswrapper[4886]: E0314 09:01:00.151834 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde3a6aa-b617-4daf-b088-de68d5ec9c23" containerName="registry-server" Mar 14 09:01:00 crc kubenswrapper[4886]: I0314 09:01:00.151842 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde3a6aa-b617-4daf-b088-de68d5ec9c23" containerName="registry-server" Mar 14 09:01:00 crc kubenswrapper[4886]: E0314 09:01:00.151856 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e154ca81-98b6-4a74-b011-41b224074571" containerName="oc" Mar 14 09:01:00 crc kubenswrapper[4886]: I0314 09:01:00.151863 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e154ca81-98b6-4a74-b011-41b224074571" containerName="oc" Mar 14 09:01:00 crc kubenswrapper[4886]: I0314 09:01:00.152085 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="bde3a6aa-b617-4daf-b088-de68d5ec9c23" containerName="registry-server" Mar 14 09:01:00 crc kubenswrapper[4886]: I0314 09:01:00.152101 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="e154ca81-98b6-4a74-b011-41b224074571" containerName="oc" Mar 14 09:01:00 crc kubenswrapper[4886]: I0314 09:01:00.152143 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3c9a265-82e9-4726-bda9-f6c6111dc1dc" containerName="collect-profiles" Mar 14 09:01:00 crc kubenswrapper[4886]: I0314 09:01:00.152919 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29557981-w64rr" Mar 14 09:01:00 crc kubenswrapper[4886]: I0314 09:01:00.171965 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29557981-w64rr"] Mar 14 09:01:00 crc kubenswrapper[4886]: I0314 09:01:00.296072 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/942975ae-1af3-4e59-b73f-7cd6246a5f7e-fernet-keys\") pod \"keystone-cron-29557981-w64rr\" (UID: \"942975ae-1af3-4e59-b73f-7cd6246a5f7e\") " pod="openstack/keystone-cron-29557981-w64rr" Mar 14 09:01:00 crc kubenswrapper[4886]: I0314 09:01:00.296210 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq6wp\" (UniqueName: \"kubernetes.io/projected/942975ae-1af3-4e59-b73f-7cd6246a5f7e-kube-api-access-sq6wp\") pod \"keystone-cron-29557981-w64rr\" (UID: \"942975ae-1af3-4e59-b73f-7cd6246a5f7e\") " pod="openstack/keystone-cron-29557981-w64rr" Mar 14 09:01:00 crc kubenswrapper[4886]: I0314 09:01:00.296397 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/942975ae-1af3-4e59-b73f-7cd6246a5f7e-config-data\") pod \"keystone-cron-29557981-w64rr\" (UID: \"942975ae-1af3-4e59-b73f-7cd6246a5f7e\") " pod="openstack/keystone-cron-29557981-w64rr" Mar 14 09:01:00 crc kubenswrapper[4886]: I0314 09:01:00.296666 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/942975ae-1af3-4e59-b73f-7cd6246a5f7e-combined-ca-bundle\") pod \"keystone-cron-29557981-w64rr\" (UID: \"942975ae-1af3-4e59-b73f-7cd6246a5f7e\") " pod="openstack/keystone-cron-29557981-w64rr" Mar 14 09:01:00 crc kubenswrapper[4886]: I0314 09:01:00.399056 4886 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/942975ae-1af3-4e59-b73f-7cd6246a5f7e-fernet-keys\") pod \"keystone-cron-29557981-w64rr\" (UID: \"942975ae-1af3-4e59-b73f-7cd6246a5f7e\") " pod="openstack/keystone-cron-29557981-w64rr" Mar 14 09:01:00 crc kubenswrapper[4886]: I0314 09:01:00.399181 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq6wp\" (UniqueName: \"kubernetes.io/projected/942975ae-1af3-4e59-b73f-7cd6246a5f7e-kube-api-access-sq6wp\") pod \"keystone-cron-29557981-w64rr\" (UID: \"942975ae-1af3-4e59-b73f-7cd6246a5f7e\") " pod="openstack/keystone-cron-29557981-w64rr" Mar 14 09:01:00 crc kubenswrapper[4886]: I0314 09:01:00.399357 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/942975ae-1af3-4e59-b73f-7cd6246a5f7e-config-data\") pod \"keystone-cron-29557981-w64rr\" (UID: \"942975ae-1af3-4e59-b73f-7cd6246a5f7e\") " pod="openstack/keystone-cron-29557981-w64rr" Mar 14 09:01:00 crc kubenswrapper[4886]: I0314 09:01:00.399472 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/942975ae-1af3-4e59-b73f-7cd6246a5f7e-combined-ca-bundle\") pod \"keystone-cron-29557981-w64rr\" (UID: \"942975ae-1af3-4e59-b73f-7cd6246a5f7e\") " pod="openstack/keystone-cron-29557981-w64rr" Mar 14 09:01:00 crc kubenswrapper[4886]: I0314 09:01:00.406351 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/942975ae-1af3-4e59-b73f-7cd6246a5f7e-fernet-keys\") pod \"keystone-cron-29557981-w64rr\" (UID: \"942975ae-1af3-4e59-b73f-7cd6246a5f7e\") " pod="openstack/keystone-cron-29557981-w64rr" Mar 14 09:01:00 crc kubenswrapper[4886]: I0314 09:01:00.407296 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/942975ae-1af3-4e59-b73f-7cd6246a5f7e-config-data\") pod \"keystone-cron-29557981-w64rr\" (UID: \"942975ae-1af3-4e59-b73f-7cd6246a5f7e\") " pod="openstack/keystone-cron-29557981-w64rr" Mar 14 09:01:00 crc kubenswrapper[4886]: I0314 09:01:00.419036 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/942975ae-1af3-4e59-b73f-7cd6246a5f7e-combined-ca-bundle\") pod \"keystone-cron-29557981-w64rr\" (UID: \"942975ae-1af3-4e59-b73f-7cd6246a5f7e\") " pod="openstack/keystone-cron-29557981-w64rr" Mar 14 09:01:00 crc kubenswrapper[4886]: I0314 09:01:00.419607 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq6wp\" (UniqueName: \"kubernetes.io/projected/942975ae-1af3-4e59-b73f-7cd6246a5f7e-kube-api-access-sq6wp\") pod \"keystone-cron-29557981-w64rr\" (UID: \"942975ae-1af3-4e59-b73f-7cd6246a5f7e\") " pod="openstack/keystone-cron-29557981-w64rr" Mar 14 09:01:00 crc kubenswrapper[4886]: I0314 09:01:00.477480 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29557981-w64rr" Mar 14 09:01:00 crc kubenswrapper[4886]: I0314 09:01:00.973767 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29557981-w64rr"] Mar 14 09:01:01 crc kubenswrapper[4886]: I0314 09:01:01.560619 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29557981-w64rr" event={"ID":"942975ae-1af3-4e59-b73f-7cd6246a5f7e","Type":"ContainerStarted","Data":"62b997d13741c8cbde8fbd8cd0cab10d9c966d3025d8b972e433f476413ac9e1"} Mar 14 09:01:01 crc kubenswrapper[4886]: I0314 09:01:01.560862 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29557981-w64rr" event={"ID":"942975ae-1af3-4e59-b73f-7cd6246a5f7e","Type":"ContainerStarted","Data":"5d587ec82d79d457f8e7a1d7d011f77cba4f567379172afd75da8ab97abca7c6"} Mar 14 09:01:01 crc kubenswrapper[4886]: I0314 09:01:01.581052 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29557981-w64rr" podStartSLOduration=1.581029867 podStartE2EDuration="1.581029867s" podCreationTimestamp="2026-03-14 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:01:01.575667084 +0000 UTC m=+1996.824118731" watchObservedRunningTime="2026-03-14 09:01:01.581029867 +0000 UTC m=+1996.829481504" Mar 14 09:01:03 crc kubenswrapper[4886]: I0314 09:01:03.580692 4886 generic.go:334] "Generic (PLEG): container finished" podID="942975ae-1af3-4e59-b73f-7cd6246a5f7e" containerID="62b997d13741c8cbde8fbd8cd0cab10d9c966d3025d8b972e433f476413ac9e1" exitCode=0 Mar 14 09:01:03 crc kubenswrapper[4886]: I0314 09:01:03.580844 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29557981-w64rr" 
event={"ID":"942975ae-1af3-4e59-b73f-7cd6246a5f7e","Type":"ContainerDied","Data":"62b997d13741c8cbde8fbd8cd0cab10d9c966d3025d8b972e433f476413ac9e1"} Mar 14 09:01:04 crc kubenswrapper[4886]: I0314 09:01:04.894654 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29557981-w64rr" Mar 14 09:01:04 crc kubenswrapper[4886]: I0314 09:01:04.989885 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sq6wp\" (UniqueName: \"kubernetes.io/projected/942975ae-1af3-4e59-b73f-7cd6246a5f7e-kube-api-access-sq6wp\") pod \"942975ae-1af3-4e59-b73f-7cd6246a5f7e\" (UID: \"942975ae-1af3-4e59-b73f-7cd6246a5f7e\") " Mar 14 09:01:04 crc kubenswrapper[4886]: I0314 09:01:04.990114 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/942975ae-1af3-4e59-b73f-7cd6246a5f7e-config-data\") pod \"942975ae-1af3-4e59-b73f-7cd6246a5f7e\" (UID: \"942975ae-1af3-4e59-b73f-7cd6246a5f7e\") " Mar 14 09:01:04 crc kubenswrapper[4886]: I0314 09:01:04.990242 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/942975ae-1af3-4e59-b73f-7cd6246a5f7e-combined-ca-bundle\") pod \"942975ae-1af3-4e59-b73f-7cd6246a5f7e\" (UID: \"942975ae-1af3-4e59-b73f-7cd6246a5f7e\") " Mar 14 09:01:04 crc kubenswrapper[4886]: I0314 09:01:04.990342 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/942975ae-1af3-4e59-b73f-7cd6246a5f7e-fernet-keys\") pod \"942975ae-1af3-4e59-b73f-7cd6246a5f7e\" (UID: \"942975ae-1af3-4e59-b73f-7cd6246a5f7e\") " Mar 14 09:01:04 crc kubenswrapper[4886]: I0314 09:01:04.999204 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/942975ae-1af3-4e59-b73f-7cd6246a5f7e-kube-api-access-sq6wp" 
(OuterVolumeSpecName: "kube-api-access-sq6wp") pod "942975ae-1af3-4e59-b73f-7cd6246a5f7e" (UID: "942975ae-1af3-4e59-b73f-7cd6246a5f7e"). InnerVolumeSpecName "kube-api-access-sq6wp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:01:05 crc kubenswrapper[4886]: I0314 09:01:05.000608 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/942975ae-1af3-4e59-b73f-7cd6246a5f7e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "942975ae-1af3-4e59-b73f-7cd6246a5f7e" (UID: "942975ae-1af3-4e59-b73f-7cd6246a5f7e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:01:05 crc kubenswrapper[4886]: I0314 09:01:05.032579 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/942975ae-1af3-4e59-b73f-7cd6246a5f7e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "942975ae-1af3-4e59-b73f-7cd6246a5f7e" (UID: "942975ae-1af3-4e59-b73f-7cd6246a5f7e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:01:05 crc kubenswrapper[4886]: I0314 09:01:05.055018 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/942975ae-1af3-4e59-b73f-7cd6246a5f7e-config-data" (OuterVolumeSpecName: "config-data") pod "942975ae-1af3-4e59-b73f-7cd6246a5f7e" (UID: "942975ae-1af3-4e59-b73f-7cd6246a5f7e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:01:05 crc kubenswrapper[4886]: I0314 09:01:05.092516 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/942975ae-1af3-4e59-b73f-7cd6246a5f7e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:05 crc kubenswrapper[4886]: I0314 09:01:05.092554 4886 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/942975ae-1af3-4e59-b73f-7cd6246a5f7e-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:05 crc kubenswrapper[4886]: I0314 09:01:05.092567 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sq6wp\" (UniqueName: \"kubernetes.io/projected/942975ae-1af3-4e59-b73f-7cd6246a5f7e-kube-api-access-sq6wp\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:05 crc kubenswrapper[4886]: I0314 09:01:05.092578 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/942975ae-1af3-4e59-b73f-7cd6246a5f7e-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:05 crc kubenswrapper[4886]: I0314 09:01:05.604449 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29557981-w64rr" event={"ID":"942975ae-1af3-4e59-b73f-7cd6246a5f7e","Type":"ContainerDied","Data":"5d587ec82d79d457f8e7a1d7d011f77cba4f567379172afd75da8ab97abca7c6"} Mar 14 09:01:05 crc kubenswrapper[4886]: I0314 09:01:05.604490 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d587ec82d79d457f8e7a1d7d011f77cba4f567379172afd75da8ab97abca7c6" Mar 14 09:01:05 crc kubenswrapper[4886]: I0314 09:01:05.604548 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29557981-w64rr" Mar 14 09:01:06 crc kubenswrapper[4886]: I0314 09:01:06.044568 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-xwt5m"] Mar 14 09:01:06 crc kubenswrapper[4886]: I0314 09:01:06.057854 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-w8c9q"] Mar 14 09:01:06 crc kubenswrapper[4886]: I0314 09:01:06.066251 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-8303-account-create-update-w2kk2"] Mar 14 09:01:06 crc kubenswrapper[4886]: I0314 09:01:06.076880 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-dwk2t"] Mar 14 09:01:06 crc kubenswrapper[4886]: I0314 09:01:06.084796 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-xwt5m"] Mar 14 09:01:06 crc kubenswrapper[4886]: I0314 09:01:06.110371 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-w8c9q"] Mar 14 09:01:06 crc kubenswrapper[4886]: I0314 09:01:06.123281 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-8303-account-create-update-w2kk2"] Mar 14 09:01:06 crc kubenswrapper[4886]: I0314 09:01:06.135181 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-dwk2t"] Mar 14 09:01:07 crc kubenswrapper[4886]: I0314 09:01:07.030939 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-4e72-account-create-update-tpngn"] Mar 14 09:01:07 crc kubenswrapper[4886]: I0314 09:01:07.040585 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-113e-account-create-update-mjr4j"] Mar 14 09:01:07 crc kubenswrapper[4886]: I0314 09:01:07.049916 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-4e72-account-create-update-tpngn"] Mar 14 09:01:07 crc kubenswrapper[4886]: I0314 09:01:07.060024 4886 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-113e-account-create-update-mjr4j"] Mar 14 09:01:07 crc kubenswrapper[4886]: I0314 09:01:07.432763 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0858eab0-1359-49c7-89eb-fe94498572dc" path="/var/lib/kubelet/pods/0858eab0-1359-49c7-89eb-fe94498572dc/volumes" Mar 14 09:01:07 crc kubenswrapper[4886]: I0314 09:01:07.433594 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="639641ff-f838-43ec-bc7a-c8313a5dc254" path="/var/lib/kubelet/pods/639641ff-f838-43ec-bc7a-c8313a5dc254/volumes" Mar 14 09:01:07 crc kubenswrapper[4886]: I0314 09:01:07.434141 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="874c5fb0-7bf3-46a1-9be1-71bc8b49cb38" path="/var/lib/kubelet/pods/874c5fb0-7bf3-46a1-9be1-71bc8b49cb38/volumes" Mar 14 09:01:07 crc kubenswrapper[4886]: I0314 09:01:07.434679 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7dae5e3-c486-4f68-bc83-54cda54cd52b" path="/var/lib/kubelet/pods/c7dae5e3-c486-4f68-bc83-54cda54cd52b/volumes" Mar 14 09:01:07 crc kubenswrapper[4886]: I0314 09:01:07.435723 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f21173e4-e8f0-46b1-8c84-0453259409aa" path="/var/lib/kubelet/pods/f21173e4-e8f0-46b1-8c84-0453259409aa/volumes" Mar 14 09:01:07 crc kubenswrapper[4886]: I0314 09:01:07.436232 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdaed1d0-b39b-4d77-8b14-4b9cb1de478a" path="/var/lib/kubelet/pods/fdaed1d0-b39b-4d77-8b14-4b9cb1de478a/volumes" Mar 14 09:01:17 crc kubenswrapper[4886]: I0314 09:01:17.721074 4886 generic.go:334] "Generic (PLEG): container finished" podID="e26f4f17-b548-45cf-8781-058e7b1787d0" containerID="f2e56cd0b8a2a4e4f17441d13c7f57b200a34f0877049ee4b55d77b3ec8dce75" exitCode=0 Mar 14 09:01:17 crc kubenswrapper[4886]: I0314 09:01:17.721151 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4" event={"ID":"e26f4f17-b548-45cf-8781-058e7b1787d0","Type":"ContainerDied","Data":"f2e56cd0b8a2a4e4f17441d13c7f57b200a34f0877049ee4b55d77b3ec8dce75"} Mar 14 09:01:19 crc kubenswrapper[4886]: I0314 09:01:19.128793 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4" Mar 14 09:01:19 crc kubenswrapper[4886]: I0314 09:01:19.287293 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5srwc\" (UniqueName: \"kubernetes.io/projected/e26f4f17-b548-45cf-8781-058e7b1787d0-kube-api-access-5srwc\") pod \"e26f4f17-b548-45cf-8781-058e7b1787d0\" (UID: \"e26f4f17-b548-45cf-8781-058e7b1787d0\") " Mar 14 09:01:19 crc kubenswrapper[4886]: I0314 09:01:19.287474 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e26f4f17-b548-45cf-8781-058e7b1787d0-ssh-key-openstack-edpm-ipam\") pod \"e26f4f17-b548-45cf-8781-058e7b1787d0\" (UID: \"e26f4f17-b548-45cf-8781-058e7b1787d0\") " Mar 14 09:01:19 crc kubenswrapper[4886]: I0314 09:01:19.288286 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e26f4f17-b548-45cf-8781-058e7b1787d0-inventory\") pod \"e26f4f17-b548-45cf-8781-058e7b1787d0\" (UID: \"e26f4f17-b548-45cf-8781-058e7b1787d0\") " Mar 14 09:01:19 crc kubenswrapper[4886]: I0314 09:01:19.292943 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e26f4f17-b548-45cf-8781-058e7b1787d0-kube-api-access-5srwc" (OuterVolumeSpecName: "kube-api-access-5srwc") pod "e26f4f17-b548-45cf-8781-058e7b1787d0" (UID: "e26f4f17-b548-45cf-8781-058e7b1787d0"). InnerVolumeSpecName "kube-api-access-5srwc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:01:19 crc kubenswrapper[4886]: E0314 09:01:19.313509 4886 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e26f4f17-b548-45cf-8781-058e7b1787d0-inventory podName:e26f4f17-b548-45cf-8781-058e7b1787d0 nodeName:}" failed. No retries permitted until 2026-03-14 09:01:19.813472891 +0000 UTC m=+2015.061924518 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/e26f4f17-b548-45cf-8781-058e7b1787d0-inventory") pod "e26f4f17-b548-45cf-8781-058e7b1787d0" (UID: "e26f4f17-b548-45cf-8781-058e7b1787d0") : error deleting /var/lib/kubelet/pods/e26f4f17-b548-45cf-8781-058e7b1787d0/volume-subpaths: remove /var/lib/kubelet/pods/e26f4f17-b548-45cf-8781-058e7b1787d0/volume-subpaths: no such file or directory Mar 14 09:01:19 crc kubenswrapper[4886]: I0314 09:01:19.316342 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e26f4f17-b548-45cf-8781-058e7b1787d0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e26f4f17-b548-45cf-8781-058e7b1787d0" (UID: "e26f4f17-b548-45cf-8781-058e7b1787d0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:01:19 crc kubenswrapper[4886]: I0314 09:01:19.390412 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5srwc\" (UniqueName: \"kubernetes.io/projected/e26f4f17-b548-45cf-8781-058e7b1787d0-kube-api-access-5srwc\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:19 crc kubenswrapper[4886]: I0314 09:01:19.390454 4886 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e26f4f17-b548-45cf-8781-058e7b1787d0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:19 crc kubenswrapper[4886]: I0314 09:01:19.751048 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4" event={"ID":"e26f4f17-b548-45cf-8781-058e7b1787d0","Type":"ContainerDied","Data":"a2a0587301a70b68877ba59a7f8cd424ebe553ffaaf9497ed8b809f61cb5dee5"} Mar 14 09:01:19 crc kubenswrapper[4886]: I0314 09:01:19.751334 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2a0587301a70b68877ba59a7f8cd424ebe553ffaaf9497ed8b809f61cb5dee5" Mar 14 09:01:19 crc kubenswrapper[4886]: I0314 09:01:19.751107 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4" Mar 14 09:01:19 crc kubenswrapper[4886]: I0314 09:01:19.821610 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc"] Mar 14 09:01:19 crc kubenswrapper[4886]: E0314 09:01:19.822084 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="942975ae-1af3-4e59-b73f-7cd6246a5f7e" containerName="keystone-cron" Mar 14 09:01:19 crc kubenswrapper[4886]: I0314 09:01:19.822101 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="942975ae-1af3-4e59-b73f-7cd6246a5f7e" containerName="keystone-cron" Mar 14 09:01:19 crc kubenswrapper[4886]: E0314 09:01:19.822149 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e26f4f17-b548-45cf-8781-058e7b1787d0" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 14 09:01:19 crc kubenswrapper[4886]: I0314 09:01:19.822156 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e26f4f17-b548-45cf-8781-058e7b1787d0" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 14 09:01:19 crc kubenswrapper[4886]: I0314 09:01:19.822339 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="e26f4f17-b548-45cf-8781-058e7b1787d0" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 14 09:01:19 crc kubenswrapper[4886]: I0314 09:01:19.822359 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="942975ae-1af3-4e59-b73f-7cd6246a5f7e" containerName="keystone-cron" Mar 14 09:01:19 crc kubenswrapper[4886]: I0314 09:01:19.823073 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc" Mar 14 09:01:19 crc kubenswrapper[4886]: I0314 09:01:19.835306 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc"] Mar 14 09:01:19 crc kubenswrapper[4886]: I0314 09:01:19.898766 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e26f4f17-b548-45cf-8781-058e7b1787d0-inventory\") pod \"e26f4f17-b548-45cf-8781-058e7b1787d0\" (UID: \"e26f4f17-b548-45cf-8781-058e7b1787d0\") " Mar 14 09:01:19 crc kubenswrapper[4886]: I0314 09:01:19.899757 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/813f04db-4c33-4db3-a81a-a5617d8d460f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc\" (UID: \"813f04db-4c33-4db3-a81a-a5617d8d460f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc" Mar 14 09:01:19 crc kubenswrapper[4886]: I0314 09:01:19.899919 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/813f04db-4c33-4db3-a81a-a5617d8d460f-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc\" (UID: \"813f04db-4c33-4db3-a81a-a5617d8d460f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc" Mar 14 09:01:19 crc kubenswrapper[4886]: I0314 09:01:19.900110 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llrss\" (UniqueName: \"kubernetes.io/projected/813f04db-4c33-4db3-a81a-a5617d8d460f-kube-api-access-llrss\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc\" (UID: \"813f04db-4c33-4db3-a81a-a5617d8d460f\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc" Mar 14 09:01:19 crc kubenswrapper[4886]: I0314 09:01:19.908709 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e26f4f17-b548-45cf-8781-058e7b1787d0-inventory" (OuterVolumeSpecName: "inventory") pod "e26f4f17-b548-45cf-8781-058e7b1787d0" (UID: "e26f4f17-b548-45cf-8781-058e7b1787d0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:01:20 crc kubenswrapper[4886]: I0314 09:01:20.002015 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/813f04db-4c33-4db3-a81a-a5617d8d460f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc\" (UID: \"813f04db-4c33-4db3-a81a-a5617d8d460f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc" Mar 14 09:01:20 crc kubenswrapper[4886]: I0314 09:01:20.002092 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/813f04db-4c33-4db3-a81a-a5617d8d460f-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc\" (UID: \"813f04db-4c33-4db3-a81a-a5617d8d460f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc" Mar 14 09:01:20 crc kubenswrapper[4886]: I0314 09:01:20.002194 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llrss\" (UniqueName: \"kubernetes.io/projected/813f04db-4c33-4db3-a81a-a5617d8d460f-kube-api-access-llrss\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc\" (UID: \"813f04db-4c33-4db3-a81a-a5617d8d460f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc" Mar 14 09:01:20 crc kubenswrapper[4886]: I0314 09:01:20.002253 4886 reconciler_common.go:293] "Volume detached for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/e26f4f17-b548-45cf-8781-058e7b1787d0-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:20 crc kubenswrapper[4886]: I0314 09:01:20.005489 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/813f04db-4c33-4db3-a81a-a5617d8d460f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc\" (UID: \"813f04db-4c33-4db3-a81a-a5617d8d460f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc" Mar 14 09:01:20 crc kubenswrapper[4886]: I0314 09:01:20.006250 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/813f04db-4c33-4db3-a81a-a5617d8d460f-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc\" (UID: \"813f04db-4c33-4db3-a81a-a5617d8d460f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc" Mar 14 09:01:20 crc kubenswrapper[4886]: I0314 09:01:20.018063 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llrss\" (UniqueName: \"kubernetes.io/projected/813f04db-4c33-4db3-a81a-a5617d8d460f-kube-api-access-llrss\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc\" (UID: \"813f04db-4c33-4db3-a81a-a5617d8d460f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc" Mar 14 09:01:20 crc kubenswrapper[4886]: I0314 09:01:20.139399 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc" Mar 14 09:01:20 crc kubenswrapper[4886]: I0314 09:01:20.625686 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc"] Mar 14 09:01:20 crc kubenswrapper[4886]: I0314 09:01:20.760295 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc" event={"ID":"813f04db-4c33-4db3-a81a-a5617d8d460f","Type":"ContainerStarted","Data":"9fdaadd6eeb654c114cbf9a122bce03a635903f651878004499ed7d188e77fe2"} Mar 14 09:01:21 crc kubenswrapper[4886]: I0314 09:01:21.770052 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc" event={"ID":"813f04db-4c33-4db3-a81a-a5617d8d460f","Type":"ContainerStarted","Data":"ec75a73fd887465e8e59f20a5ec832fdc3bd8f0524df7f43455aaab95fbf8a35"} Mar 14 09:01:21 crc kubenswrapper[4886]: I0314 09:01:21.796438 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc" podStartSLOduration=2.407622548 podStartE2EDuration="2.796416692s" podCreationTimestamp="2026-03-14 09:01:19 +0000 UTC" firstStartedPulling="2026-03-14 09:01:20.628382131 +0000 UTC m=+2015.876833758" lastFinishedPulling="2026-03-14 09:01:21.017176265 +0000 UTC m=+2016.265627902" observedRunningTime="2026-03-14 09:01:21.789966599 +0000 UTC m=+2017.038418256" watchObservedRunningTime="2026-03-14 09:01:21.796416692 +0000 UTC m=+2017.044868319" Mar 14 09:01:26 crc kubenswrapper[4886]: I0314 09:01:26.066432 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 
14 09:01:26 crc kubenswrapper[4886]: I0314 09:01:26.067096 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:01:26 crc kubenswrapper[4886]: I0314 09:01:26.820014 4886 generic.go:334] "Generic (PLEG): container finished" podID="813f04db-4c33-4db3-a81a-a5617d8d460f" containerID="ec75a73fd887465e8e59f20a5ec832fdc3bd8f0524df7f43455aaab95fbf8a35" exitCode=0 Mar 14 09:01:26 crc kubenswrapper[4886]: I0314 09:01:26.820068 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc" event={"ID":"813f04db-4c33-4db3-a81a-a5617d8d460f","Type":"ContainerDied","Data":"ec75a73fd887465e8e59f20a5ec832fdc3bd8f0524df7f43455aaab95fbf8a35"} Mar 14 09:01:28 crc kubenswrapper[4886]: I0314 09:01:28.323393 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc" Mar 14 09:01:28 crc kubenswrapper[4886]: I0314 09:01:28.389303 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/813f04db-4c33-4db3-a81a-a5617d8d460f-inventory\") pod \"813f04db-4c33-4db3-a81a-a5617d8d460f\" (UID: \"813f04db-4c33-4db3-a81a-a5617d8d460f\") " Mar 14 09:01:28 crc kubenswrapper[4886]: I0314 09:01:28.428480 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/813f04db-4c33-4db3-a81a-a5617d8d460f-inventory" (OuterVolumeSpecName: "inventory") pod "813f04db-4c33-4db3-a81a-a5617d8d460f" (UID: "813f04db-4c33-4db3-a81a-a5617d8d460f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:01:28 crc kubenswrapper[4886]: I0314 09:01:28.495220 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/813f04db-4c33-4db3-a81a-a5617d8d460f-ssh-key-openstack-edpm-ipam\") pod \"813f04db-4c33-4db3-a81a-a5617d8d460f\" (UID: \"813f04db-4c33-4db3-a81a-a5617d8d460f\") " Mar 14 09:01:28 crc kubenswrapper[4886]: I0314 09:01:28.495435 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llrss\" (UniqueName: \"kubernetes.io/projected/813f04db-4c33-4db3-a81a-a5617d8d460f-kube-api-access-llrss\") pod \"813f04db-4c33-4db3-a81a-a5617d8d460f\" (UID: \"813f04db-4c33-4db3-a81a-a5617d8d460f\") " Mar 14 09:01:28 crc kubenswrapper[4886]: I0314 09:01:28.495952 4886 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/813f04db-4c33-4db3-a81a-a5617d8d460f-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:28 crc kubenswrapper[4886]: I0314 09:01:28.500433 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/813f04db-4c33-4db3-a81a-a5617d8d460f-kube-api-access-llrss" (OuterVolumeSpecName: "kube-api-access-llrss") pod "813f04db-4c33-4db3-a81a-a5617d8d460f" (UID: "813f04db-4c33-4db3-a81a-a5617d8d460f"). InnerVolumeSpecName "kube-api-access-llrss". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:01:28 crc kubenswrapper[4886]: I0314 09:01:28.525306 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/813f04db-4c33-4db3-a81a-a5617d8d460f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "813f04db-4c33-4db3-a81a-a5617d8d460f" (UID: "813f04db-4c33-4db3-a81a-a5617d8d460f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:01:28 crc kubenswrapper[4886]: I0314 09:01:28.597662 4886 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/813f04db-4c33-4db3-a81a-a5617d8d460f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:28 crc kubenswrapper[4886]: I0314 09:01:28.597704 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llrss\" (UniqueName: \"kubernetes.io/projected/813f04db-4c33-4db3-a81a-a5617d8d460f-kube-api-access-llrss\") on node \"crc\" DevicePath \"\"" Mar 14 09:01:28 crc kubenswrapper[4886]: I0314 09:01:28.847208 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc" event={"ID":"813f04db-4c33-4db3-a81a-a5617d8d460f","Type":"ContainerDied","Data":"9fdaadd6eeb654c114cbf9a122bce03a635903f651878004499ed7d188e77fe2"} Mar 14 09:01:28 crc kubenswrapper[4886]: I0314 09:01:28.847714 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fdaadd6eeb654c114cbf9a122bce03a635903f651878004499ed7d188e77fe2" Mar 14 09:01:28 crc kubenswrapper[4886]: I0314 09:01:28.847362 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc" Mar 14 09:01:28 crc kubenswrapper[4886]: I0314 09:01:28.926222 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rv55x"] Mar 14 09:01:28 crc kubenswrapper[4886]: E0314 09:01:28.927010 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="813f04db-4c33-4db3-a81a-a5617d8d460f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 14 09:01:28 crc kubenswrapper[4886]: I0314 09:01:28.927046 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="813f04db-4c33-4db3-a81a-a5617d8d460f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 14 09:01:28 crc kubenswrapper[4886]: I0314 09:01:28.927343 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="813f04db-4c33-4db3-a81a-a5617d8d460f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 14 09:01:28 crc kubenswrapper[4886]: I0314 09:01:28.928584 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rv55x" Mar 14 09:01:28 crc kubenswrapper[4886]: I0314 09:01:28.932484 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rv55x"] Mar 14 09:01:28 crc kubenswrapper[4886]: I0314 09:01:28.933455 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 09:01:28 crc kubenswrapper[4886]: I0314 09:01:28.933619 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 09:01:28 crc kubenswrapper[4886]: I0314 09:01:28.933469 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 09:01:28 crc kubenswrapper[4886]: I0314 09:01:28.933690 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftkvj" Mar 14 09:01:29 crc kubenswrapper[4886]: I0314 09:01:29.110692 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmmxj\" (UniqueName: \"kubernetes.io/projected/58ff1f53-8347-4a4f-9892-a8ba1d8822af-kube-api-access-rmmxj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rv55x\" (UID: \"58ff1f53-8347-4a4f-9892-a8ba1d8822af\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rv55x" Mar 14 09:01:29 crc kubenswrapper[4886]: I0314 09:01:29.110787 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58ff1f53-8347-4a4f-9892-a8ba1d8822af-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rv55x\" (UID: \"58ff1f53-8347-4a4f-9892-a8ba1d8822af\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rv55x" Mar 14 09:01:29 crc kubenswrapper[4886]: I0314 09:01:29.112238 4886 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58ff1f53-8347-4a4f-9892-a8ba1d8822af-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rv55x\" (UID: \"58ff1f53-8347-4a4f-9892-a8ba1d8822af\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rv55x" Mar 14 09:01:29 crc kubenswrapper[4886]: I0314 09:01:29.214949 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmmxj\" (UniqueName: \"kubernetes.io/projected/58ff1f53-8347-4a4f-9892-a8ba1d8822af-kube-api-access-rmmxj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rv55x\" (UID: \"58ff1f53-8347-4a4f-9892-a8ba1d8822af\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rv55x" Mar 14 09:01:29 crc kubenswrapper[4886]: I0314 09:01:29.215085 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58ff1f53-8347-4a4f-9892-a8ba1d8822af-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rv55x\" (UID: \"58ff1f53-8347-4a4f-9892-a8ba1d8822af\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rv55x" Mar 14 09:01:29 crc kubenswrapper[4886]: I0314 09:01:29.215118 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58ff1f53-8347-4a4f-9892-a8ba1d8822af-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rv55x\" (UID: \"58ff1f53-8347-4a4f-9892-a8ba1d8822af\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rv55x" Mar 14 09:01:29 crc kubenswrapper[4886]: I0314 09:01:29.220645 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/58ff1f53-8347-4a4f-9892-a8ba1d8822af-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rv55x\" (UID: \"58ff1f53-8347-4a4f-9892-a8ba1d8822af\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rv55x" Mar 14 09:01:29 crc kubenswrapper[4886]: I0314 09:01:29.221586 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58ff1f53-8347-4a4f-9892-a8ba1d8822af-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rv55x\" (UID: \"58ff1f53-8347-4a4f-9892-a8ba1d8822af\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rv55x" Mar 14 09:01:29 crc kubenswrapper[4886]: I0314 09:01:29.232592 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmmxj\" (UniqueName: \"kubernetes.io/projected/58ff1f53-8347-4a4f-9892-a8ba1d8822af-kube-api-access-rmmxj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rv55x\" (UID: \"58ff1f53-8347-4a4f-9892-a8ba1d8822af\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rv55x" Mar 14 09:01:29 crc kubenswrapper[4886]: I0314 09:01:29.266007 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rv55x" Mar 14 09:01:29 crc kubenswrapper[4886]: I0314 09:01:29.821481 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rv55x"] Mar 14 09:01:29 crc kubenswrapper[4886]: I0314 09:01:29.863516 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rv55x" event={"ID":"58ff1f53-8347-4a4f-9892-a8ba1d8822af","Type":"ContainerStarted","Data":"04b868d7e1dc3612ca90c57f8be23672e3004dfc507d0a67d9869dc6a8fa2412"} Mar 14 09:01:30 crc kubenswrapper[4886]: I0314 09:01:30.876762 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rv55x" event={"ID":"58ff1f53-8347-4a4f-9892-a8ba1d8822af","Type":"ContainerStarted","Data":"ef75db87953f72343153546b20983dc7dc3a202719ebda5b0b493c068bf065c2"} Mar 14 09:01:30 crc kubenswrapper[4886]: I0314 09:01:30.908354 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rv55x" podStartSLOduration=2.449194124 podStartE2EDuration="2.908332265s" podCreationTimestamp="2026-03-14 09:01:28 +0000 UTC" firstStartedPulling="2026-03-14 09:01:29.82227535 +0000 UTC m=+2025.070726987" lastFinishedPulling="2026-03-14 09:01:30.281413501 +0000 UTC m=+2025.529865128" observedRunningTime="2026-03-14 09:01:30.90043522 +0000 UTC m=+2026.148886857" watchObservedRunningTime="2026-03-14 09:01:30.908332265 +0000 UTC m=+2026.156783912" Mar 14 09:01:33 crc kubenswrapper[4886]: I0314 09:01:33.770079 4886 scope.go:117] "RemoveContainer" containerID="b757828f2baf86f4332302026e8b16e0764dbf5f974bf0e717e1a84d0de7ff1d" Mar 14 09:01:33 crc kubenswrapper[4886]: I0314 09:01:33.823394 4886 scope.go:117] "RemoveContainer" containerID="9850ec02f44acd2a8a2e5d333cff22d43a48ee54579e2e2cdfb4a18bb113bd22" Mar 14 09:01:33 crc 
kubenswrapper[4886]: I0314 09:01:33.896819 4886 scope.go:117] "RemoveContainer" containerID="b55b07edfc3c6db8f494ffa43bc412c5c283cecccaf42e59e5c1dfee3f60f580" Mar 14 09:01:33 crc kubenswrapper[4886]: I0314 09:01:33.956717 4886 scope.go:117] "RemoveContainer" containerID="d940b6b18724b28bd807f13bd9359d9b3e231166274fcedb906e68db0afdf259" Mar 14 09:01:34 crc kubenswrapper[4886]: I0314 09:01:34.010499 4886 scope.go:117] "RemoveContainer" containerID="05b71a3caa442f9cde91a7c9fa6129c19bb28eae6c5f2f5f65b9df2b28dbcb00" Mar 14 09:01:34 crc kubenswrapper[4886]: I0314 09:01:34.040432 4886 scope.go:117] "RemoveContainer" containerID="b6c76fd658a8b180e27bf1c63d1b42a0ed94a21db69a0aeaf7712c59dc01a4f6" Mar 14 09:01:40 crc kubenswrapper[4886]: I0314 09:01:40.054511 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6rrxl"] Mar 14 09:01:40 crc kubenswrapper[4886]: I0314 09:01:40.063774 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6rrxl"] Mar 14 09:01:41 crc kubenswrapper[4886]: I0314 09:01:41.436532 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18707ca6-9c35-482a-a186-da58e60b7540" path="/var/lib/kubelet/pods/18707ca6-9c35-482a-a186-da58e60b7540/volumes" Mar 14 09:01:56 crc kubenswrapper[4886]: I0314 09:01:56.066435 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:01:56 crc kubenswrapper[4886]: I0314 09:01:56.067140 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 14 09:01:58 crc kubenswrapper[4886]: I0314 09:01:58.049323 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-pt4tz"] Mar 14 09:01:58 crc kubenswrapper[4886]: I0314 09:01:58.059259 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-pt4tz"] Mar 14 09:01:59 crc kubenswrapper[4886]: I0314 09:01:59.443741 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60e62d5f-645e-4ef7-adb0-bedd550ade7e" path="/var/lib/kubelet/pods/60e62d5f-645e-4ef7-adb0-bedd550ade7e/volumes" Mar 14 09:02:00 crc kubenswrapper[4886]: I0314 09:02:00.165088 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557982-gxdxv"] Mar 14 09:02:00 crc kubenswrapper[4886]: I0314 09:02:00.167851 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557982-gxdxv" Mar 14 09:02:00 crc kubenswrapper[4886]: I0314 09:02:00.177331 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557982-gxdxv"] Mar 14 09:02:00 crc kubenswrapper[4886]: I0314 09:02:00.220498 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:02:00 crc kubenswrapper[4886]: I0314 09:02:00.220559 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:02:00 crc kubenswrapper[4886]: I0314 09:02:00.220590 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 09:02:00 crc kubenswrapper[4886]: I0314 09:02:00.273935 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48qm9\" (UniqueName: \"kubernetes.io/projected/fb9ef94d-9b0c-4f1d-a487-bcfc42336213-kube-api-access-48qm9\") pod 
\"auto-csr-approver-29557982-gxdxv\" (UID: \"fb9ef94d-9b0c-4f1d-a487-bcfc42336213\") " pod="openshift-infra/auto-csr-approver-29557982-gxdxv" Mar 14 09:02:00 crc kubenswrapper[4886]: I0314 09:02:00.377996 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48qm9\" (UniqueName: \"kubernetes.io/projected/fb9ef94d-9b0c-4f1d-a487-bcfc42336213-kube-api-access-48qm9\") pod \"auto-csr-approver-29557982-gxdxv\" (UID: \"fb9ef94d-9b0c-4f1d-a487-bcfc42336213\") " pod="openshift-infra/auto-csr-approver-29557982-gxdxv" Mar 14 09:02:00 crc kubenswrapper[4886]: I0314 09:02:00.408849 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48qm9\" (UniqueName: \"kubernetes.io/projected/fb9ef94d-9b0c-4f1d-a487-bcfc42336213-kube-api-access-48qm9\") pod \"auto-csr-approver-29557982-gxdxv\" (UID: \"fb9ef94d-9b0c-4f1d-a487-bcfc42336213\") " pod="openshift-infra/auto-csr-approver-29557982-gxdxv" Mar 14 09:02:00 crc kubenswrapper[4886]: I0314 09:02:00.534846 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557982-gxdxv" Mar 14 09:02:00 crc kubenswrapper[4886]: I0314 09:02:00.794332 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557982-gxdxv"] Mar 14 09:02:01 crc kubenswrapper[4886]: I0314 09:02:01.041408 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ggzbn"] Mar 14 09:02:01 crc kubenswrapper[4886]: I0314 09:02:01.049958 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ggzbn"] Mar 14 09:02:01 crc kubenswrapper[4886]: I0314 09:02:01.233424 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557982-gxdxv" event={"ID":"fb9ef94d-9b0c-4f1d-a487-bcfc42336213","Type":"ContainerStarted","Data":"0c8bbaa179d25abd88401b7a0a7122f128eddea0921e79f8552114552db5b7fb"} Mar 14 09:02:01 crc kubenswrapper[4886]: I0314 09:02:01.436646 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0453f23e-955b-4cb7-8f57-285144677bc7" path="/var/lib/kubelet/pods/0453f23e-955b-4cb7-8f57-285144677bc7/volumes" Mar 14 09:02:02 crc kubenswrapper[4886]: I0314 09:02:02.245306 4886 generic.go:334] "Generic (PLEG): container finished" podID="fb9ef94d-9b0c-4f1d-a487-bcfc42336213" containerID="076e3be3787fc0beaf6adeee0ce5ed91176242689c6ff8a1a8f964319f058d76" exitCode=0 Mar 14 09:02:02 crc kubenswrapper[4886]: I0314 09:02:02.245382 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557982-gxdxv" event={"ID":"fb9ef94d-9b0c-4f1d-a487-bcfc42336213","Type":"ContainerDied","Data":"076e3be3787fc0beaf6adeee0ce5ed91176242689c6ff8a1a8f964319f058d76"} Mar 14 09:02:03 crc kubenswrapper[4886]: I0314 09:02:03.635884 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557982-gxdxv" Mar 14 09:02:03 crc kubenswrapper[4886]: I0314 09:02:03.756827 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48qm9\" (UniqueName: \"kubernetes.io/projected/fb9ef94d-9b0c-4f1d-a487-bcfc42336213-kube-api-access-48qm9\") pod \"fb9ef94d-9b0c-4f1d-a487-bcfc42336213\" (UID: \"fb9ef94d-9b0c-4f1d-a487-bcfc42336213\") " Mar 14 09:02:03 crc kubenswrapper[4886]: I0314 09:02:03.765286 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb9ef94d-9b0c-4f1d-a487-bcfc42336213-kube-api-access-48qm9" (OuterVolumeSpecName: "kube-api-access-48qm9") pod "fb9ef94d-9b0c-4f1d-a487-bcfc42336213" (UID: "fb9ef94d-9b0c-4f1d-a487-bcfc42336213"). InnerVolumeSpecName "kube-api-access-48qm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:02:03 crc kubenswrapper[4886]: I0314 09:02:03.859844 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48qm9\" (UniqueName: \"kubernetes.io/projected/fb9ef94d-9b0c-4f1d-a487-bcfc42336213-kube-api-access-48qm9\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:04 crc kubenswrapper[4886]: I0314 09:02:04.271906 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557982-gxdxv" event={"ID":"fb9ef94d-9b0c-4f1d-a487-bcfc42336213","Type":"ContainerDied","Data":"0c8bbaa179d25abd88401b7a0a7122f128eddea0921e79f8552114552db5b7fb"} Mar 14 09:02:04 crc kubenswrapper[4886]: I0314 09:02:04.271980 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c8bbaa179d25abd88401b7a0a7122f128eddea0921e79f8552114552db5b7fb" Mar 14 09:02:04 crc kubenswrapper[4886]: I0314 09:02:04.272050 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557982-gxdxv" Mar 14 09:02:04 crc kubenswrapper[4886]: I0314 09:02:04.699592 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557976-lxpv6"] Mar 14 09:02:04 crc kubenswrapper[4886]: I0314 09:02:04.707854 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557976-lxpv6"] Mar 14 09:02:05 crc kubenswrapper[4886]: I0314 09:02:05.435399 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea72f5a8-5ec3-4dfc-908e-a93f3cf6eae2" path="/var/lib/kubelet/pods/ea72f5a8-5ec3-4dfc-908e-a93f3cf6eae2/volumes" Mar 14 09:02:13 crc kubenswrapper[4886]: I0314 09:02:13.381338 4886 generic.go:334] "Generic (PLEG): container finished" podID="58ff1f53-8347-4a4f-9892-a8ba1d8822af" containerID="ef75db87953f72343153546b20983dc7dc3a202719ebda5b0b493c068bf065c2" exitCode=0 Mar 14 09:02:13 crc kubenswrapper[4886]: I0314 09:02:13.381416 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rv55x" event={"ID":"58ff1f53-8347-4a4f-9892-a8ba1d8822af","Type":"ContainerDied","Data":"ef75db87953f72343153546b20983dc7dc3a202719ebda5b0b493c068bf065c2"} Mar 14 09:02:14 crc kubenswrapper[4886]: I0314 09:02:14.850080 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rv55x" Mar 14 09:02:14 crc kubenswrapper[4886]: I0314 09:02:14.934509 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58ff1f53-8347-4a4f-9892-a8ba1d8822af-inventory\") pod \"58ff1f53-8347-4a4f-9892-a8ba1d8822af\" (UID: \"58ff1f53-8347-4a4f-9892-a8ba1d8822af\") " Mar 14 09:02:14 crc kubenswrapper[4886]: I0314 09:02:14.934572 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmmxj\" (UniqueName: \"kubernetes.io/projected/58ff1f53-8347-4a4f-9892-a8ba1d8822af-kube-api-access-rmmxj\") pod \"58ff1f53-8347-4a4f-9892-a8ba1d8822af\" (UID: \"58ff1f53-8347-4a4f-9892-a8ba1d8822af\") " Mar 14 09:02:14 crc kubenswrapper[4886]: I0314 09:02:14.934736 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58ff1f53-8347-4a4f-9892-a8ba1d8822af-ssh-key-openstack-edpm-ipam\") pod \"58ff1f53-8347-4a4f-9892-a8ba1d8822af\" (UID: \"58ff1f53-8347-4a4f-9892-a8ba1d8822af\") " Mar 14 09:02:14 crc kubenswrapper[4886]: I0314 09:02:14.940755 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58ff1f53-8347-4a4f-9892-a8ba1d8822af-kube-api-access-rmmxj" (OuterVolumeSpecName: "kube-api-access-rmmxj") pod "58ff1f53-8347-4a4f-9892-a8ba1d8822af" (UID: "58ff1f53-8347-4a4f-9892-a8ba1d8822af"). InnerVolumeSpecName "kube-api-access-rmmxj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:02:14 crc kubenswrapper[4886]: I0314 09:02:14.963737 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58ff1f53-8347-4a4f-9892-a8ba1d8822af-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "58ff1f53-8347-4a4f-9892-a8ba1d8822af" (UID: "58ff1f53-8347-4a4f-9892-a8ba1d8822af"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:02:14 crc kubenswrapper[4886]: I0314 09:02:14.964481 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58ff1f53-8347-4a4f-9892-a8ba1d8822af-inventory" (OuterVolumeSpecName: "inventory") pod "58ff1f53-8347-4a4f-9892-a8ba1d8822af" (UID: "58ff1f53-8347-4a4f-9892-a8ba1d8822af"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:02:15 crc kubenswrapper[4886]: I0314 09:02:15.037863 4886 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58ff1f53-8347-4a4f-9892-a8ba1d8822af-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:15 crc kubenswrapper[4886]: I0314 09:02:15.037924 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmmxj\" (UniqueName: \"kubernetes.io/projected/58ff1f53-8347-4a4f-9892-a8ba1d8822af-kube-api-access-rmmxj\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:15 crc kubenswrapper[4886]: I0314 09:02:15.037952 4886 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58ff1f53-8347-4a4f-9892-a8ba1d8822af-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 09:02:15 crc kubenswrapper[4886]: I0314 09:02:15.442590 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rv55x" Mar 14 09:02:15 crc kubenswrapper[4886]: I0314 09:02:15.444366 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rv55x" event={"ID":"58ff1f53-8347-4a4f-9892-a8ba1d8822af","Type":"ContainerDied","Data":"04b868d7e1dc3612ca90c57f8be23672e3004dfc507d0a67d9869dc6a8fa2412"} Mar 14 09:02:15 crc kubenswrapper[4886]: I0314 09:02:15.444421 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04b868d7e1dc3612ca90c57f8be23672e3004dfc507d0a67d9869dc6a8fa2412" Mar 14 09:02:15 crc kubenswrapper[4886]: I0314 09:02:15.540464 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5bggv"] Mar 14 09:02:15 crc kubenswrapper[4886]: E0314 09:02:15.541880 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58ff1f53-8347-4a4f-9892-a8ba1d8822af" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 14 09:02:15 crc kubenswrapper[4886]: I0314 09:02:15.541920 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="58ff1f53-8347-4a4f-9892-a8ba1d8822af" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 14 09:02:15 crc kubenswrapper[4886]: E0314 09:02:15.542010 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb9ef94d-9b0c-4f1d-a487-bcfc42336213" containerName="oc" Mar 14 09:02:15 crc kubenswrapper[4886]: I0314 09:02:15.542032 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb9ef94d-9b0c-4f1d-a487-bcfc42336213" containerName="oc" Mar 14 09:02:15 crc kubenswrapper[4886]: I0314 09:02:15.542410 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb9ef94d-9b0c-4f1d-a487-bcfc42336213" containerName="oc" Mar 14 09:02:15 crc kubenswrapper[4886]: I0314 09:02:15.542486 4886 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="58ff1f53-8347-4a4f-9892-a8ba1d8822af" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 14 09:02:15 crc kubenswrapper[4886]: I0314 09:02:15.543925 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5bggv" Mar 14 09:02:15 crc kubenswrapper[4886]: I0314 09:02:15.546980 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 09:02:15 crc kubenswrapper[4886]: I0314 09:02:15.547305 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftkvj" Mar 14 09:02:15 crc kubenswrapper[4886]: I0314 09:02:15.547330 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 09:02:15 crc kubenswrapper[4886]: I0314 09:02:15.547481 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 09:02:15 crc kubenswrapper[4886]: I0314 09:02:15.558248 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5bggv"] Mar 14 09:02:15 crc kubenswrapper[4886]: I0314 09:02:15.659046 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b138089b-be7e-480e-8c8f-37104053a419-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5bggv\" (UID: \"b138089b-be7e-480e-8c8f-37104053a419\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5bggv" Mar 14 09:02:15 crc kubenswrapper[4886]: I0314 09:02:15.659177 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b138089b-be7e-480e-8c8f-37104053a419-ssh-key-openstack-edpm-ipam\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-5bggv\" (UID: \"b138089b-be7e-480e-8c8f-37104053a419\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5bggv" Mar 14 09:02:15 crc kubenswrapper[4886]: I0314 09:02:15.660020 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c52wx\" (UniqueName: \"kubernetes.io/projected/b138089b-be7e-480e-8c8f-37104053a419-kube-api-access-c52wx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5bggv\" (UID: \"b138089b-be7e-480e-8c8f-37104053a419\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5bggv" Mar 14 09:02:15 crc kubenswrapper[4886]: I0314 09:02:15.763878 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c52wx\" (UniqueName: \"kubernetes.io/projected/b138089b-be7e-480e-8c8f-37104053a419-kube-api-access-c52wx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5bggv\" (UID: \"b138089b-be7e-480e-8c8f-37104053a419\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5bggv" Mar 14 09:02:15 crc kubenswrapper[4886]: I0314 09:02:15.764206 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b138089b-be7e-480e-8c8f-37104053a419-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5bggv\" (UID: \"b138089b-be7e-480e-8c8f-37104053a419\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5bggv" Mar 14 09:02:15 crc kubenswrapper[4886]: I0314 09:02:15.764260 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b138089b-be7e-480e-8c8f-37104053a419-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5bggv\" (UID: \"b138089b-be7e-480e-8c8f-37104053a419\") " 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5bggv" Mar 14 09:02:15 crc kubenswrapper[4886]: I0314 09:02:15.769885 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b138089b-be7e-480e-8c8f-37104053a419-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5bggv\" (UID: \"b138089b-be7e-480e-8c8f-37104053a419\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5bggv" Mar 14 09:02:15 crc kubenswrapper[4886]: I0314 09:02:15.778235 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b138089b-be7e-480e-8c8f-37104053a419-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5bggv\" (UID: \"b138089b-be7e-480e-8c8f-37104053a419\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5bggv" Mar 14 09:02:15 crc kubenswrapper[4886]: I0314 09:02:15.791151 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c52wx\" (UniqueName: \"kubernetes.io/projected/b138089b-be7e-480e-8c8f-37104053a419-kube-api-access-c52wx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5bggv\" (UID: \"b138089b-be7e-480e-8c8f-37104053a419\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5bggv" Mar 14 09:02:15 crc kubenswrapper[4886]: I0314 09:02:15.865662 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5bggv" Mar 14 09:02:16 crc kubenswrapper[4886]: I0314 09:02:16.236218 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5bggv"] Mar 14 09:02:16 crc kubenswrapper[4886]: I0314 09:02:16.436873 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5bggv" event={"ID":"b138089b-be7e-480e-8c8f-37104053a419","Type":"ContainerStarted","Data":"1c5aeff52b68fa26c29ebc82120b3904d3f39606b26f2ab500246050bdf7efc3"} Mar 14 09:02:17 crc kubenswrapper[4886]: I0314 09:02:17.453221 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5bggv" event={"ID":"b138089b-be7e-480e-8c8f-37104053a419","Type":"ContainerStarted","Data":"8abf551b23a5c5f6dd87ba729a25807a03e747e742f0bd8492b8cae8e39944eb"} Mar 14 09:02:17 crc kubenswrapper[4886]: I0314 09:02:17.488403 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5bggv" podStartSLOduration=2.069251253 podStartE2EDuration="2.488376913s" podCreationTimestamp="2026-03-14 09:02:15 +0000 UTC" firstStartedPulling="2026-03-14 09:02:16.240067472 +0000 UTC m=+2071.488519109" lastFinishedPulling="2026-03-14 09:02:16.659193132 +0000 UTC m=+2071.907644769" observedRunningTime="2026-03-14 09:02:17.479390657 +0000 UTC m=+2072.727842304" watchObservedRunningTime="2026-03-14 09:02:17.488376913 +0000 UTC m=+2072.736828550" Mar 14 09:02:26 crc kubenswrapper[4886]: I0314 09:02:26.066232 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:02:26 crc 
kubenswrapper[4886]: I0314 09:02:26.067040 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:02:26 crc kubenswrapper[4886]: I0314 09:02:26.067087 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 09:02:26 crc kubenswrapper[4886]: I0314 09:02:26.067851 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8d76383c767d4d7cb6a175fd55c92a6a7210c3d14dfdddef7efb953dd6c3ec5b"} pod="openshift-machine-config-operator/machine-config-daemon-ddctv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 09:02:26 crc kubenswrapper[4886]: I0314 09:02:26.067910 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" containerID="cri-o://8d76383c767d4d7cb6a175fd55c92a6a7210c3d14dfdddef7efb953dd6c3ec5b" gracePeriod=600 Mar 14 09:02:26 crc kubenswrapper[4886]: I0314 09:02:26.561770 4886 generic.go:334] "Generic (PLEG): container finished" podID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerID="8d76383c767d4d7cb6a175fd55c92a6a7210c3d14dfdddef7efb953dd6c3ec5b" exitCode=0 Mar 14 09:02:26 crc kubenswrapper[4886]: I0314 09:02:26.561875 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerDied","Data":"8d76383c767d4d7cb6a175fd55c92a6a7210c3d14dfdddef7efb953dd6c3ec5b"} 
Mar 14 09:02:26 crc kubenswrapper[4886]: I0314 09:02:26.562440 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerStarted","Data":"c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551"} Mar 14 09:02:26 crc kubenswrapper[4886]: I0314 09:02:26.562507 4886 scope.go:117] "RemoveContainer" containerID="7ab2411d382aa74c9e39c272ecf1f4656fd25781ef375eab060b22a13c4415bc" Mar 14 09:02:34 crc kubenswrapper[4886]: I0314 09:02:34.204905 4886 scope.go:117] "RemoveContainer" containerID="ba18efa88eb6c20a44b891f4747f36acc1447f1ab84ac4cf7820a3240379480e" Mar 14 09:02:34 crc kubenswrapper[4886]: I0314 09:02:34.254401 4886 scope.go:117] "RemoveContainer" containerID="d486b5e38d7bdb2e2511e9e483fbb4cc13f693d0b87a9e43c033c303ef5053d2" Mar 14 09:02:34 crc kubenswrapper[4886]: I0314 09:02:34.305423 4886 scope.go:117] "RemoveContainer" containerID="c209e61eecdb4137304724149ddf492fa6c6d038b5cb91f572d1b065a6fde594" Mar 14 09:02:34 crc kubenswrapper[4886]: I0314 09:02:34.366955 4886 scope.go:117] "RemoveContainer" containerID="f111a63cf28cd315c5dadb37d0e511707539687b98e00cdd9df82cf9bb223ea3" Mar 14 09:02:43 crc kubenswrapper[4886]: I0314 09:02:43.042988 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-r5gnt"] Mar 14 09:02:43 crc kubenswrapper[4886]: I0314 09:02:43.051817 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-r5gnt"] Mar 14 09:02:43 crc kubenswrapper[4886]: I0314 09:02:43.435916 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="392f2ed5-f494-4474-832a-a208bd72b1fa" path="/var/lib/kubelet/pods/392f2ed5-f494-4474-832a-a208bd72b1fa/volumes" Mar 14 09:03:12 crc kubenswrapper[4886]: I0314 09:03:12.138789 4886 generic.go:334] "Generic (PLEG): container finished" podID="b138089b-be7e-480e-8c8f-37104053a419" 
containerID="8abf551b23a5c5f6dd87ba729a25807a03e747e742f0bd8492b8cae8e39944eb" exitCode=0 Mar 14 09:03:12 crc kubenswrapper[4886]: I0314 09:03:12.138932 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5bggv" event={"ID":"b138089b-be7e-480e-8c8f-37104053a419","Type":"ContainerDied","Data":"8abf551b23a5c5f6dd87ba729a25807a03e747e742f0bd8492b8cae8e39944eb"} Mar 14 09:03:13 crc kubenswrapper[4886]: I0314 09:03:13.617054 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5bggv" Mar 14 09:03:13 crc kubenswrapper[4886]: I0314 09:03:13.661293 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b138089b-be7e-480e-8c8f-37104053a419-inventory\") pod \"b138089b-be7e-480e-8c8f-37104053a419\" (UID: \"b138089b-be7e-480e-8c8f-37104053a419\") " Mar 14 09:03:13 crc kubenswrapper[4886]: I0314 09:03:13.661619 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b138089b-be7e-480e-8c8f-37104053a419-ssh-key-openstack-edpm-ipam\") pod \"b138089b-be7e-480e-8c8f-37104053a419\" (UID: \"b138089b-be7e-480e-8c8f-37104053a419\") " Mar 14 09:03:13 crc kubenswrapper[4886]: I0314 09:03:13.661685 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c52wx\" (UniqueName: \"kubernetes.io/projected/b138089b-be7e-480e-8c8f-37104053a419-kube-api-access-c52wx\") pod \"b138089b-be7e-480e-8c8f-37104053a419\" (UID: \"b138089b-be7e-480e-8c8f-37104053a419\") " Mar 14 09:03:13 crc kubenswrapper[4886]: I0314 09:03:13.672346 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b138089b-be7e-480e-8c8f-37104053a419-kube-api-access-c52wx" (OuterVolumeSpecName: 
"kube-api-access-c52wx") pod "b138089b-be7e-480e-8c8f-37104053a419" (UID: "b138089b-be7e-480e-8c8f-37104053a419"). InnerVolumeSpecName "kube-api-access-c52wx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:03:13 crc kubenswrapper[4886]: I0314 09:03:13.693069 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b138089b-be7e-480e-8c8f-37104053a419-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b138089b-be7e-480e-8c8f-37104053a419" (UID: "b138089b-be7e-480e-8c8f-37104053a419"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:03:13 crc kubenswrapper[4886]: I0314 09:03:13.699060 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b138089b-be7e-480e-8c8f-37104053a419-inventory" (OuterVolumeSpecName: "inventory") pod "b138089b-be7e-480e-8c8f-37104053a419" (UID: "b138089b-be7e-480e-8c8f-37104053a419"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:03:13 crc kubenswrapper[4886]: I0314 09:03:13.765041 4886 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b138089b-be7e-480e-8c8f-37104053a419-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:13 crc kubenswrapper[4886]: I0314 09:03:13.765112 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c52wx\" (UniqueName: \"kubernetes.io/projected/b138089b-be7e-480e-8c8f-37104053a419-kube-api-access-c52wx\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:13 crc kubenswrapper[4886]: I0314 09:03:13.765162 4886 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b138089b-be7e-480e-8c8f-37104053a419-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:14 crc kubenswrapper[4886]: I0314 09:03:14.159181 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5bggv" event={"ID":"b138089b-be7e-480e-8c8f-37104053a419","Type":"ContainerDied","Data":"1c5aeff52b68fa26c29ebc82120b3904d3f39606b26f2ab500246050bdf7efc3"} Mar 14 09:03:14 crc kubenswrapper[4886]: I0314 09:03:14.159222 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c5aeff52b68fa26c29ebc82120b3904d3f39606b26f2ab500246050bdf7efc3" Mar 14 09:03:14 crc kubenswrapper[4886]: I0314 09:03:14.159277 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5bggv" Mar 14 09:03:14 crc kubenswrapper[4886]: I0314 09:03:14.284498 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-qgkmx"] Mar 14 09:03:14 crc kubenswrapper[4886]: E0314 09:03:14.285199 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b138089b-be7e-480e-8c8f-37104053a419" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 14 09:03:14 crc kubenswrapper[4886]: I0314 09:03:14.285222 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="b138089b-be7e-480e-8c8f-37104053a419" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 14 09:03:14 crc kubenswrapper[4886]: I0314 09:03:14.285491 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="b138089b-be7e-480e-8c8f-37104053a419" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 14 09:03:14 crc kubenswrapper[4886]: I0314 09:03:14.286313 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-qgkmx" Mar 14 09:03:14 crc kubenswrapper[4886]: I0314 09:03:14.291332 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 09:03:14 crc kubenswrapper[4886]: I0314 09:03:14.296620 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftkvj" Mar 14 09:03:14 crc kubenswrapper[4886]: I0314 09:03:14.296686 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 09:03:14 crc kubenswrapper[4886]: I0314 09:03:14.297043 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 09:03:14 crc kubenswrapper[4886]: I0314 09:03:14.301645 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-qgkmx"] Mar 14 09:03:14 crc kubenswrapper[4886]: I0314 09:03:14.379565 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02fe85b9-cfc9-455d-ac92-ef1a3c16f729-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-qgkmx\" (UID: \"02fe85b9-cfc9-455d-ac92-ef1a3c16f729\") " pod="openstack/ssh-known-hosts-edpm-deployment-qgkmx" Mar 14 09:03:14 crc kubenswrapper[4886]: I0314 09:03:14.379734 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/02fe85b9-cfc9-455d-ac92-ef1a3c16f729-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-qgkmx\" (UID: \"02fe85b9-cfc9-455d-ac92-ef1a3c16f729\") " pod="openstack/ssh-known-hosts-edpm-deployment-qgkmx" Mar 14 09:03:14 crc kubenswrapper[4886]: I0314 09:03:14.380319 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2zbl8\" (UniqueName: \"kubernetes.io/projected/02fe85b9-cfc9-455d-ac92-ef1a3c16f729-kube-api-access-2zbl8\") pod \"ssh-known-hosts-edpm-deployment-qgkmx\" (UID: \"02fe85b9-cfc9-455d-ac92-ef1a3c16f729\") " pod="openstack/ssh-known-hosts-edpm-deployment-qgkmx" Mar 14 09:03:14 crc kubenswrapper[4886]: I0314 09:03:14.483714 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zbl8\" (UniqueName: \"kubernetes.io/projected/02fe85b9-cfc9-455d-ac92-ef1a3c16f729-kube-api-access-2zbl8\") pod \"ssh-known-hosts-edpm-deployment-qgkmx\" (UID: \"02fe85b9-cfc9-455d-ac92-ef1a3c16f729\") " pod="openstack/ssh-known-hosts-edpm-deployment-qgkmx" Mar 14 09:03:14 crc kubenswrapper[4886]: I0314 09:03:14.483793 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02fe85b9-cfc9-455d-ac92-ef1a3c16f729-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-qgkmx\" (UID: \"02fe85b9-cfc9-455d-ac92-ef1a3c16f729\") " pod="openstack/ssh-known-hosts-edpm-deployment-qgkmx" Mar 14 09:03:14 crc kubenswrapper[4886]: I0314 09:03:14.483858 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/02fe85b9-cfc9-455d-ac92-ef1a3c16f729-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-qgkmx\" (UID: \"02fe85b9-cfc9-455d-ac92-ef1a3c16f729\") " pod="openstack/ssh-known-hosts-edpm-deployment-qgkmx" Mar 14 09:03:14 crc kubenswrapper[4886]: I0314 09:03:14.494878 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/02fe85b9-cfc9-455d-ac92-ef1a3c16f729-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-qgkmx\" (UID: \"02fe85b9-cfc9-455d-ac92-ef1a3c16f729\") " pod="openstack/ssh-known-hosts-edpm-deployment-qgkmx" Mar 14 09:03:14 crc kubenswrapper[4886]: I0314 09:03:14.494964 4886 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02fe85b9-cfc9-455d-ac92-ef1a3c16f729-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-qgkmx\" (UID: \"02fe85b9-cfc9-455d-ac92-ef1a3c16f729\") " pod="openstack/ssh-known-hosts-edpm-deployment-qgkmx" Mar 14 09:03:14 crc kubenswrapper[4886]: I0314 09:03:14.515939 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zbl8\" (UniqueName: \"kubernetes.io/projected/02fe85b9-cfc9-455d-ac92-ef1a3c16f729-kube-api-access-2zbl8\") pod \"ssh-known-hosts-edpm-deployment-qgkmx\" (UID: \"02fe85b9-cfc9-455d-ac92-ef1a3c16f729\") " pod="openstack/ssh-known-hosts-edpm-deployment-qgkmx" Mar 14 09:03:14 crc kubenswrapper[4886]: I0314 09:03:14.607217 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-qgkmx" Mar 14 09:03:15 crc kubenswrapper[4886]: I0314 09:03:15.222362 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-qgkmx"] Mar 14 09:03:16 crc kubenswrapper[4886]: I0314 09:03:16.184389 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-qgkmx" event={"ID":"02fe85b9-cfc9-455d-ac92-ef1a3c16f729","Type":"ContainerStarted","Data":"a53a5edda18ab7d4111b1b41271e053db55e3127682aa4d2d25d4b42e83655fb"} Mar 14 09:03:16 crc kubenswrapper[4886]: I0314 09:03:16.184741 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-qgkmx" event={"ID":"02fe85b9-cfc9-455d-ac92-ef1a3c16f729","Type":"ContainerStarted","Data":"beef0de1e2ee25842d2012f87eac2321949a2205853da07a7280bd4908d3066a"} Mar 14 09:03:16 crc kubenswrapper[4886]: I0314 09:03:16.206362 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-qgkmx" 
podStartSLOduration=1.526403282 podStartE2EDuration="2.206344067s" podCreationTimestamp="2026-03-14 09:03:14 +0000 UTC" firstStartedPulling="2026-03-14 09:03:15.230625689 +0000 UTC m=+2130.479077336" lastFinishedPulling="2026-03-14 09:03:15.910566484 +0000 UTC m=+2131.159018121" observedRunningTime="2026-03-14 09:03:16.204809723 +0000 UTC m=+2131.453261360" watchObservedRunningTime="2026-03-14 09:03:16.206344067 +0000 UTC m=+2131.454795704" Mar 14 09:03:23 crc kubenswrapper[4886]: I0314 09:03:23.278010 4886 generic.go:334] "Generic (PLEG): container finished" podID="02fe85b9-cfc9-455d-ac92-ef1a3c16f729" containerID="a53a5edda18ab7d4111b1b41271e053db55e3127682aa4d2d25d4b42e83655fb" exitCode=0 Mar 14 09:03:23 crc kubenswrapper[4886]: I0314 09:03:23.278383 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-qgkmx" event={"ID":"02fe85b9-cfc9-455d-ac92-ef1a3c16f729","Type":"ContainerDied","Data":"a53a5edda18ab7d4111b1b41271e053db55e3127682aa4d2d25d4b42e83655fb"} Mar 14 09:03:24 crc kubenswrapper[4886]: I0314 09:03:24.790687 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-qgkmx" Mar 14 09:03:24 crc kubenswrapper[4886]: I0314 09:03:24.847479 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zbl8\" (UniqueName: \"kubernetes.io/projected/02fe85b9-cfc9-455d-ac92-ef1a3c16f729-kube-api-access-2zbl8\") pod \"02fe85b9-cfc9-455d-ac92-ef1a3c16f729\" (UID: \"02fe85b9-cfc9-455d-ac92-ef1a3c16f729\") " Mar 14 09:03:24 crc kubenswrapper[4886]: I0314 09:03:24.847577 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02fe85b9-cfc9-455d-ac92-ef1a3c16f729-ssh-key-openstack-edpm-ipam\") pod \"02fe85b9-cfc9-455d-ac92-ef1a3c16f729\" (UID: \"02fe85b9-cfc9-455d-ac92-ef1a3c16f729\") " Mar 14 09:03:24 crc kubenswrapper[4886]: I0314 09:03:24.847682 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/02fe85b9-cfc9-455d-ac92-ef1a3c16f729-inventory-0\") pod \"02fe85b9-cfc9-455d-ac92-ef1a3c16f729\" (UID: \"02fe85b9-cfc9-455d-ac92-ef1a3c16f729\") " Mar 14 09:03:24 crc kubenswrapper[4886]: I0314 09:03:24.854315 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02fe85b9-cfc9-455d-ac92-ef1a3c16f729-kube-api-access-2zbl8" (OuterVolumeSpecName: "kube-api-access-2zbl8") pod "02fe85b9-cfc9-455d-ac92-ef1a3c16f729" (UID: "02fe85b9-cfc9-455d-ac92-ef1a3c16f729"). InnerVolumeSpecName "kube-api-access-2zbl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:03:24 crc kubenswrapper[4886]: I0314 09:03:24.881208 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02fe85b9-cfc9-455d-ac92-ef1a3c16f729-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "02fe85b9-cfc9-455d-ac92-ef1a3c16f729" (UID: "02fe85b9-cfc9-455d-ac92-ef1a3c16f729"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:03:24 crc kubenswrapper[4886]: I0314 09:03:24.909489 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02fe85b9-cfc9-455d-ac92-ef1a3c16f729-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "02fe85b9-cfc9-455d-ac92-ef1a3c16f729" (UID: "02fe85b9-cfc9-455d-ac92-ef1a3c16f729"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:03:24 crc kubenswrapper[4886]: I0314 09:03:24.950791 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zbl8\" (UniqueName: \"kubernetes.io/projected/02fe85b9-cfc9-455d-ac92-ef1a3c16f729-kube-api-access-2zbl8\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:24 crc kubenswrapper[4886]: I0314 09:03:24.950839 4886 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02fe85b9-cfc9-455d-ac92-ef1a3c16f729-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:24 crc kubenswrapper[4886]: I0314 09:03:24.950851 4886 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/02fe85b9-cfc9-455d-ac92-ef1a3c16f729-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:25 crc kubenswrapper[4886]: I0314 09:03:25.307457 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-qgkmx" event={"ID":"02fe85b9-cfc9-455d-ac92-ef1a3c16f729","Type":"ContainerDied","Data":"beef0de1e2ee25842d2012f87eac2321949a2205853da07a7280bd4908d3066a"} Mar 14 09:03:25 crc kubenswrapper[4886]: I0314 09:03:25.307759 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="beef0de1e2ee25842d2012f87eac2321949a2205853da07a7280bd4908d3066a" Mar 14 09:03:25 crc kubenswrapper[4886]: I0314 09:03:25.307554 
4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-qgkmx" Mar 14 09:03:25 crc kubenswrapper[4886]: I0314 09:03:25.440334 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-sp629"] Mar 14 09:03:25 crc kubenswrapper[4886]: E0314 09:03:25.440676 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02fe85b9-cfc9-455d-ac92-ef1a3c16f729" containerName="ssh-known-hosts-edpm-deployment" Mar 14 09:03:25 crc kubenswrapper[4886]: I0314 09:03:25.440692 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="02fe85b9-cfc9-455d-ac92-ef1a3c16f729" containerName="ssh-known-hosts-edpm-deployment" Mar 14 09:03:25 crc kubenswrapper[4886]: I0314 09:03:25.440876 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="02fe85b9-cfc9-455d-ac92-ef1a3c16f729" containerName="ssh-known-hosts-edpm-deployment" Mar 14 09:03:25 crc kubenswrapper[4886]: I0314 09:03:25.441581 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sp629" Mar 14 09:03:25 crc kubenswrapper[4886]: I0314 09:03:25.443978 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-sp629"] Mar 14 09:03:25 crc kubenswrapper[4886]: I0314 09:03:25.449689 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 09:03:25 crc kubenswrapper[4886]: I0314 09:03:25.449874 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 09:03:25 crc kubenswrapper[4886]: I0314 09:03:25.450357 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 09:03:25 crc kubenswrapper[4886]: I0314 09:03:25.450463 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftkvj" Mar 14 09:03:25 crc kubenswrapper[4886]: I0314 09:03:25.570066 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1f2ef66-1863-484e-ba0e-d2ed4663453d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sp629\" (UID: \"b1f2ef66-1863-484e-ba0e-d2ed4663453d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sp629" Mar 14 09:03:25 crc kubenswrapper[4886]: I0314 09:03:25.570161 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1f2ef66-1863-484e-ba0e-d2ed4663453d-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sp629\" (UID: \"b1f2ef66-1863-484e-ba0e-d2ed4663453d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sp629" Mar 14 09:03:25 crc kubenswrapper[4886]: I0314 09:03:25.570509 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h25p\" (UniqueName: \"kubernetes.io/projected/b1f2ef66-1863-484e-ba0e-d2ed4663453d-kube-api-access-5h25p\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sp629\" (UID: \"b1f2ef66-1863-484e-ba0e-d2ed4663453d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sp629" Mar 14 09:03:25 crc kubenswrapper[4886]: I0314 09:03:25.671889 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h25p\" (UniqueName: \"kubernetes.io/projected/b1f2ef66-1863-484e-ba0e-d2ed4663453d-kube-api-access-5h25p\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sp629\" (UID: \"b1f2ef66-1863-484e-ba0e-d2ed4663453d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sp629" Mar 14 09:03:25 crc kubenswrapper[4886]: I0314 09:03:25.672024 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1f2ef66-1863-484e-ba0e-d2ed4663453d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sp629\" (UID: \"b1f2ef66-1863-484e-ba0e-d2ed4663453d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sp629" Mar 14 09:03:25 crc kubenswrapper[4886]: I0314 09:03:25.672064 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1f2ef66-1863-484e-ba0e-d2ed4663453d-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sp629\" (UID: \"b1f2ef66-1863-484e-ba0e-d2ed4663453d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sp629" Mar 14 09:03:25 crc kubenswrapper[4886]: I0314 09:03:25.677828 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1f2ef66-1863-484e-ba0e-d2ed4663453d-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-sp629\" (UID: \"b1f2ef66-1863-484e-ba0e-d2ed4663453d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sp629" Mar 14 09:03:25 crc kubenswrapper[4886]: I0314 09:03:25.677915 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1f2ef66-1863-484e-ba0e-d2ed4663453d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sp629\" (UID: \"b1f2ef66-1863-484e-ba0e-d2ed4663453d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sp629" Mar 14 09:03:25 crc kubenswrapper[4886]: I0314 09:03:25.690330 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h25p\" (UniqueName: \"kubernetes.io/projected/b1f2ef66-1863-484e-ba0e-d2ed4663453d-kube-api-access-5h25p\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sp629\" (UID: \"b1f2ef66-1863-484e-ba0e-d2ed4663453d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sp629" Mar 14 09:03:25 crc kubenswrapper[4886]: I0314 09:03:25.767009 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sp629" Mar 14 09:03:26 crc kubenswrapper[4886]: I0314 09:03:26.325951 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-sp629"] Mar 14 09:03:26 crc kubenswrapper[4886]: I0314 09:03:26.337692 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 09:03:27 crc kubenswrapper[4886]: I0314 09:03:27.326985 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sp629" event={"ID":"b1f2ef66-1863-484e-ba0e-d2ed4663453d","Type":"ContainerStarted","Data":"5065ab33d2c8dd104d2d01f8e956548a884830ea2b7ac093eaabf963ed116a6a"} Mar 14 09:03:27 crc kubenswrapper[4886]: I0314 09:03:27.327844 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sp629" event={"ID":"b1f2ef66-1863-484e-ba0e-d2ed4663453d","Type":"ContainerStarted","Data":"f2d0421a07d65e4c4c401ae6c4a5ea506b2c1127f939522eb7fea4dd95e86993"} Mar 14 09:03:27 crc kubenswrapper[4886]: I0314 09:03:27.362571 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sp629" podStartSLOduration=1.906157619 podStartE2EDuration="2.362547337s" podCreationTimestamp="2026-03-14 09:03:25 +0000 UTC" firstStartedPulling="2026-03-14 09:03:26.337498003 +0000 UTC m=+2141.585949640" lastFinishedPulling="2026-03-14 09:03:26.793887721 +0000 UTC m=+2142.042339358" observedRunningTime="2026-03-14 09:03:27.351654491 +0000 UTC m=+2142.600106138" watchObservedRunningTime="2026-03-14 09:03:27.362547337 +0000 UTC m=+2142.610998974" Mar 14 09:03:34 crc kubenswrapper[4886]: I0314 09:03:34.504357 4886 scope.go:117] "RemoveContainer" containerID="5a080999a098b752548f15b262235bd5172c50585c336f8b1d3a19c433046030" Mar 14 09:03:36 crc kubenswrapper[4886]: I0314 09:03:36.416189 
4886 generic.go:334] "Generic (PLEG): container finished" podID="b1f2ef66-1863-484e-ba0e-d2ed4663453d" containerID="5065ab33d2c8dd104d2d01f8e956548a884830ea2b7ac093eaabf963ed116a6a" exitCode=0 Mar 14 09:03:36 crc kubenswrapper[4886]: I0314 09:03:36.416242 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sp629" event={"ID":"b1f2ef66-1863-484e-ba0e-d2ed4663453d","Type":"ContainerDied","Data":"5065ab33d2c8dd104d2d01f8e956548a884830ea2b7ac093eaabf963ed116a6a"} Mar 14 09:03:37 crc kubenswrapper[4886]: I0314 09:03:37.890899 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sp629" Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.055777 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h25p\" (UniqueName: \"kubernetes.io/projected/b1f2ef66-1863-484e-ba0e-d2ed4663453d-kube-api-access-5h25p\") pod \"b1f2ef66-1863-484e-ba0e-d2ed4663453d\" (UID: \"b1f2ef66-1863-484e-ba0e-d2ed4663453d\") " Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.056225 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1f2ef66-1863-484e-ba0e-d2ed4663453d-ssh-key-openstack-edpm-ipam\") pod \"b1f2ef66-1863-484e-ba0e-d2ed4663453d\" (UID: \"b1f2ef66-1863-484e-ba0e-d2ed4663453d\") " Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.056400 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1f2ef66-1863-484e-ba0e-d2ed4663453d-inventory\") pod \"b1f2ef66-1863-484e-ba0e-d2ed4663453d\" (UID: \"b1f2ef66-1863-484e-ba0e-d2ed4663453d\") " Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.066024 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b1f2ef66-1863-484e-ba0e-d2ed4663453d-kube-api-access-5h25p" (OuterVolumeSpecName: "kube-api-access-5h25p") pod "b1f2ef66-1863-484e-ba0e-d2ed4663453d" (UID: "b1f2ef66-1863-484e-ba0e-d2ed4663453d"). InnerVolumeSpecName "kube-api-access-5h25p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.086700 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1f2ef66-1863-484e-ba0e-d2ed4663453d-inventory" (OuterVolumeSpecName: "inventory") pod "b1f2ef66-1863-484e-ba0e-d2ed4663453d" (UID: "b1f2ef66-1863-484e-ba0e-d2ed4663453d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.088692 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1f2ef66-1863-484e-ba0e-d2ed4663453d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b1f2ef66-1863-484e-ba0e-d2ed4663453d" (UID: "b1f2ef66-1863-484e-ba0e-d2ed4663453d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.159561 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5h25p\" (UniqueName: \"kubernetes.io/projected/b1f2ef66-1863-484e-ba0e-d2ed4663453d-kube-api-access-5h25p\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.159617 4886 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1f2ef66-1863-484e-ba0e-d2ed4663453d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.159633 4886 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1f2ef66-1863-484e-ba0e-d2ed4663453d-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.446887 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sp629" event={"ID":"b1f2ef66-1863-484e-ba0e-d2ed4663453d","Type":"ContainerDied","Data":"f2d0421a07d65e4c4c401ae6c4a5ea506b2c1127f939522eb7fea4dd95e86993"} Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.446956 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2d0421a07d65e4c4c401ae6c4a5ea506b2c1127f939522eb7fea4dd95e86993" Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.446970 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sp629" Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.549196 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck"] Mar 14 09:03:38 crc kubenswrapper[4886]: E0314 09:03:38.549865 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1f2ef66-1863-484e-ba0e-d2ed4663453d" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.549888 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1f2ef66-1863-484e-ba0e-d2ed4663453d" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.550144 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1f2ef66-1863-484e-ba0e-d2ed4663453d" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.551179 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck" Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.554008 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftkvj" Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.554683 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.554927 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.556640 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.565138 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck"] Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.674961 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0b3b087-69b5-4953-a239-aade9af83aaa-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck\" (UID: \"b0b3b087-69b5-4953-a239-aade9af83aaa\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck" Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.675020 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4n6j\" (UniqueName: \"kubernetes.io/projected/b0b3b087-69b5-4953-a239-aade9af83aaa-kube-api-access-x4n6j\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck\" (UID: \"b0b3b087-69b5-4953-a239-aade9af83aaa\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck" Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 
09:03:38.675152 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0b3b087-69b5-4953-a239-aade9af83aaa-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck\" (UID: \"b0b3b087-69b5-4953-a239-aade9af83aaa\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck" Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.777106 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0b3b087-69b5-4953-a239-aade9af83aaa-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck\" (UID: \"b0b3b087-69b5-4953-a239-aade9af83aaa\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck" Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.777168 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4n6j\" (UniqueName: \"kubernetes.io/projected/b0b3b087-69b5-4953-a239-aade9af83aaa-kube-api-access-x4n6j\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck\" (UID: \"b0b3b087-69b5-4953-a239-aade9af83aaa\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck" Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.777254 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0b3b087-69b5-4953-a239-aade9af83aaa-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck\" (UID: \"b0b3b087-69b5-4953-a239-aade9af83aaa\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck" Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.781429 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0b3b087-69b5-4953-a239-aade9af83aaa-ssh-key-openstack-edpm-ipam\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck\" (UID: \"b0b3b087-69b5-4953-a239-aade9af83aaa\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck" Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.781697 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0b3b087-69b5-4953-a239-aade9af83aaa-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck\" (UID: \"b0b3b087-69b5-4953-a239-aade9af83aaa\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck" Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.808762 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4n6j\" (UniqueName: \"kubernetes.io/projected/b0b3b087-69b5-4953-a239-aade9af83aaa-kube-api-access-x4n6j\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck\" (UID: \"b0b3b087-69b5-4953-a239-aade9af83aaa\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck" Mar 14 09:03:38 crc kubenswrapper[4886]: I0314 09:03:38.872183 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck" Mar 14 09:03:39 crc kubenswrapper[4886]: W0314 09:03:39.394100 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0b3b087_69b5_4953_a239_aade9af83aaa.slice/crio-7011ee2562297ba6140a6fc2537cea4b101f203af1053a04f32e9229c22a5de9 WatchSource:0}: Error finding container 7011ee2562297ba6140a6fc2537cea4b101f203af1053a04f32e9229c22a5de9: Status 404 returned error can't find the container with id 7011ee2562297ba6140a6fc2537cea4b101f203af1053a04f32e9229c22a5de9 Mar 14 09:03:39 crc kubenswrapper[4886]: I0314 09:03:39.394682 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck"] Mar 14 09:03:39 crc kubenswrapper[4886]: I0314 09:03:39.459590 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck" event={"ID":"b0b3b087-69b5-4953-a239-aade9af83aaa","Type":"ContainerStarted","Data":"7011ee2562297ba6140a6fc2537cea4b101f203af1053a04f32e9229c22a5de9"} Mar 14 09:03:40 crc kubenswrapper[4886]: I0314 09:03:40.472222 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck" event={"ID":"b0b3b087-69b5-4953-a239-aade9af83aaa","Type":"ContainerStarted","Data":"139c261938d81769dcf6a5de499708162f30347c8b049adb80cb655e57e75bc7"} Mar 14 09:03:40 crc kubenswrapper[4886]: I0314 09:03:40.504498 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck" podStartSLOduration=2.040890444 podStartE2EDuration="2.504466664s" podCreationTimestamp="2026-03-14 09:03:38 +0000 UTC" firstStartedPulling="2026-03-14 09:03:39.396519429 +0000 UTC m=+2154.644971066" lastFinishedPulling="2026-03-14 09:03:39.860095649 +0000 UTC m=+2155.108547286" 
observedRunningTime="2026-03-14 09:03:40.496738447 +0000 UTC m=+2155.745190084" watchObservedRunningTime="2026-03-14 09:03:40.504466664 +0000 UTC m=+2155.752918301" Mar 14 09:03:50 crc kubenswrapper[4886]: I0314 09:03:50.576012 4886 generic.go:334] "Generic (PLEG): container finished" podID="b0b3b087-69b5-4953-a239-aade9af83aaa" containerID="139c261938d81769dcf6a5de499708162f30347c8b049adb80cb655e57e75bc7" exitCode=0 Mar 14 09:03:50 crc kubenswrapper[4886]: I0314 09:03:50.576148 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck" event={"ID":"b0b3b087-69b5-4953-a239-aade9af83aaa","Type":"ContainerDied","Data":"139c261938d81769dcf6a5de499708162f30347c8b049adb80cb655e57e75bc7"} Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.031989 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.186894 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0b3b087-69b5-4953-a239-aade9af83aaa-ssh-key-openstack-edpm-ipam\") pod \"b0b3b087-69b5-4953-a239-aade9af83aaa\" (UID: \"b0b3b087-69b5-4953-a239-aade9af83aaa\") " Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.187032 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4n6j\" (UniqueName: \"kubernetes.io/projected/b0b3b087-69b5-4953-a239-aade9af83aaa-kube-api-access-x4n6j\") pod \"b0b3b087-69b5-4953-a239-aade9af83aaa\" (UID: \"b0b3b087-69b5-4953-a239-aade9af83aaa\") " Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.187051 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0b3b087-69b5-4953-a239-aade9af83aaa-inventory\") pod 
\"b0b3b087-69b5-4953-a239-aade9af83aaa\" (UID: \"b0b3b087-69b5-4953-a239-aade9af83aaa\") " Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.193573 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0b3b087-69b5-4953-a239-aade9af83aaa-kube-api-access-x4n6j" (OuterVolumeSpecName: "kube-api-access-x4n6j") pod "b0b3b087-69b5-4953-a239-aade9af83aaa" (UID: "b0b3b087-69b5-4953-a239-aade9af83aaa"). InnerVolumeSpecName "kube-api-access-x4n6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.215659 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b3b087-69b5-4953-a239-aade9af83aaa-inventory" (OuterVolumeSpecName: "inventory") pod "b0b3b087-69b5-4953-a239-aade9af83aaa" (UID: "b0b3b087-69b5-4953-a239-aade9af83aaa"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.216020 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b3b087-69b5-4953-a239-aade9af83aaa-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b0b3b087-69b5-4953-a239-aade9af83aaa" (UID: "b0b3b087-69b5-4953-a239-aade9af83aaa"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.290041 4886 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0b3b087-69b5-4953-a239-aade9af83aaa-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.290071 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4n6j\" (UniqueName: \"kubernetes.io/projected/b0b3b087-69b5-4953-a239-aade9af83aaa-kube-api-access-x4n6j\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.290080 4886 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0b3b087-69b5-4953-a239-aade9af83aaa-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.595416 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck" event={"ID":"b0b3b087-69b5-4953-a239-aade9af83aaa","Type":"ContainerDied","Data":"7011ee2562297ba6140a6fc2537cea4b101f203af1053a04f32e9229c22a5de9"} Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.595805 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7011ee2562297ba6140a6fc2537cea4b101f203af1053a04f32e9229c22a5de9" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.595467 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.682472 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm"] Mar 14 09:03:52 crc kubenswrapper[4886]: E0314 09:03:52.683054 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b3b087-69b5-4953-a239-aade9af83aaa" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.683084 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b3b087-69b5-4953-a239-aade9af83aaa" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.683357 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b3b087-69b5-4953-a239-aade9af83aaa" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.684257 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.693172 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.693389 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.693497 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.693515 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.693714 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftkvj" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.693850 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.693988 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.694066 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.700676 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm"] Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.801563 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.801621 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.801653 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng49l\" (UniqueName: \"kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-kube-api-access-ng49l\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.801911 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.802003 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.802041 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.802166 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.802228 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.802282 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.802333 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.802382 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.802420 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.802448 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.802562 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.904685 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.904737 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.904759 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-nova-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.904801 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.904822 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.904843 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.904870 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: 
\"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.904896 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.904917 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.904934 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.904968 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.905033 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.905054 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.905070 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng49l\" (UniqueName: \"kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-kube-api-access-ng49l\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.918977 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 
09:03:52.935226 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.939063 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.939832 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.940417 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.940550 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.940988 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.941135 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.941413 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.942808 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.946696 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng49l\" (UniqueName: \"kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-kube-api-access-ng49l\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.946795 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.946927 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:52 crc kubenswrapper[4886]: I0314 09:03:52.950195 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:53 crc kubenswrapper[4886]: I0314 09:03:53.004981 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:03:53 crc kubenswrapper[4886]: I0314 09:03:53.565876 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm"] Mar 14 09:03:53 crc kubenswrapper[4886]: I0314 09:03:53.605891 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" event={"ID":"49b11e76-2a25-43aa-a8ff-a383088da9b5","Type":"ContainerStarted","Data":"2561035e4423be1cead90009e545251caca223685d466a38dd4229da9e30b8b4"} Mar 14 09:03:54 crc kubenswrapper[4886]: I0314 09:03:54.617315 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" event={"ID":"49b11e76-2a25-43aa-a8ff-a383088da9b5","Type":"ContainerStarted","Data":"983c45956ca91730166399390a9d91e304d523b5bdd4b4a155079958b70c1c29"} Mar 14 09:03:54 crc kubenswrapper[4886]: I0314 09:03:54.640501 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" podStartSLOduration=2.152971951 podStartE2EDuration="2.640483745s" podCreationTimestamp="2026-03-14 09:03:52 +0000 UTC" firstStartedPulling="2026-03-14 09:03:53.572046521 +0000 UTC m=+2168.820498148" lastFinishedPulling="2026-03-14 09:03:54.059558305 +0000 UTC m=+2169.308009942" observedRunningTime="2026-03-14 09:03:54.635875496 +0000 UTC m=+2169.884327133" watchObservedRunningTime="2026-03-14 09:03:54.640483745 +0000 UTC m=+2169.888935382" Mar 14 09:04:00 crc kubenswrapper[4886]: I0314 09:04:00.132905 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557984-crnpd"] Mar 14 09:04:00 
crc kubenswrapper[4886]: I0314 09:04:00.135032 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557984-crnpd" Mar 14 09:04:00 crc kubenswrapper[4886]: I0314 09:04:00.138564 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:04:00 crc kubenswrapper[4886]: I0314 09:04:00.139679 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:04:00 crc kubenswrapper[4886]: I0314 09:04:00.144617 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557984-crnpd"] Mar 14 09:04:00 crc kubenswrapper[4886]: I0314 09:04:00.148525 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 09:04:00 crc kubenswrapper[4886]: I0314 09:04:00.265214 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4txsz\" (UniqueName: \"kubernetes.io/projected/9c36cb28-29b5-4c22-8021-6bc4148c11a8-kube-api-access-4txsz\") pod \"auto-csr-approver-29557984-crnpd\" (UID: \"9c36cb28-29b5-4c22-8021-6bc4148c11a8\") " pod="openshift-infra/auto-csr-approver-29557984-crnpd" Mar 14 09:04:00 crc kubenswrapper[4886]: I0314 09:04:00.367467 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4txsz\" (UniqueName: \"kubernetes.io/projected/9c36cb28-29b5-4c22-8021-6bc4148c11a8-kube-api-access-4txsz\") pod \"auto-csr-approver-29557984-crnpd\" (UID: \"9c36cb28-29b5-4c22-8021-6bc4148c11a8\") " pod="openshift-infra/auto-csr-approver-29557984-crnpd" Mar 14 09:04:00 crc kubenswrapper[4886]: I0314 09:04:00.400518 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4txsz\" (UniqueName: \"kubernetes.io/projected/9c36cb28-29b5-4c22-8021-6bc4148c11a8-kube-api-access-4txsz\") 
pod \"auto-csr-approver-29557984-crnpd\" (UID: \"9c36cb28-29b5-4c22-8021-6bc4148c11a8\") " pod="openshift-infra/auto-csr-approver-29557984-crnpd" Mar 14 09:04:00 crc kubenswrapper[4886]: I0314 09:04:00.460492 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557984-crnpd" Mar 14 09:04:00 crc kubenswrapper[4886]: I0314 09:04:00.949782 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557984-crnpd"] Mar 14 09:04:01 crc kubenswrapper[4886]: I0314 09:04:01.696074 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557984-crnpd" event={"ID":"9c36cb28-29b5-4c22-8021-6bc4148c11a8","Type":"ContainerStarted","Data":"0463e58501327d543b6522f0e62eb151cdfc2c0efe998d6d6e35ee87fc3b6f05"} Mar 14 09:04:02 crc kubenswrapper[4886]: E0314 09:04:02.542034 4886 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c36cb28_29b5_4c22_8021_6bc4148c11a8.slice/crio-conmon-a6f0e5ba8d45d865cf87859862dfbece3b5bbbf921378f3c69d199164f500103.scope\": RecentStats: unable to find data in memory cache]" Mar 14 09:04:02 crc kubenswrapper[4886]: I0314 09:04:02.709847 4886 generic.go:334] "Generic (PLEG): container finished" podID="9c36cb28-29b5-4c22-8021-6bc4148c11a8" containerID="a6f0e5ba8d45d865cf87859862dfbece3b5bbbf921378f3c69d199164f500103" exitCode=0 Mar 14 09:04:02 crc kubenswrapper[4886]: I0314 09:04:02.710103 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557984-crnpd" event={"ID":"9c36cb28-29b5-4c22-8021-6bc4148c11a8","Type":"ContainerDied","Data":"a6f0e5ba8d45d865cf87859862dfbece3b5bbbf921378f3c69d199164f500103"} Mar 14 09:04:04 crc kubenswrapper[4886]: I0314 09:04:04.072845 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557984-crnpd" Mar 14 09:04:04 crc kubenswrapper[4886]: I0314 09:04:04.161272 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4txsz\" (UniqueName: \"kubernetes.io/projected/9c36cb28-29b5-4c22-8021-6bc4148c11a8-kube-api-access-4txsz\") pod \"9c36cb28-29b5-4c22-8021-6bc4148c11a8\" (UID: \"9c36cb28-29b5-4c22-8021-6bc4148c11a8\") " Mar 14 09:04:04 crc kubenswrapper[4886]: I0314 09:04:04.169905 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c36cb28-29b5-4c22-8021-6bc4148c11a8-kube-api-access-4txsz" (OuterVolumeSpecName: "kube-api-access-4txsz") pod "9c36cb28-29b5-4c22-8021-6bc4148c11a8" (UID: "9c36cb28-29b5-4c22-8021-6bc4148c11a8"). InnerVolumeSpecName "kube-api-access-4txsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:04:04 crc kubenswrapper[4886]: I0314 09:04:04.264981 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4txsz\" (UniqueName: \"kubernetes.io/projected/9c36cb28-29b5-4c22-8021-6bc4148c11a8-kube-api-access-4txsz\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:04 crc kubenswrapper[4886]: I0314 09:04:04.729357 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557984-crnpd" event={"ID":"9c36cb28-29b5-4c22-8021-6bc4148c11a8","Type":"ContainerDied","Data":"0463e58501327d543b6522f0e62eb151cdfc2c0efe998d6d6e35ee87fc3b6f05"} Mar 14 09:04:04 crc kubenswrapper[4886]: I0314 09:04:04.729400 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0463e58501327d543b6522f0e62eb151cdfc2c0efe998d6d6e35ee87fc3b6f05" Mar 14 09:04:04 crc kubenswrapper[4886]: I0314 09:04:04.729448 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557984-crnpd" Mar 14 09:04:05 crc kubenswrapper[4886]: I0314 09:04:05.149971 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557978-7sb9n"] Mar 14 09:04:05 crc kubenswrapper[4886]: I0314 09:04:05.158513 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557978-7sb9n"] Mar 14 09:04:05 crc kubenswrapper[4886]: I0314 09:04:05.433155 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3c73e07-5b8d-47b4-8c11-2f37825df8fe" path="/var/lib/kubelet/pods/d3c73e07-5b8d-47b4-8c11-2f37825df8fe/volumes" Mar 14 09:04:26 crc kubenswrapper[4886]: I0314 09:04:26.066757 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:04:26 crc kubenswrapper[4886]: I0314 09:04:26.067895 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:04:34 crc kubenswrapper[4886]: I0314 09:04:34.598930 4886 scope.go:117] "RemoveContainer" containerID="8364b1b519be961628d3418d52599a0032331ed0b646fd02bd6b8b299c30b401" Mar 14 09:04:38 crc kubenswrapper[4886]: I0314 09:04:38.047265 4886 generic.go:334] "Generic (PLEG): container finished" podID="49b11e76-2a25-43aa-a8ff-a383088da9b5" containerID="983c45956ca91730166399390a9d91e304d523b5bdd4b4a155079958b70c1c29" exitCode=0 Mar 14 09:04:38 crc kubenswrapper[4886]: I0314 09:04:38.047336 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" event={"ID":"49b11e76-2a25-43aa-a8ff-a383088da9b5","Type":"ContainerDied","Data":"983c45956ca91730166399390a9d91e304d523b5bdd4b4a155079958b70c1c29"} Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.546951 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.635335 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-ovn-combined-ca-bundle\") pod \"49b11e76-2a25-43aa-a8ff-a383088da9b5\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.635759 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-inventory\") pod \"49b11e76-2a25-43aa-a8ff-a383088da9b5\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.635807 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"49b11e76-2a25-43aa-a8ff-a383088da9b5\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.635863 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"49b11e76-2a25-43aa-a8ff-a383088da9b5\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " Mar 14 
09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.635927 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-libvirt-combined-ca-bundle\") pod \"49b11e76-2a25-43aa-a8ff-a383088da9b5\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.636457 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"49b11e76-2a25-43aa-a8ff-a383088da9b5\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.636621 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-nova-combined-ca-bundle\") pod \"49b11e76-2a25-43aa-a8ff-a383088da9b5\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.636647 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-telemetry-combined-ca-bundle\") pod \"49b11e76-2a25-43aa-a8ff-a383088da9b5\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.636693 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-repo-setup-combined-ca-bundle\") pod \"49b11e76-2a25-43aa-a8ff-a383088da9b5\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.636722 4886 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-ssh-key-openstack-edpm-ipam\") pod \"49b11e76-2a25-43aa-a8ff-a383088da9b5\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.636760 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng49l\" (UniqueName: \"kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-kube-api-access-ng49l\") pod \"49b11e76-2a25-43aa-a8ff-a383088da9b5\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.636782 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-bootstrap-combined-ca-bundle\") pod \"49b11e76-2a25-43aa-a8ff-a383088da9b5\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.636801 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"49b11e76-2a25-43aa-a8ff-a383088da9b5\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.636853 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-neutron-metadata-combined-ca-bundle\") pod \"49b11e76-2a25-43aa-a8ff-a383088da9b5\" (UID: \"49b11e76-2a25-43aa-a8ff-a383088da9b5\") " Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.643367 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "49b11e76-2a25-43aa-a8ff-a383088da9b5" (UID: "49b11e76-2a25-43aa-a8ff-a383088da9b5"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.644371 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "49b11e76-2a25-43aa-a8ff-a383088da9b5" (UID: "49b11e76-2a25-43aa-a8ff-a383088da9b5"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.644797 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "49b11e76-2a25-43aa-a8ff-a383088da9b5" (UID: "49b11e76-2a25-43aa-a8ff-a383088da9b5"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.645457 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-kube-api-access-ng49l" (OuterVolumeSpecName: "kube-api-access-ng49l") pod "49b11e76-2a25-43aa-a8ff-a383088da9b5" (UID: "49b11e76-2a25-43aa-a8ff-a383088da9b5"). InnerVolumeSpecName "kube-api-access-ng49l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.645494 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "49b11e76-2a25-43aa-a8ff-a383088da9b5" (UID: "49b11e76-2a25-43aa-a8ff-a383088da9b5"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.645948 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "49b11e76-2a25-43aa-a8ff-a383088da9b5" (UID: "49b11e76-2a25-43aa-a8ff-a383088da9b5"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.647927 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "49b11e76-2a25-43aa-a8ff-a383088da9b5" (UID: "49b11e76-2a25-43aa-a8ff-a383088da9b5"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.648398 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "49b11e76-2a25-43aa-a8ff-a383088da9b5" (UID: "49b11e76-2a25-43aa-a8ff-a383088da9b5"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.648776 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "49b11e76-2a25-43aa-a8ff-a383088da9b5" (UID: "49b11e76-2a25-43aa-a8ff-a383088da9b5"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.649666 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "49b11e76-2a25-43aa-a8ff-a383088da9b5" (UID: "49b11e76-2a25-43aa-a8ff-a383088da9b5"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.650598 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "49b11e76-2a25-43aa-a8ff-a383088da9b5" (UID: "49b11e76-2a25-43aa-a8ff-a383088da9b5"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.656291 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "49b11e76-2a25-43aa-a8ff-a383088da9b5" (UID: "49b11e76-2a25-43aa-a8ff-a383088da9b5"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.675000 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-inventory" (OuterVolumeSpecName: "inventory") pod "49b11e76-2a25-43aa-a8ff-a383088da9b5" (UID: "49b11e76-2a25-43aa-a8ff-a383088da9b5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.685627 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "49b11e76-2a25-43aa-a8ff-a383088da9b5" (UID: "49b11e76-2a25-43aa-a8ff-a383088da9b5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.739067 4886 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.739100 4886 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.739111 4886 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.739143 4886 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.739155 4886 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.739168 4886 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.739177 4886 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.739185 4886 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.739194 4886 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.739203 4886 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:39 crc 
kubenswrapper[4886]: I0314 09:04:39.739211 4886 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.739220 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng49l\" (UniqueName: \"kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-kube-api-access-ng49l\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.739227 4886 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b11e76-2a25-43aa-a8ff-a383088da9b5-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:39 crc kubenswrapper[4886]: I0314 09:04:39.739236 4886 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/49b11e76-2a25-43aa-a8ff-a383088da9b5-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.073741 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" event={"ID":"49b11e76-2a25-43aa-a8ff-a383088da9b5","Type":"ContainerDied","Data":"2561035e4423be1cead90009e545251caca223685d466a38dd4229da9e30b8b4"} Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.073805 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2561035e4423be1cead90009e545251caca223685d466a38dd4229da9e30b8b4" Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.073818 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm" Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.233629 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-wklpz"] Mar 14 09:04:40 crc kubenswrapper[4886]: E0314 09:04:40.234168 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c36cb28-29b5-4c22-8021-6bc4148c11a8" containerName="oc" Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.234187 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c36cb28-29b5-4c22-8021-6bc4148c11a8" containerName="oc" Mar 14 09:04:40 crc kubenswrapper[4886]: E0314 09:04:40.234204 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49b11e76-2a25-43aa-a8ff-a383088da9b5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.234212 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="49b11e76-2a25-43aa-a8ff-a383088da9b5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.234627 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="49b11e76-2a25-43aa-a8ff-a383088da9b5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.234649 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c36cb28-29b5-4c22-8021-6bc4148c11a8" containerName="oc" Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.235419 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wklpz" Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.246666 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.246741 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.247303 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.247429 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.247883 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftkvj" Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.250689 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-wklpz"] Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.253785 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wklpz\" (UID: \"f5062e4f-08e6-4fb3-b5f5-9938dd8633e8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wklpz" Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.253871 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wklpz\" (UID: \"f5062e4f-08e6-4fb3-b5f5-9938dd8633e8\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wklpz" Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.254097 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wklpz\" (UID: \"f5062e4f-08e6-4fb3-b5f5-9938dd8633e8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wklpz" Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.254178 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wklpz\" (UID: \"f5062e4f-08e6-4fb3-b5f5-9938dd8633e8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wklpz" Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.254223 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpfkg\" (UniqueName: \"kubernetes.io/projected/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-kube-api-access-kpfkg\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wklpz\" (UID: \"f5062e4f-08e6-4fb3-b5f5-9938dd8633e8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wklpz" Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.355507 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wklpz\" (UID: \"f5062e4f-08e6-4fb3-b5f5-9938dd8633e8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wklpz" Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.355568 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wklpz\" (UID: \"f5062e4f-08e6-4fb3-b5f5-9938dd8633e8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wklpz" Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.355667 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wklpz\" (UID: \"f5062e4f-08e6-4fb3-b5f5-9938dd8633e8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wklpz" Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.355686 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wklpz\" (UID: \"f5062e4f-08e6-4fb3-b5f5-9938dd8633e8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wklpz" Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.355705 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpfkg\" (UniqueName: \"kubernetes.io/projected/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-kube-api-access-kpfkg\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wklpz\" (UID: \"f5062e4f-08e6-4fb3-b5f5-9938dd8633e8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wklpz" Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.357094 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wklpz\" (UID: \"f5062e4f-08e6-4fb3-b5f5-9938dd8633e8\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wklpz" Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.360723 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wklpz\" (UID: \"f5062e4f-08e6-4fb3-b5f5-9938dd8633e8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wklpz" Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.362767 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wklpz\" (UID: \"f5062e4f-08e6-4fb3-b5f5-9938dd8633e8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wklpz" Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.365059 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wklpz\" (UID: \"f5062e4f-08e6-4fb3-b5f5-9938dd8633e8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wklpz" Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.372514 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpfkg\" (UniqueName: \"kubernetes.io/projected/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-kube-api-access-kpfkg\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wklpz\" (UID: \"f5062e4f-08e6-4fb3-b5f5-9938dd8633e8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wklpz" Mar 14 09:04:40 crc kubenswrapper[4886]: I0314 09:04:40.566730 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wklpz" Mar 14 09:04:41 crc kubenswrapper[4886]: I0314 09:04:41.208332 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-wklpz"] Mar 14 09:04:42 crc kubenswrapper[4886]: I0314 09:04:42.103250 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wklpz" event={"ID":"f5062e4f-08e6-4fb3-b5f5-9938dd8633e8","Type":"ContainerStarted","Data":"200911041e7a42fdcb81bde228a19a09feb9bb2039126f4a2bcf91963b469291"} Mar 14 09:04:43 crc kubenswrapper[4886]: I0314 09:04:43.119289 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wklpz" event={"ID":"f5062e4f-08e6-4fb3-b5f5-9938dd8633e8","Type":"ContainerStarted","Data":"ff781b4f0e8dce5af6fa83a7f3d5b7528f4860a2a2331704d6010abd6553abd9"} Mar 14 09:04:43 crc kubenswrapper[4886]: I0314 09:04:43.141660 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wklpz" podStartSLOduration=2.370158524 podStartE2EDuration="3.141637845s" podCreationTimestamp="2026-03-14 09:04:40 +0000 UTC" firstStartedPulling="2026-03-14 09:04:41.212910851 +0000 UTC m=+2216.461362498" lastFinishedPulling="2026-03-14 09:04:41.984390182 +0000 UTC m=+2217.232841819" observedRunningTime="2026-03-14 09:04:43.133027183 +0000 UTC m=+2218.381478820" watchObservedRunningTime="2026-03-14 09:04:43.141637845 +0000 UTC m=+2218.390089482" Mar 14 09:04:45 crc kubenswrapper[4886]: I0314 09:04:45.150335 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bb4sp"] Mar 14 09:04:45 crc kubenswrapper[4886]: I0314 09:04:45.153154 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bb4sp" Mar 14 09:04:45 crc kubenswrapper[4886]: I0314 09:04:45.159689 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bb4sp"] Mar 14 09:04:45 crc kubenswrapper[4886]: I0314 09:04:45.184549 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25f336d9-f8ca-4a0a-ab3f-a23e30c94590-catalog-content\") pod \"certified-operators-bb4sp\" (UID: \"25f336d9-f8ca-4a0a-ab3f-a23e30c94590\") " pod="openshift-marketplace/certified-operators-bb4sp" Mar 14 09:04:45 crc kubenswrapper[4886]: I0314 09:04:45.184732 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk2d6\" (UniqueName: \"kubernetes.io/projected/25f336d9-f8ca-4a0a-ab3f-a23e30c94590-kube-api-access-vk2d6\") pod \"certified-operators-bb4sp\" (UID: \"25f336d9-f8ca-4a0a-ab3f-a23e30c94590\") " pod="openshift-marketplace/certified-operators-bb4sp" Mar 14 09:04:45 crc kubenswrapper[4886]: I0314 09:04:45.184829 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25f336d9-f8ca-4a0a-ab3f-a23e30c94590-utilities\") pod \"certified-operators-bb4sp\" (UID: \"25f336d9-f8ca-4a0a-ab3f-a23e30c94590\") " pod="openshift-marketplace/certified-operators-bb4sp" Mar 14 09:04:45 crc kubenswrapper[4886]: I0314 09:04:45.287824 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk2d6\" (UniqueName: \"kubernetes.io/projected/25f336d9-f8ca-4a0a-ab3f-a23e30c94590-kube-api-access-vk2d6\") pod \"certified-operators-bb4sp\" (UID: \"25f336d9-f8ca-4a0a-ab3f-a23e30c94590\") " pod="openshift-marketplace/certified-operators-bb4sp" Mar 14 09:04:45 crc kubenswrapper[4886]: I0314 09:04:45.287980 4886 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25f336d9-f8ca-4a0a-ab3f-a23e30c94590-utilities\") pod \"certified-operators-bb4sp\" (UID: \"25f336d9-f8ca-4a0a-ab3f-a23e30c94590\") " pod="openshift-marketplace/certified-operators-bb4sp" Mar 14 09:04:45 crc kubenswrapper[4886]: I0314 09:04:45.288059 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25f336d9-f8ca-4a0a-ab3f-a23e30c94590-catalog-content\") pod \"certified-operators-bb4sp\" (UID: \"25f336d9-f8ca-4a0a-ab3f-a23e30c94590\") " pod="openshift-marketplace/certified-operators-bb4sp" Mar 14 09:04:45 crc kubenswrapper[4886]: I0314 09:04:45.288661 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25f336d9-f8ca-4a0a-ab3f-a23e30c94590-utilities\") pod \"certified-operators-bb4sp\" (UID: \"25f336d9-f8ca-4a0a-ab3f-a23e30c94590\") " pod="openshift-marketplace/certified-operators-bb4sp" Mar 14 09:04:45 crc kubenswrapper[4886]: I0314 09:04:45.288680 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25f336d9-f8ca-4a0a-ab3f-a23e30c94590-catalog-content\") pod \"certified-operators-bb4sp\" (UID: \"25f336d9-f8ca-4a0a-ab3f-a23e30c94590\") " pod="openshift-marketplace/certified-operators-bb4sp" Mar 14 09:04:45 crc kubenswrapper[4886]: I0314 09:04:45.309090 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk2d6\" (UniqueName: \"kubernetes.io/projected/25f336d9-f8ca-4a0a-ab3f-a23e30c94590-kube-api-access-vk2d6\") pod \"certified-operators-bb4sp\" (UID: \"25f336d9-f8ca-4a0a-ab3f-a23e30c94590\") " pod="openshift-marketplace/certified-operators-bb4sp" Mar 14 09:04:45 crc kubenswrapper[4886]: I0314 09:04:45.489991 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bb4sp" Mar 14 09:04:46 crc kubenswrapper[4886]: I0314 09:04:46.004364 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bb4sp"] Mar 14 09:04:46 crc kubenswrapper[4886]: W0314 09:04:46.010629 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25f336d9_f8ca_4a0a_ab3f_a23e30c94590.slice/crio-131cc4b88a33859e700b3a7588e1c706867123eea7f21d6b9959e7a34bbcae20 WatchSource:0}: Error finding container 131cc4b88a33859e700b3a7588e1c706867123eea7f21d6b9959e7a34bbcae20: Status 404 returned error can't find the container with id 131cc4b88a33859e700b3a7588e1c706867123eea7f21d6b9959e7a34bbcae20 Mar 14 09:04:46 crc kubenswrapper[4886]: I0314 09:04:46.149011 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bb4sp" event={"ID":"25f336d9-f8ca-4a0a-ab3f-a23e30c94590","Type":"ContainerStarted","Data":"131cc4b88a33859e700b3a7588e1c706867123eea7f21d6b9959e7a34bbcae20"} Mar 14 09:04:47 crc kubenswrapper[4886]: I0314 09:04:47.161577 4886 generic.go:334] "Generic (PLEG): container finished" podID="25f336d9-f8ca-4a0a-ab3f-a23e30c94590" containerID="5530bd9ecff6ae32316e09da4a9d327c8702796a12b4a255dde928eba6e49151" exitCode=0 Mar 14 09:04:47 crc kubenswrapper[4886]: I0314 09:04:47.161675 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bb4sp" event={"ID":"25f336d9-f8ca-4a0a-ab3f-a23e30c94590","Type":"ContainerDied","Data":"5530bd9ecff6ae32316e09da4a9d327c8702796a12b4a255dde928eba6e49151"} Mar 14 09:04:48 crc kubenswrapper[4886]: I0314 09:04:48.175809 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bb4sp" 
event={"ID":"25f336d9-f8ca-4a0a-ab3f-a23e30c94590","Type":"ContainerStarted","Data":"e1ee5d10c32b91458fcb8207afc0c040f0e07435c211987ed220eadee886e9a1"} Mar 14 09:04:49 crc kubenswrapper[4886]: I0314 09:04:49.198329 4886 generic.go:334] "Generic (PLEG): container finished" podID="25f336d9-f8ca-4a0a-ab3f-a23e30c94590" containerID="e1ee5d10c32b91458fcb8207afc0c040f0e07435c211987ed220eadee886e9a1" exitCode=0 Mar 14 09:04:49 crc kubenswrapper[4886]: I0314 09:04:49.200530 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bb4sp" event={"ID":"25f336d9-f8ca-4a0a-ab3f-a23e30c94590","Type":"ContainerDied","Data":"e1ee5d10c32b91458fcb8207afc0c040f0e07435c211987ed220eadee886e9a1"} Mar 14 09:04:50 crc kubenswrapper[4886]: I0314 09:04:50.211838 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bb4sp" event={"ID":"25f336d9-f8ca-4a0a-ab3f-a23e30c94590","Type":"ContainerStarted","Data":"0b63c37b87fa4633d19101cae625f152516487d4ad2b4e06266273cad37a752d"} Mar 14 09:04:50 crc kubenswrapper[4886]: I0314 09:04:50.245509 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bb4sp" podStartSLOduration=2.758115765 podStartE2EDuration="5.245491624s" podCreationTimestamp="2026-03-14 09:04:45 +0000 UTC" firstStartedPulling="2026-03-14 09:04:47.165072073 +0000 UTC m=+2222.413523720" lastFinishedPulling="2026-03-14 09:04:49.652447942 +0000 UTC m=+2224.900899579" observedRunningTime="2026-03-14 09:04:50.239317961 +0000 UTC m=+2225.487769618" watchObservedRunningTime="2026-03-14 09:04:50.245491624 +0000 UTC m=+2225.493943261" Mar 14 09:04:55 crc kubenswrapper[4886]: I0314 09:04:55.491009 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bb4sp" Mar 14 09:04:55 crc kubenswrapper[4886]: I0314 09:04:55.491548 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-bb4sp" Mar 14 09:04:55 crc kubenswrapper[4886]: I0314 09:04:55.543431 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bb4sp" Mar 14 09:04:56 crc kubenswrapper[4886]: I0314 09:04:56.066698 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:04:56 crc kubenswrapper[4886]: I0314 09:04:56.067095 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:04:56 crc kubenswrapper[4886]: I0314 09:04:56.308593 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bb4sp" Mar 14 09:04:56 crc kubenswrapper[4886]: I0314 09:04:56.357580 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bb4sp"] Mar 14 09:04:58 crc kubenswrapper[4886]: I0314 09:04:58.280038 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bb4sp" podUID="25f336d9-f8ca-4a0a-ab3f-a23e30c94590" containerName="registry-server" containerID="cri-o://0b63c37b87fa4633d19101cae625f152516487d4ad2b4e06266273cad37a752d" gracePeriod=2 Mar 14 09:04:58 crc kubenswrapper[4886]: I0314 09:04:58.755072 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bb4sp" Mar 14 09:04:58 crc kubenswrapper[4886]: I0314 09:04:58.822147 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25f336d9-f8ca-4a0a-ab3f-a23e30c94590-utilities\") pod \"25f336d9-f8ca-4a0a-ab3f-a23e30c94590\" (UID: \"25f336d9-f8ca-4a0a-ab3f-a23e30c94590\") " Mar 14 09:04:58 crc kubenswrapper[4886]: I0314 09:04:58.822242 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25f336d9-f8ca-4a0a-ab3f-a23e30c94590-catalog-content\") pod \"25f336d9-f8ca-4a0a-ab3f-a23e30c94590\" (UID: \"25f336d9-f8ca-4a0a-ab3f-a23e30c94590\") " Mar 14 09:04:58 crc kubenswrapper[4886]: I0314 09:04:58.822462 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk2d6\" (UniqueName: \"kubernetes.io/projected/25f336d9-f8ca-4a0a-ab3f-a23e30c94590-kube-api-access-vk2d6\") pod \"25f336d9-f8ca-4a0a-ab3f-a23e30c94590\" (UID: \"25f336d9-f8ca-4a0a-ab3f-a23e30c94590\") " Mar 14 09:04:58 crc kubenswrapper[4886]: I0314 09:04:58.825893 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25f336d9-f8ca-4a0a-ab3f-a23e30c94590-utilities" (OuterVolumeSpecName: "utilities") pod "25f336d9-f8ca-4a0a-ab3f-a23e30c94590" (UID: "25f336d9-f8ca-4a0a-ab3f-a23e30c94590"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:04:58 crc kubenswrapper[4886]: I0314 09:04:58.830570 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25f336d9-f8ca-4a0a-ab3f-a23e30c94590-kube-api-access-vk2d6" (OuterVolumeSpecName: "kube-api-access-vk2d6") pod "25f336d9-f8ca-4a0a-ab3f-a23e30c94590" (UID: "25f336d9-f8ca-4a0a-ab3f-a23e30c94590"). InnerVolumeSpecName "kube-api-access-vk2d6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:04:58 crc kubenswrapper[4886]: I0314 09:04:58.871162 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25f336d9-f8ca-4a0a-ab3f-a23e30c94590-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "25f336d9-f8ca-4a0a-ab3f-a23e30c94590" (UID: "25f336d9-f8ca-4a0a-ab3f-a23e30c94590"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:04:58 crc kubenswrapper[4886]: I0314 09:04:58.925458 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vk2d6\" (UniqueName: \"kubernetes.io/projected/25f336d9-f8ca-4a0a-ab3f-a23e30c94590-kube-api-access-vk2d6\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:58 crc kubenswrapper[4886]: I0314 09:04:58.925500 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25f336d9-f8ca-4a0a-ab3f-a23e30c94590-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:58 crc kubenswrapper[4886]: I0314 09:04:58.925513 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25f336d9-f8ca-4a0a-ab3f-a23e30c94590-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:04:59 crc kubenswrapper[4886]: I0314 09:04:59.294618 4886 generic.go:334] "Generic (PLEG): container finished" podID="25f336d9-f8ca-4a0a-ab3f-a23e30c94590" containerID="0b63c37b87fa4633d19101cae625f152516487d4ad2b4e06266273cad37a752d" exitCode=0 Mar 14 09:04:59 crc kubenswrapper[4886]: I0314 09:04:59.294689 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bb4sp" event={"ID":"25f336d9-f8ca-4a0a-ab3f-a23e30c94590","Type":"ContainerDied","Data":"0b63c37b87fa4633d19101cae625f152516487d4ad2b4e06266273cad37a752d"} Mar 14 09:04:59 crc kubenswrapper[4886]: I0314 09:04:59.294734 4886 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-bb4sp" event={"ID":"25f336d9-f8ca-4a0a-ab3f-a23e30c94590","Type":"ContainerDied","Data":"131cc4b88a33859e700b3a7588e1c706867123eea7f21d6b9959e7a34bbcae20"} Mar 14 09:04:59 crc kubenswrapper[4886]: I0314 09:04:59.294745 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bb4sp" Mar 14 09:04:59 crc kubenswrapper[4886]: I0314 09:04:59.294764 4886 scope.go:117] "RemoveContainer" containerID="0b63c37b87fa4633d19101cae625f152516487d4ad2b4e06266273cad37a752d" Mar 14 09:04:59 crc kubenswrapper[4886]: I0314 09:04:59.331463 4886 scope.go:117] "RemoveContainer" containerID="e1ee5d10c32b91458fcb8207afc0c040f0e07435c211987ed220eadee886e9a1" Mar 14 09:04:59 crc kubenswrapper[4886]: I0314 09:04:59.364456 4886 scope.go:117] "RemoveContainer" containerID="5530bd9ecff6ae32316e09da4a9d327c8702796a12b4a255dde928eba6e49151" Mar 14 09:04:59 crc kubenswrapper[4886]: I0314 09:04:59.371533 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bb4sp"] Mar 14 09:04:59 crc kubenswrapper[4886]: I0314 09:04:59.382790 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bb4sp"] Mar 14 09:04:59 crc kubenswrapper[4886]: I0314 09:04:59.415962 4886 scope.go:117] "RemoveContainer" containerID="0b63c37b87fa4633d19101cae625f152516487d4ad2b4e06266273cad37a752d" Mar 14 09:04:59 crc kubenswrapper[4886]: E0314 09:04:59.416515 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b63c37b87fa4633d19101cae625f152516487d4ad2b4e06266273cad37a752d\": container with ID starting with 0b63c37b87fa4633d19101cae625f152516487d4ad2b4e06266273cad37a752d not found: ID does not exist" containerID="0b63c37b87fa4633d19101cae625f152516487d4ad2b4e06266273cad37a752d" Mar 14 09:04:59 crc kubenswrapper[4886]: I0314 
09:04:59.416564 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b63c37b87fa4633d19101cae625f152516487d4ad2b4e06266273cad37a752d"} err="failed to get container status \"0b63c37b87fa4633d19101cae625f152516487d4ad2b4e06266273cad37a752d\": rpc error: code = NotFound desc = could not find container \"0b63c37b87fa4633d19101cae625f152516487d4ad2b4e06266273cad37a752d\": container with ID starting with 0b63c37b87fa4633d19101cae625f152516487d4ad2b4e06266273cad37a752d not found: ID does not exist" Mar 14 09:04:59 crc kubenswrapper[4886]: I0314 09:04:59.416593 4886 scope.go:117] "RemoveContainer" containerID="e1ee5d10c32b91458fcb8207afc0c040f0e07435c211987ed220eadee886e9a1" Mar 14 09:04:59 crc kubenswrapper[4886]: E0314 09:04:59.417006 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1ee5d10c32b91458fcb8207afc0c040f0e07435c211987ed220eadee886e9a1\": container with ID starting with e1ee5d10c32b91458fcb8207afc0c040f0e07435c211987ed220eadee886e9a1 not found: ID does not exist" containerID="e1ee5d10c32b91458fcb8207afc0c040f0e07435c211987ed220eadee886e9a1" Mar 14 09:04:59 crc kubenswrapper[4886]: I0314 09:04:59.417031 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1ee5d10c32b91458fcb8207afc0c040f0e07435c211987ed220eadee886e9a1"} err="failed to get container status \"e1ee5d10c32b91458fcb8207afc0c040f0e07435c211987ed220eadee886e9a1\": rpc error: code = NotFound desc = could not find container \"e1ee5d10c32b91458fcb8207afc0c040f0e07435c211987ed220eadee886e9a1\": container with ID starting with e1ee5d10c32b91458fcb8207afc0c040f0e07435c211987ed220eadee886e9a1 not found: ID does not exist" Mar 14 09:04:59 crc kubenswrapper[4886]: I0314 09:04:59.417051 4886 scope.go:117] "RemoveContainer" containerID="5530bd9ecff6ae32316e09da4a9d327c8702796a12b4a255dde928eba6e49151" Mar 14 09:04:59 crc 
kubenswrapper[4886]: E0314 09:04:59.417298 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5530bd9ecff6ae32316e09da4a9d327c8702796a12b4a255dde928eba6e49151\": container with ID starting with 5530bd9ecff6ae32316e09da4a9d327c8702796a12b4a255dde928eba6e49151 not found: ID does not exist" containerID="5530bd9ecff6ae32316e09da4a9d327c8702796a12b4a255dde928eba6e49151" Mar 14 09:04:59 crc kubenswrapper[4886]: I0314 09:04:59.417325 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5530bd9ecff6ae32316e09da4a9d327c8702796a12b4a255dde928eba6e49151"} err="failed to get container status \"5530bd9ecff6ae32316e09da4a9d327c8702796a12b4a255dde928eba6e49151\": rpc error: code = NotFound desc = could not find container \"5530bd9ecff6ae32316e09da4a9d327c8702796a12b4a255dde928eba6e49151\": container with ID starting with 5530bd9ecff6ae32316e09da4a9d327c8702796a12b4a255dde928eba6e49151 not found: ID does not exist" Mar 14 09:04:59 crc kubenswrapper[4886]: I0314 09:04:59.438684 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25f336d9-f8ca-4a0a-ab3f-a23e30c94590" path="/var/lib/kubelet/pods/25f336d9-f8ca-4a0a-ab3f-a23e30c94590/volumes" Mar 14 09:05:26 crc kubenswrapper[4886]: I0314 09:05:26.066148 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:05:26 crc kubenswrapper[4886]: I0314 09:05:26.066794 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 14 09:05:26 crc kubenswrapper[4886]: I0314 09:05:26.066858 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 09:05:26 crc kubenswrapper[4886]: I0314 09:05:26.067688 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551"} pod="openshift-machine-config-operator/machine-config-daemon-ddctv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 09:05:26 crc kubenswrapper[4886]: I0314 09:05:26.067760 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" containerID="cri-o://c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551" gracePeriod=600 Mar 14 09:05:26 crc kubenswrapper[4886]: E0314 09:05:26.199913 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:05:26 crc kubenswrapper[4886]: I0314 09:05:26.564322 4886 generic.go:334] "Generic (PLEG): container finished" podID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerID="c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551" exitCode=0 Mar 14 09:05:26 crc kubenswrapper[4886]: I0314 09:05:26.564382 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerDied","Data":"c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551"} Mar 14 09:05:26 crc kubenswrapper[4886]: I0314 09:05:26.564429 4886 scope.go:117] "RemoveContainer" containerID="8d76383c767d4d7cb6a175fd55c92a6a7210c3d14dfdddef7efb953dd6c3ec5b" Mar 14 09:05:26 crc kubenswrapper[4886]: I0314 09:05:26.565100 4886 scope.go:117] "RemoveContainer" containerID="c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551" Mar 14 09:05:26 crc kubenswrapper[4886]: E0314 09:05:26.565454 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:05:38 crc kubenswrapper[4886]: I0314 09:05:38.421502 4886 scope.go:117] "RemoveContainer" containerID="c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551" Mar 14 09:05:38 crc kubenswrapper[4886]: E0314 09:05:38.422224 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:05:49 crc kubenswrapper[4886]: I0314 09:05:49.787967 4886 generic.go:334] "Generic (PLEG): container finished" podID="f5062e4f-08e6-4fb3-b5f5-9938dd8633e8" containerID="ff781b4f0e8dce5af6fa83a7f3d5b7528f4860a2a2331704d6010abd6553abd9" exitCode=0 Mar 
14 09:05:49 crc kubenswrapper[4886]: I0314 09:05:49.788211 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wklpz" event={"ID":"f5062e4f-08e6-4fb3-b5f5-9938dd8633e8","Type":"ContainerDied","Data":"ff781b4f0e8dce5af6fa83a7f3d5b7528f4860a2a2331704d6010abd6553abd9"} Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.281781 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wklpz" Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.343950 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-ovncontroller-config-0\") pod \"f5062e4f-08e6-4fb3-b5f5-9938dd8633e8\" (UID: \"f5062e4f-08e6-4fb3-b5f5-9938dd8633e8\") " Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.344006 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpfkg\" (UniqueName: \"kubernetes.io/projected/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-kube-api-access-kpfkg\") pod \"f5062e4f-08e6-4fb3-b5f5-9938dd8633e8\" (UID: \"f5062e4f-08e6-4fb3-b5f5-9938dd8633e8\") " Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.344044 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-ssh-key-openstack-edpm-ipam\") pod \"f5062e4f-08e6-4fb3-b5f5-9938dd8633e8\" (UID: \"f5062e4f-08e6-4fb3-b5f5-9938dd8633e8\") " Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.344385 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-ovn-combined-ca-bundle\") pod \"f5062e4f-08e6-4fb3-b5f5-9938dd8633e8\" (UID: 
\"f5062e4f-08e6-4fb3-b5f5-9938dd8633e8\") " Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.345039 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-inventory\") pod \"f5062e4f-08e6-4fb3-b5f5-9938dd8633e8\" (UID: \"f5062e4f-08e6-4fb3-b5f5-9938dd8633e8\") " Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.349911 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "f5062e4f-08e6-4fb3-b5f5-9938dd8633e8" (UID: "f5062e4f-08e6-4fb3-b5f5-9938dd8633e8"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.349959 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-kube-api-access-kpfkg" (OuterVolumeSpecName: "kube-api-access-kpfkg") pod "f5062e4f-08e6-4fb3-b5f5-9938dd8633e8" (UID: "f5062e4f-08e6-4fb3-b5f5-9938dd8633e8"). InnerVolumeSpecName "kube-api-access-kpfkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.370940 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "f5062e4f-08e6-4fb3-b5f5-9938dd8633e8" (UID: "f5062e4f-08e6-4fb3-b5f5-9938dd8633e8"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.374467 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f5062e4f-08e6-4fb3-b5f5-9938dd8633e8" (UID: "f5062e4f-08e6-4fb3-b5f5-9938dd8633e8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.374580 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-inventory" (OuterVolumeSpecName: "inventory") pod "f5062e4f-08e6-4fb3-b5f5-9938dd8633e8" (UID: "f5062e4f-08e6-4fb3-b5f5-9938dd8633e8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.421499 4886 scope.go:117] "RemoveContainer" containerID="c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551" Mar 14 09:05:51 crc kubenswrapper[4886]: E0314 09:05:51.422096 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.447239 4886 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.447410 4886 reconciler_common.go:293] "Volume detached for 
volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.447423 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpfkg\" (UniqueName: \"kubernetes.io/projected/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-kube-api-access-kpfkg\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.447431 4886 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.447440 4886 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5062e4f-08e6-4fb3-b5f5-9938dd8633e8-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.815595 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wklpz" event={"ID":"f5062e4f-08e6-4fb3-b5f5-9938dd8633e8","Type":"ContainerDied","Data":"200911041e7a42fdcb81bde228a19a09feb9bb2039126f4a2bcf91963b469291"} Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.815647 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="200911041e7a42fdcb81bde228a19a09feb9bb2039126f4a2bcf91963b469291" Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.815724 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wklpz" Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.914887 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx"] Mar 14 09:05:51 crc kubenswrapper[4886]: E0314 09:05:51.915764 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25f336d9-f8ca-4a0a-ab3f-a23e30c94590" containerName="extract-utilities" Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.915785 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="25f336d9-f8ca-4a0a-ab3f-a23e30c94590" containerName="extract-utilities" Mar 14 09:05:51 crc kubenswrapper[4886]: E0314 09:05:51.915821 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5062e4f-08e6-4fb3-b5f5-9938dd8633e8" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.915832 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5062e4f-08e6-4fb3-b5f5-9938dd8633e8" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 14 09:05:51 crc kubenswrapper[4886]: E0314 09:05:51.915880 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25f336d9-f8ca-4a0a-ab3f-a23e30c94590" containerName="extract-content" Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.915891 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="25f336d9-f8ca-4a0a-ab3f-a23e30c94590" containerName="extract-content" Mar 14 09:05:51 crc kubenswrapper[4886]: E0314 09:05:51.915911 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25f336d9-f8ca-4a0a-ab3f-a23e30c94590" containerName="registry-server" Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.915921 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="25f336d9-f8ca-4a0a-ab3f-a23e30c94590" containerName="registry-server" Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.916189 4886 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f5062e4f-08e6-4fb3-b5f5-9938dd8633e8" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.916203 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="25f336d9-f8ca-4a0a-ab3f-a23e30c94590" containerName="registry-server" Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.917065 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx" Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.919601 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.919902 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.920138 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.920299 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.920446 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.920707 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftkvj" Mar 14 09:05:51 crc kubenswrapper[4886]: I0314 09:05:51.928363 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx"] Mar 14 09:05:52 crc kubenswrapper[4886]: I0314 09:05:52.060056 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx\" (UID: \"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx" Mar 14 09:05:52 crc kubenswrapper[4886]: I0314 09:05:52.060127 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7g79\" (UniqueName: \"kubernetes.io/projected/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-kube-api-access-w7g79\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx\" (UID: \"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx" Mar 14 09:05:52 crc kubenswrapper[4886]: I0314 09:05:52.060159 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx\" (UID: \"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx" Mar 14 09:05:52 crc kubenswrapper[4886]: I0314 09:05:52.060241 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx\" (UID: \"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx" Mar 14 09:05:52 crc kubenswrapper[4886]: I0314 09:05:52.060283 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx\" (UID: \"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx" Mar 14 09:05:52 crc kubenswrapper[4886]: I0314 09:05:52.060348 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx\" (UID: \"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx" Mar 14 09:05:52 crc kubenswrapper[4886]: I0314 09:05:52.162273 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx\" (UID: \"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx" Mar 14 09:05:52 crc kubenswrapper[4886]: I0314 09:05:52.162378 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7g79\" (UniqueName: \"kubernetes.io/projected/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-kube-api-access-w7g79\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx\" (UID: \"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx" Mar 14 09:05:52 crc kubenswrapper[4886]: I0314 09:05:52.162433 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx\" (UID: \"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx" Mar 14 09:05:52 crc kubenswrapper[4886]: I0314 09:05:52.162484 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx\" (UID: \"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx" Mar 14 09:05:52 crc kubenswrapper[4886]: I0314 09:05:52.162508 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx\" (UID: \"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx" Mar 14 09:05:52 crc kubenswrapper[4886]: I0314 09:05:52.162551 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx\" (UID: \"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx" Mar 14 09:05:52 crc kubenswrapper[4886]: I0314 09:05:52.166405 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-inventory\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx\" (UID: \"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx" Mar 14 09:05:52 crc kubenswrapper[4886]: I0314 09:05:52.166909 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx\" (UID: \"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx" Mar 14 09:05:52 crc kubenswrapper[4886]: I0314 09:05:52.167847 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx\" (UID: \"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx" Mar 14 09:05:52 crc kubenswrapper[4886]: I0314 09:05:52.168048 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx\" (UID: \"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx" Mar 14 09:05:52 crc kubenswrapper[4886]: I0314 09:05:52.180661 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx\" (UID: 
\"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx" Mar 14 09:05:52 crc kubenswrapper[4886]: I0314 09:05:52.183650 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7g79\" (UniqueName: \"kubernetes.io/projected/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-kube-api-access-w7g79\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx\" (UID: \"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx" Mar 14 09:05:52 crc kubenswrapper[4886]: I0314 09:05:52.240635 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx" Mar 14 09:05:52 crc kubenswrapper[4886]: I0314 09:05:52.771840 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx"] Mar 14 09:05:52 crc kubenswrapper[4886]: W0314 09:05:52.773412 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca30d3c2_97e8_4ade_b4c8_b737a405c62f.slice/crio-ad9c5688410fc71c1ca91126fe537e2b63d13012639dff60b5c8fdcb001a5798 WatchSource:0}: Error finding container ad9c5688410fc71c1ca91126fe537e2b63d13012639dff60b5c8fdcb001a5798: Status 404 returned error can't find the container with id ad9c5688410fc71c1ca91126fe537e2b63d13012639dff60b5c8fdcb001a5798 Mar 14 09:05:52 crc kubenswrapper[4886]: I0314 09:05:52.828641 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx" event={"ID":"ca30d3c2-97e8-4ade-b4c8-b737a405c62f","Type":"ContainerStarted","Data":"ad9c5688410fc71c1ca91126fe537e2b63d13012639dff60b5c8fdcb001a5798"} Mar 14 09:05:53 crc kubenswrapper[4886]: I0314 09:05:53.841322 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx" event={"ID":"ca30d3c2-97e8-4ade-b4c8-b737a405c62f","Type":"ContainerStarted","Data":"5f8f9db4d440089081b61d0d6d009f7cdcb3a43e4dc97b098a5ba3a8d470e6b0"} Mar 14 09:05:53 crc kubenswrapper[4886]: I0314 09:05:53.868646 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx" podStartSLOduration=2.37621227 podStartE2EDuration="2.868621831s" podCreationTimestamp="2026-03-14 09:05:51 +0000 UTC" firstStartedPulling="2026-03-14 09:05:52.777698203 +0000 UTC m=+2288.026149880" lastFinishedPulling="2026-03-14 09:05:53.270107784 +0000 UTC m=+2288.518559441" observedRunningTime="2026-03-14 09:05:53.862477268 +0000 UTC m=+2289.110928915" watchObservedRunningTime="2026-03-14 09:05:53.868621831 +0000 UTC m=+2289.117073478" Mar 14 09:06:00 crc kubenswrapper[4886]: I0314 09:06:00.138825 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557986-6l2sx"] Mar 14 09:06:00 crc kubenswrapper[4886]: I0314 09:06:00.141693 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557986-6l2sx" Mar 14 09:06:00 crc kubenswrapper[4886]: I0314 09:06:00.145374 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:06:00 crc kubenswrapper[4886]: I0314 09:06:00.145624 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:06:00 crc kubenswrapper[4886]: I0314 09:06:00.145742 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 09:06:00 crc kubenswrapper[4886]: I0314 09:06:00.148471 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557986-6l2sx"] Mar 14 09:06:00 crc kubenswrapper[4886]: I0314 09:06:00.268745 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpznc\" (UniqueName: \"kubernetes.io/projected/f10a97b1-2c18-4685-b178-d7b8dc0495b5-kube-api-access-xpznc\") pod \"auto-csr-approver-29557986-6l2sx\" (UID: \"f10a97b1-2c18-4685-b178-d7b8dc0495b5\") " pod="openshift-infra/auto-csr-approver-29557986-6l2sx" Mar 14 09:06:00 crc kubenswrapper[4886]: I0314 09:06:00.371173 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpznc\" (UniqueName: \"kubernetes.io/projected/f10a97b1-2c18-4685-b178-d7b8dc0495b5-kube-api-access-xpznc\") pod \"auto-csr-approver-29557986-6l2sx\" (UID: \"f10a97b1-2c18-4685-b178-d7b8dc0495b5\") " pod="openshift-infra/auto-csr-approver-29557986-6l2sx" Mar 14 09:06:00 crc kubenswrapper[4886]: I0314 09:06:00.395360 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpznc\" (UniqueName: \"kubernetes.io/projected/f10a97b1-2c18-4685-b178-d7b8dc0495b5-kube-api-access-xpznc\") pod \"auto-csr-approver-29557986-6l2sx\" (UID: \"f10a97b1-2c18-4685-b178-d7b8dc0495b5\") " 
pod="openshift-infra/auto-csr-approver-29557986-6l2sx" Mar 14 09:06:00 crc kubenswrapper[4886]: I0314 09:06:00.468644 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557986-6l2sx" Mar 14 09:06:00 crc kubenswrapper[4886]: I0314 09:06:00.949626 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557986-6l2sx"] Mar 14 09:06:01 crc kubenswrapper[4886]: I0314 09:06:01.925111 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557986-6l2sx" event={"ID":"f10a97b1-2c18-4685-b178-d7b8dc0495b5","Type":"ContainerStarted","Data":"461cec9e886c9133897106d0ef1c5db965a2cfa2f8b8115b0035534058a2191a"} Mar 14 09:06:02 crc kubenswrapper[4886]: I0314 09:06:02.934757 4886 generic.go:334] "Generic (PLEG): container finished" podID="f10a97b1-2c18-4685-b178-d7b8dc0495b5" containerID="f7b82c0b133a837909b1b6d8e7597b6aa4a99099617f6893f961daf684398b58" exitCode=0 Mar 14 09:06:02 crc kubenswrapper[4886]: I0314 09:06:02.934961 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557986-6l2sx" event={"ID":"f10a97b1-2c18-4685-b178-d7b8dc0495b5","Type":"ContainerDied","Data":"f7b82c0b133a837909b1b6d8e7597b6aa4a99099617f6893f961daf684398b58"} Mar 14 09:06:04 crc kubenswrapper[4886]: I0314 09:06:04.301236 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557986-6l2sx" Mar 14 09:06:04 crc kubenswrapper[4886]: I0314 09:06:04.498876 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpznc\" (UniqueName: \"kubernetes.io/projected/f10a97b1-2c18-4685-b178-d7b8dc0495b5-kube-api-access-xpznc\") pod \"f10a97b1-2c18-4685-b178-d7b8dc0495b5\" (UID: \"f10a97b1-2c18-4685-b178-d7b8dc0495b5\") " Mar 14 09:06:04 crc kubenswrapper[4886]: I0314 09:06:04.506180 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f10a97b1-2c18-4685-b178-d7b8dc0495b5-kube-api-access-xpznc" (OuterVolumeSpecName: "kube-api-access-xpznc") pod "f10a97b1-2c18-4685-b178-d7b8dc0495b5" (UID: "f10a97b1-2c18-4685-b178-d7b8dc0495b5"). InnerVolumeSpecName "kube-api-access-xpznc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:06:04 crc kubenswrapper[4886]: I0314 09:06:04.601413 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpznc\" (UniqueName: \"kubernetes.io/projected/f10a97b1-2c18-4685-b178-d7b8dc0495b5-kube-api-access-xpznc\") on node \"crc\" DevicePath \"\"" Mar 14 09:06:04 crc kubenswrapper[4886]: I0314 09:06:04.961335 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557986-6l2sx" event={"ID":"f10a97b1-2c18-4685-b178-d7b8dc0495b5","Type":"ContainerDied","Data":"461cec9e886c9133897106d0ef1c5db965a2cfa2f8b8115b0035534058a2191a"} Mar 14 09:06:04 crc kubenswrapper[4886]: I0314 09:06:04.961377 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="461cec9e886c9133897106d0ef1c5db965a2cfa2f8b8115b0035534058a2191a" Mar 14 09:06:04 crc kubenswrapper[4886]: I0314 09:06:04.961432 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557986-6l2sx" Mar 14 09:06:05 crc kubenswrapper[4886]: I0314 09:06:05.370212 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557980-4tvv2"] Mar 14 09:06:05 crc kubenswrapper[4886]: I0314 09:06:05.384000 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557980-4tvv2"] Mar 14 09:06:05 crc kubenswrapper[4886]: I0314 09:06:05.426561 4886 scope.go:117] "RemoveContainer" containerID="c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551" Mar 14 09:06:05 crc kubenswrapper[4886]: E0314 09:06:05.426828 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:06:05 crc kubenswrapper[4886]: I0314 09:06:05.433141 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e154ca81-98b6-4a74-b011-41b224074571" path="/var/lib/kubelet/pods/e154ca81-98b6-4a74-b011-41b224074571/volumes" Mar 14 09:06:18 crc kubenswrapper[4886]: I0314 09:06:18.421057 4886 scope.go:117] "RemoveContainer" containerID="c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551" Mar 14 09:06:18 crc kubenswrapper[4886]: E0314 09:06:18.421821 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" 
podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:06:28 crc kubenswrapper[4886]: I0314 09:06:28.851871 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pk8gd"] Mar 14 09:06:28 crc kubenswrapper[4886]: E0314 09:06:28.852831 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f10a97b1-2c18-4685-b178-d7b8dc0495b5" containerName="oc" Mar 14 09:06:28 crc kubenswrapper[4886]: I0314 09:06:28.852845 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10a97b1-2c18-4685-b178-d7b8dc0495b5" containerName="oc" Mar 14 09:06:28 crc kubenswrapper[4886]: I0314 09:06:28.853032 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="f10a97b1-2c18-4685-b178-d7b8dc0495b5" containerName="oc" Mar 14 09:06:28 crc kubenswrapper[4886]: I0314 09:06:28.854537 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pk8gd" Mar 14 09:06:28 crc kubenswrapper[4886]: I0314 09:06:28.867890 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pk8gd"] Mar 14 09:06:28 crc kubenswrapper[4886]: I0314 09:06:28.930351 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad49972-08e8-4230-8f26-2fa4e3f7b0b1-utilities\") pod \"redhat-marketplace-pk8gd\" (UID: \"7ad49972-08e8-4230-8f26-2fa4e3f7b0b1\") " pod="openshift-marketplace/redhat-marketplace-pk8gd" Mar 14 09:06:28 crc kubenswrapper[4886]: I0314 09:06:28.930912 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad49972-08e8-4230-8f26-2fa4e3f7b0b1-catalog-content\") pod \"redhat-marketplace-pk8gd\" (UID: \"7ad49972-08e8-4230-8f26-2fa4e3f7b0b1\") " pod="openshift-marketplace/redhat-marketplace-pk8gd" Mar 14 09:06:28 crc kubenswrapper[4886]: 
I0314 09:06:28.930957 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tms9\" (UniqueName: \"kubernetes.io/projected/7ad49972-08e8-4230-8f26-2fa4e3f7b0b1-kube-api-access-5tms9\") pod \"redhat-marketplace-pk8gd\" (UID: \"7ad49972-08e8-4230-8f26-2fa4e3f7b0b1\") " pod="openshift-marketplace/redhat-marketplace-pk8gd" Mar 14 09:06:29 crc kubenswrapper[4886]: I0314 09:06:29.032937 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad49972-08e8-4230-8f26-2fa4e3f7b0b1-catalog-content\") pod \"redhat-marketplace-pk8gd\" (UID: \"7ad49972-08e8-4230-8f26-2fa4e3f7b0b1\") " pod="openshift-marketplace/redhat-marketplace-pk8gd" Mar 14 09:06:29 crc kubenswrapper[4886]: I0314 09:06:29.033000 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tms9\" (UniqueName: \"kubernetes.io/projected/7ad49972-08e8-4230-8f26-2fa4e3f7b0b1-kube-api-access-5tms9\") pod \"redhat-marketplace-pk8gd\" (UID: \"7ad49972-08e8-4230-8f26-2fa4e3f7b0b1\") " pod="openshift-marketplace/redhat-marketplace-pk8gd" Mar 14 09:06:29 crc kubenswrapper[4886]: I0314 09:06:29.033082 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad49972-08e8-4230-8f26-2fa4e3f7b0b1-utilities\") pod \"redhat-marketplace-pk8gd\" (UID: \"7ad49972-08e8-4230-8f26-2fa4e3f7b0b1\") " pod="openshift-marketplace/redhat-marketplace-pk8gd" Mar 14 09:06:29 crc kubenswrapper[4886]: I0314 09:06:29.033662 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad49972-08e8-4230-8f26-2fa4e3f7b0b1-catalog-content\") pod \"redhat-marketplace-pk8gd\" (UID: \"7ad49972-08e8-4230-8f26-2fa4e3f7b0b1\") " pod="openshift-marketplace/redhat-marketplace-pk8gd" Mar 14 09:06:29 crc kubenswrapper[4886]: 
I0314 09:06:29.033697 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad49972-08e8-4230-8f26-2fa4e3f7b0b1-utilities\") pod \"redhat-marketplace-pk8gd\" (UID: \"7ad49972-08e8-4230-8f26-2fa4e3f7b0b1\") " pod="openshift-marketplace/redhat-marketplace-pk8gd" Mar 14 09:06:29 crc kubenswrapper[4886]: I0314 09:06:29.053754 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tms9\" (UniqueName: \"kubernetes.io/projected/7ad49972-08e8-4230-8f26-2fa4e3f7b0b1-kube-api-access-5tms9\") pod \"redhat-marketplace-pk8gd\" (UID: \"7ad49972-08e8-4230-8f26-2fa4e3f7b0b1\") " pod="openshift-marketplace/redhat-marketplace-pk8gd" Mar 14 09:06:29 crc kubenswrapper[4886]: I0314 09:06:29.223081 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pk8gd" Mar 14 09:06:29 crc kubenswrapper[4886]: I0314 09:06:29.728169 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pk8gd"] Mar 14 09:06:30 crc kubenswrapper[4886]: I0314 09:06:30.239205 4886 generic.go:334] "Generic (PLEG): container finished" podID="7ad49972-08e8-4230-8f26-2fa4e3f7b0b1" containerID="f64ad0b7cecd849743bd7d9a7300cece2c8b9800aecb01996853198f994cd599" exitCode=0 Mar 14 09:06:30 crc kubenswrapper[4886]: I0314 09:06:30.240411 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pk8gd" event={"ID":"7ad49972-08e8-4230-8f26-2fa4e3f7b0b1","Type":"ContainerDied","Data":"f64ad0b7cecd849743bd7d9a7300cece2c8b9800aecb01996853198f994cd599"} Mar 14 09:06:30 crc kubenswrapper[4886]: I0314 09:06:30.240498 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pk8gd" event={"ID":"7ad49972-08e8-4230-8f26-2fa4e3f7b0b1","Type":"ContainerStarted","Data":"03f1ad95f7c8c8331e9ed08aa378b9bf5e628b7f094205360f2af8a9886249cb"} Mar 
14 09:06:31 crc kubenswrapper[4886]: I0314 09:06:31.252907 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pk8gd" event={"ID":"7ad49972-08e8-4230-8f26-2fa4e3f7b0b1","Type":"ContainerStarted","Data":"f07263b61021fc2ddbc540fee0b0758c405b4171f903ee0b080d196ce8b1cd20"} Mar 14 09:06:32 crc kubenswrapper[4886]: I0314 09:06:32.265093 4886 generic.go:334] "Generic (PLEG): container finished" podID="7ad49972-08e8-4230-8f26-2fa4e3f7b0b1" containerID="f07263b61021fc2ddbc540fee0b0758c405b4171f903ee0b080d196ce8b1cd20" exitCode=0 Mar 14 09:06:32 crc kubenswrapper[4886]: I0314 09:06:32.265160 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pk8gd" event={"ID":"7ad49972-08e8-4230-8f26-2fa4e3f7b0b1","Type":"ContainerDied","Data":"f07263b61021fc2ddbc540fee0b0758c405b4171f903ee0b080d196ce8b1cd20"} Mar 14 09:06:33 crc kubenswrapper[4886]: I0314 09:06:33.278297 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pk8gd" event={"ID":"7ad49972-08e8-4230-8f26-2fa4e3f7b0b1","Type":"ContainerStarted","Data":"e68708e8ecf153bacaa001e25c1d10e60e614baa05c89a6a1562c9740d5561ef"} Mar 14 09:06:33 crc kubenswrapper[4886]: I0314 09:06:33.306684 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pk8gd" podStartSLOduration=2.9016203860000003 podStartE2EDuration="5.306666929s" podCreationTimestamp="2026-03-14 09:06:28 +0000 UTC" firstStartedPulling="2026-03-14 09:06:30.242149675 +0000 UTC m=+2325.490601322" lastFinishedPulling="2026-03-14 09:06:32.647196208 +0000 UTC m=+2327.895647865" observedRunningTime="2026-03-14 09:06:33.297945554 +0000 UTC m=+2328.546397231" watchObservedRunningTime="2026-03-14 09:06:33.306666929 +0000 UTC m=+2328.555118566" Mar 14 09:06:33 crc kubenswrapper[4886]: I0314 09:06:33.422229 4886 scope.go:117] "RemoveContainer" 
containerID="c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551" Mar 14 09:06:33 crc kubenswrapper[4886]: E0314 09:06:33.422484 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:06:34 crc kubenswrapper[4886]: I0314 09:06:34.728226 4886 scope.go:117] "RemoveContainer" containerID="f50949f54ab90687395fb8da3a725d15ae54c40be266f774fb0925981e9b53a6" Mar 14 09:06:39 crc kubenswrapper[4886]: I0314 09:06:39.223807 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pk8gd" Mar 14 09:06:39 crc kubenswrapper[4886]: I0314 09:06:39.224737 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pk8gd" Mar 14 09:06:39 crc kubenswrapper[4886]: I0314 09:06:39.270461 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pk8gd" Mar 14 09:06:39 crc kubenswrapper[4886]: I0314 09:06:39.380544 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pk8gd" Mar 14 09:06:39 crc kubenswrapper[4886]: I0314 09:06:39.514994 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pk8gd"] Mar 14 09:06:41 crc kubenswrapper[4886]: I0314 09:06:41.363237 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pk8gd" podUID="7ad49972-08e8-4230-8f26-2fa4e3f7b0b1" containerName="registry-server" 
containerID="cri-o://e68708e8ecf153bacaa001e25c1d10e60e614baa05c89a6a1562c9740d5561ef" gracePeriod=2 Mar 14 09:06:41 crc kubenswrapper[4886]: I0314 09:06:41.874246 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pk8gd" Mar 14 09:06:42 crc kubenswrapper[4886]: I0314 09:06:42.005948 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tms9\" (UniqueName: \"kubernetes.io/projected/7ad49972-08e8-4230-8f26-2fa4e3f7b0b1-kube-api-access-5tms9\") pod \"7ad49972-08e8-4230-8f26-2fa4e3f7b0b1\" (UID: \"7ad49972-08e8-4230-8f26-2fa4e3f7b0b1\") " Mar 14 09:06:42 crc kubenswrapper[4886]: I0314 09:06:42.006142 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad49972-08e8-4230-8f26-2fa4e3f7b0b1-catalog-content\") pod \"7ad49972-08e8-4230-8f26-2fa4e3f7b0b1\" (UID: \"7ad49972-08e8-4230-8f26-2fa4e3f7b0b1\") " Mar 14 09:06:42 crc kubenswrapper[4886]: I0314 09:06:42.006249 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad49972-08e8-4230-8f26-2fa4e3f7b0b1-utilities\") pod \"7ad49972-08e8-4230-8f26-2fa4e3f7b0b1\" (UID: \"7ad49972-08e8-4230-8f26-2fa4e3f7b0b1\") " Mar 14 09:06:42 crc kubenswrapper[4886]: I0314 09:06:42.007199 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ad49972-08e8-4230-8f26-2fa4e3f7b0b1-utilities" (OuterVolumeSpecName: "utilities") pod "7ad49972-08e8-4230-8f26-2fa4e3f7b0b1" (UID: "7ad49972-08e8-4230-8f26-2fa4e3f7b0b1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:06:42 crc kubenswrapper[4886]: I0314 09:06:42.015585 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ad49972-08e8-4230-8f26-2fa4e3f7b0b1-kube-api-access-5tms9" (OuterVolumeSpecName: "kube-api-access-5tms9") pod "7ad49972-08e8-4230-8f26-2fa4e3f7b0b1" (UID: "7ad49972-08e8-4230-8f26-2fa4e3f7b0b1"). InnerVolumeSpecName "kube-api-access-5tms9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:06:42 crc kubenswrapper[4886]: I0314 09:06:42.061364 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ad49972-08e8-4230-8f26-2fa4e3f7b0b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ad49972-08e8-4230-8f26-2fa4e3f7b0b1" (UID: "7ad49972-08e8-4230-8f26-2fa4e3f7b0b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:06:42 crc kubenswrapper[4886]: I0314 09:06:42.109203 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tms9\" (UniqueName: \"kubernetes.io/projected/7ad49972-08e8-4230-8f26-2fa4e3f7b0b1-kube-api-access-5tms9\") on node \"crc\" DevicePath \"\"" Mar 14 09:06:42 crc kubenswrapper[4886]: I0314 09:06:42.109237 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad49972-08e8-4230-8f26-2fa4e3f7b0b1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:06:42 crc kubenswrapper[4886]: I0314 09:06:42.109247 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad49972-08e8-4230-8f26-2fa4e3f7b0b1-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:06:42 crc kubenswrapper[4886]: I0314 09:06:42.374840 4886 generic.go:334] "Generic (PLEG): container finished" podID="7ad49972-08e8-4230-8f26-2fa4e3f7b0b1" 
containerID="e68708e8ecf153bacaa001e25c1d10e60e614baa05c89a6a1562c9740d5561ef" exitCode=0 Mar 14 09:06:42 crc kubenswrapper[4886]: I0314 09:06:42.374890 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pk8gd" event={"ID":"7ad49972-08e8-4230-8f26-2fa4e3f7b0b1","Type":"ContainerDied","Data":"e68708e8ecf153bacaa001e25c1d10e60e614baa05c89a6a1562c9740d5561ef"} Mar 14 09:06:42 crc kubenswrapper[4886]: I0314 09:06:42.374896 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pk8gd" Mar 14 09:06:42 crc kubenswrapper[4886]: I0314 09:06:42.374919 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pk8gd" event={"ID":"7ad49972-08e8-4230-8f26-2fa4e3f7b0b1","Type":"ContainerDied","Data":"03f1ad95f7c8c8331e9ed08aa378b9bf5e628b7f094205360f2af8a9886249cb"} Mar 14 09:06:42 crc kubenswrapper[4886]: I0314 09:06:42.374940 4886 scope.go:117] "RemoveContainer" containerID="e68708e8ecf153bacaa001e25c1d10e60e614baa05c89a6a1562c9740d5561ef" Mar 14 09:06:42 crc kubenswrapper[4886]: I0314 09:06:42.404129 4886 scope.go:117] "RemoveContainer" containerID="f07263b61021fc2ddbc540fee0b0758c405b4171f903ee0b080d196ce8b1cd20" Mar 14 09:06:42 crc kubenswrapper[4886]: I0314 09:06:42.434359 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pk8gd"] Mar 14 09:06:42 crc kubenswrapper[4886]: I0314 09:06:42.444625 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pk8gd"] Mar 14 09:06:42 crc kubenswrapper[4886]: I0314 09:06:42.445251 4886 scope.go:117] "RemoveContainer" containerID="f64ad0b7cecd849743bd7d9a7300cece2c8b9800aecb01996853198f994cd599" Mar 14 09:06:42 crc kubenswrapper[4886]: I0314 09:06:42.491804 4886 scope.go:117] "RemoveContainer" containerID="e68708e8ecf153bacaa001e25c1d10e60e614baa05c89a6a1562c9740d5561ef" Mar 14 
09:06:42 crc kubenswrapper[4886]: E0314 09:06:42.492399 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e68708e8ecf153bacaa001e25c1d10e60e614baa05c89a6a1562c9740d5561ef\": container with ID starting with e68708e8ecf153bacaa001e25c1d10e60e614baa05c89a6a1562c9740d5561ef not found: ID does not exist" containerID="e68708e8ecf153bacaa001e25c1d10e60e614baa05c89a6a1562c9740d5561ef" Mar 14 09:06:42 crc kubenswrapper[4886]: I0314 09:06:42.492459 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e68708e8ecf153bacaa001e25c1d10e60e614baa05c89a6a1562c9740d5561ef"} err="failed to get container status \"e68708e8ecf153bacaa001e25c1d10e60e614baa05c89a6a1562c9740d5561ef\": rpc error: code = NotFound desc = could not find container \"e68708e8ecf153bacaa001e25c1d10e60e614baa05c89a6a1562c9740d5561ef\": container with ID starting with e68708e8ecf153bacaa001e25c1d10e60e614baa05c89a6a1562c9740d5561ef not found: ID does not exist" Mar 14 09:06:42 crc kubenswrapper[4886]: I0314 09:06:42.492501 4886 scope.go:117] "RemoveContainer" containerID="f07263b61021fc2ddbc540fee0b0758c405b4171f903ee0b080d196ce8b1cd20" Mar 14 09:06:42 crc kubenswrapper[4886]: E0314 09:06:42.492917 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f07263b61021fc2ddbc540fee0b0758c405b4171f903ee0b080d196ce8b1cd20\": container with ID starting with f07263b61021fc2ddbc540fee0b0758c405b4171f903ee0b080d196ce8b1cd20 not found: ID does not exist" containerID="f07263b61021fc2ddbc540fee0b0758c405b4171f903ee0b080d196ce8b1cd20" Mar 14 09:06:42 crc kubenswrapper[4886]: I0314 09:06:42.492961 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f07263b61021fc2ddbc540fee0b0758c405b4171f903ee0b080d196ce8b1cd20"} err="failed to get container status 
\"f07263b61021fc2ddbc540fee0b0758c405b4171f903ee0b080d196ce8b1cd20\": rpc error: code = NotFound desc = could not find container \"f07263b61021fc2ddbc540fee0b0758c405b4171f903ee0b080d196ce8b1cd20\": container with ID starting with f07263b61021fc2ddbc540fee0b0758c405b4171f903ee0b080d196ce8b1cd20 not found: ID does not exist" Mar 14 09:06:42 crc kubenswrapper[4886]: I0314 09:06:42.492989 4886 scope.go:117] "RemoveContainer" containerID="f64ad0b7cecd849743bd7d9a7300cece2c8b9800aecb01996853198f994cd599" Mar 14 09:06:42 crc kubenswrapper[4886]: E0314 09:06:42.493295 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f64ad0b7cecd849743bd7d9a7300cece2c8b9800aecb01996853198f994cd599\": container with ID starting with f64ad0b7cecd849743bd7d9a7300cece2c8b9800aecb01996853198f994cd599 not found: ID does not exist" containerID="f64ad0b7cecd849743bd7d9a7300cece2c8b9800aecb01996853198f994cd599" Mar 14 09:06:42 crc kubenswrapper[4886]: I0314 09:06:42.493333 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f64ad0b7cecd849743bd7d9a7300cece2c8b9800aecb01996853198f994cd599"} err="failed to get container status \"f64ad0b7cecd849743bd7d9a7300cece2c8b9800aecb01996853198f994cd599\": rpc error: code = NotFound desc = could not find container \"f64ad0b7cecd849743bd7d9a7300cece2c8b9800aecb01996853198f994cd599\": container with ID starting with f64ad0b7cecd849743bd7d9a7300cece2c8b9800aecb01996853198f994cd599 not found: ID does not exist" Mar 14 09:06:43 crc kubenswrapper[4886]: I0314 09:06:43.432448 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ad49972-08e8-4230-8f26-2fa4e3f7b0b1" path="/var/lib/kubelet/pods/7ad49972-08e8-4230-8f26-2fa4e3f7b0b1/volumes" Mar 14 09:06:45 crc kubenswrapper[4886]: I0314 09:06:45.406331 4886 generic.go:334] "Generic (PLEG): container finished" podID="ca30d3c2-97e8-4ade-b4c8-b737a405c62f" 
containerID="5f8f9db4d440089081b61d0d6d009f7cdcb3a43e4dc97b098a5ba3a8d470e6b0" exitCode=0 Mar 14 09:06:45 crc kubenswrapper[4886]: I0314 09:06:45.406420 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx" event={"ID":"ca30d3c2-97e8-4ade-b4c8-b737a405c62f","Type":"ContainerDied","Data":"5f8f9db4d440089081b61d0d6d009f7cdcb3a43e4dc97b098a5ba3a8d470e6b0"} Mar 14 09:06:46 crc kubenswrapper[4886]: I0314 09:06:46.868619 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.007980 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-ssh-key-openstack-edpm-ipam\") pod \"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\" (UID: \"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\") " Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.008053 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-nova-metadata-neutron-config-0\") pod \"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\" (UID: \"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\") " Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.008219 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-neutron-metadata-combined-ca-bundle\") pod \"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\" (UID: \"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\") " Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.008282 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7g79\" (UniqueName: 
\"kubernetes.io/projected/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-kube-api-access-w7g79\") pod \"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\" (UID: \"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\") " Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.008311 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\" (UID: \"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\") " Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.008361 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-inventory\") pod \"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\" (UID: \"ca30d3c2-97e8-4ade-b4c8-b737a405c62f\") " Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.013264 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "ca30d3c2-97e8-4ade-b4c8-b737a405c62f" (UID: "ca30d3c2-97e8-4ade-b4c8-b737a405c62f"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.015797 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-kube-api-access-w7g79" (OuterVolumeSpecName: "kube-api-access-w7g79") pod "ca30d3c2-97e8-4ade-b4c8-b737a405c62f" (UID: "ca30d3c2-97e8-4ade-b4c8-b737a405c62f"). InnerVolumeSpecName "kube-api-access-w7g79". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.035062 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "ca30d3c2-97e8-4ade-b4c8-b737a405c62f" (UID: "ca30d3c2-97e8-4ade-b4c8-b737a405c62f"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.038775 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "ca30d3c2-97e8-4ade-b4c8-b737a405c62f" (UID: "ca30d3c2-97e8-4ade-b4c8-b737a405c62f"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.039571 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-inventory" (OuterVolumeSpecName: "inventory") pod "ca30d3c2-97e8-4ade-b4c8-b737a405c62f" (UID: "ca30d3c2-97e8-4ade-b4c8-b737a405c62f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.040677 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ca30d3c2-97e8-4ade-b4c8-b737a405c62f" (UID: "ca30d3c2-97e8-4ade-b4c8-b737a405c62f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.110508 4886 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.110795 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7g79\" (UniqueName: \"kubernetes.io/projected/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-kube-api-access-w7g79\") on node \"crc\" DevicePath \"\"" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.110807 4886 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.110818 4886 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.110829 4886 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.110843 4886 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ca30d3c2-97e8-4ade-b4c8-b737a405c62f-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.421354 4886 scope.go:117] "RemoveContainer" containerID="c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551" Mar 14 
09:06:47 crc kubenswrapper[4886]: E0314 09:06:47.421755 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.438813 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.438928 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx" event={"ID":"ca30d3c2-97e8-4ade-b4c8-b737a405c62f","Type":"ContainerDied","Data":"ad9c5688410fc71c1ca91126fe537e2b63d13012639dff60b5c8fdcb001a5798"} Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.438965 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad9c5688410fc71c1ca91126fe537e2b63d13012639dff60b5c8fdcb001a5798" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.636110 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn426"] Mar 14 09:06:47 crc kubenswrapper[4886]: E0314 09:06:47.636667 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ad49972-08e8-4230-8f26-2fa4e3f7b0b1" containerName="extract-utilities" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.636688 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ad49972-08e8-4230-8f26-2fa4e3f7b0b1" containerName="extract-utilities" Mar 14 09:06:47 crc kubenswrapper[4886]: E0314 09:06:47.636715 4886 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ca30d3c2-97e8-4ade-b4c8-b737a405c62f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.636726 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca30d3c2-97e8-4ade-b4c8-b737a405c62f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 14 09:06:47 crc kubenswrapper[4886]: E0314 09:06:47.636751 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ad49972-08e8-4230-8f26-2fa4e3f7b0b1" containerName="extract-content" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.636759 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ad49972-08e8-4230-8f26-2fa4e3f7b0b1" containerName="extract-content" Mar 14 09:06:47 crc kubenswrapper[4886]: E0314 09:06:47.636778 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ad49972-08e8-4230-8f26-2fa4e3f7b0b1" containerName="registry-server" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.636787 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ad49972-08e8-4230-8f26-2fa4e3f7b0b1" containerName="registry-server" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.637043 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ad49972-08e8-4230-8f26-2fa4e3f7b0b1" containerName="registry-server" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.637078 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca30d3c2-97e8-4ade-b4c8-b737a405c62f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.637922 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn426" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.640158 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.640427 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.640617 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftkvj" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.641212 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.643886 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.652751 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn426"] Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.721895 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zvqw\" (UniqueName: \"kubernetes.io/projected/abe350ea-5335-4b70-8de1-b33c2c17c876-kube-api-access-2zvqw\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn426\" (UID: \"abe350ea-5335-4b70-8de1-b33c2c17c876\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn426" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.721992 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abe350ea-5335-4b70-8de1-b33c2c17c876-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn426\" (UID: \"abe350ea-5335-4b70-8de1-b33c2c17c876\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn426" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.722035 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/abe350ea-5335-4b70-8de1-b33c2c17c876-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn426\" (UID: \"abe350ea-5335-4b70-8de1-b33c2c17c876\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn426" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.722085 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abe350ea-5335-4b70-8de1-b33c2c17c876-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn426\" (UID: \"abe350ea-5335-4b70-8de1-b33c2c17c876\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn426" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.722143 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe350ea-5335-4b70-8de1-b33c2c17c876-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn426\" (UID: \"abe350ea-5335-4b70-8de1-b33c2c17c876\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn426" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.824252 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abe350ea-5335-4b70-8de1-b33c2c17c876-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn426\" (UID: \"abe350ea-5335-4b70-8de1-b33c2c17c876\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn426" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.824354 4886 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe350ea-5335-4b70-8de1-b33c2c17c876-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn426\" (UID: \"abe350ea-5335-4b70-8de1-b33c2c17c876\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn426" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.824440 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zvqw\" (UniqueName: \"kubernetes.io/projected/abe350ea-5335-4b70-8de1-b33c2c17c876-kube-api-access-2zvqw\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn426\" (UID: \"abe350ea-5335-4b70-8de1-b33c2c17c876\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn426" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.824504 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abe350ea-5335-4b70-8de1-b33c2c17c876-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn426\" (UID: \"abe350ea-5335-4b70-8de1-b33c2c17c876\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn426" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.824545 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/abe350ea-5335-4b70-8de1-b33c2c17c876-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn426\" (UID: \"abe350ea-5335-4b70-8de1-b33c2c17c876\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn426" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.829205 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe350ea-5335-4b70-8de1-b33c2c17c876-libvirt-combined-ca-bundle\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-cn426\" (UID: \"abe350ea-5335-4b70-8de1-b33c2c17c876\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn426" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.829298 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/abe350ea-5335-4b70-8de1-b33c2c17c876-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn426\" (UID: \"abe350ea-5335-4b70-8de1-b33c2c17c876\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn426" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.829341 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abe350ea-5335-4b70-8de1-b33c2c17c876-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn426\" (UID: \"abe350ea-5335-4b70-8de1-b33c2c17c876\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn426" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.830710 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abe350ea-5335-4b70-8de1-b33c2c17c876-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn426\" (UID: \"abe350ea-5335-4b70-8de1-b33c2c17c876\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn426" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.843811 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zvqw\" (UniqueName: \"kubernetes.io/projected/abe350ea-5335-4b70-8de1-b33c2c17c876-kube-api-access-2zvqw\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn426\" (UID: \"abe350ea-5335-4b70-8de1-b33c2c17c876\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn426" Mar 14 09:06:47 crc kubenswrapper[4886]: I0314 09:06:47.968343 4886 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn426" Mar 14 09:06:48 crc kubenswrapper[4886]: I0314 09:06:48.470774 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn426"] Mar 14 09:06:48 crc kubenswrapper[4886]: W0314 09:06:48.473659 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabe350ea_5335_4b70_8de1_b33c2c17c876.slice/crio-e963bba4c62791a1aeb61e90a1728af97379a1980749d58b4010d9c2ceaed96d WatchSource:0}: Error finding container e963bba4c62791a1aeb61e90a1728af97379a1980749d58b4010d9c2ceaed96d: Status 404 returned error can't find the container with id e963bba4c62791a1aeb61e90a1728af97379a1980749d58b4010d9c2ceaed96d Mar 14 09:06:49 crc kubenswrapper[4886]: I0314 09:06:49.458734 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn426" event={"ID":"abe350ea-5335-4b70-8de1-b33c2c17c876","Type":"ContainerStarted","Data":"2d53af764524ada102029383885df2102915df91d028a2f95dc13b3b313c93c2"} Mar 14 09:06:49 crc kubenswrapper[4886]: I0314 09:06:49.459296 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn426" event={"ID":"abe350ea-5335-4b70-8de1-b33c2c17c876","Type":"ContainerStarted","Data":"e963bba4c62791a1aeb61e90a1728af97379a1980749d58b4010d9c2ceaed96d"} Mar 14 09:06:49 crc kubenswrapper[4886]: I0314 09:06:49.491340 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn426" podStartSLOduration=2.050420515 podStartE2EDuration="2.491318778s" podCreationTimestamp="2026-03-14 09:06:47 +0000 UTC" firstStartedPulling="2026-03-14 09:06:48.476025968 +0000 UTC m=+2343.724477605" lastFinishedPulling="2026-03-14 09:06:48.916924231 +0000 UTC m=+2344.165375868" 
observedRunningTime="2026-03-14 09:06:49.477142299 +0000 UTC m=+2344.725593936" watchObservedRunningTime="2026-03-14 09:06:49.491318778 +0000 UTC m=+2344.739770415" Mar 14 09:06:58 crc kubenswrapper[4886]: I0314 09:06:58.421349 4886 scope.go:117] "RemoveContainer" containerID="c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551" Mar 14 09:06:58 crc kubenswrapper[4886]: E0314 09:06:58.422235 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:07:12 crc kubenswrapper[4886]: I0314 09:07:12.421381 4886 scope.go:117] "RemoveContainer" containerID="c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551" Mar 14 09:07:12 crc kubenswrapper[4886]: E0314 09:07:12.422329 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:07:26 crc kubenswrapper[4886]: I0314 09:07:26.420505 4886 scope.go:117] "RemoveContainer" containerID="c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551" Mar 14 09:07:26 crc kubenswrapper[4886]: E0314 09:07:26.421539 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:07:37 crc kubenswrapper[4886]: I0314 09:07:37.420841 4886 scope.go:117] "RemoveContainer" containerID="c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551" Mar 14 09:07:37 crc kubenswrapper[4886]: E0314 09:07:37.421788 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:07:40 crc kubenswrapper[4886]: I0314 09:07:40.100711 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b8h7f"] Mar 14 09:07:40 crc kubenswrapper[4886]: I0314 09:07:40.104027 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b8h7f" Mar 14 09:07:40 crc kubenswrapper[4886]: I0314 09:07:40.153816 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b8h7f"] Mar 14 09:07:40 crc kubenswrapper[4886]: I0314 09:07:40.261309 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/362f8a42-8f4c-4fea-89ac-135ec83a93a3-catalog-content\") pod \"redhat-operators-b8h7f\" (UID: \"362f8a42-8f4c-4fea-89ac-135ec83a93a3\") " pod="openshift-marketplace/redhat-operators-b8h7f" Mar 14 09:07:40 crc kubenswrapper[4886]: I0314 09:07:40.261658 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpn2z\" (UniqueName: \"kubernetes.io/projected/362f8a42-8f4c-4fea-89ac-135ec83a93a3-kube-api-access-qpn2z\") pod \"redhat-operators-b8h7f\" (UID: \"362f8a42-8f4c-4fea-89ac-135ec83a93a3\") " pod="openshift-marketplace/redhat-operators-b8h7f" Mar 14 09:07:40 crc kubenswrapper[4886]: I0314 09:07:40.261955 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/362f8a42-8f4c-4fea-89ac-135ec83a93a3-utilities\") pod \"redhat-operators-b8h7f\" (UID: \"362f8a42-8f4c-4fea-89ac-135ec83a93a3\") " pod="openshift-marketplace/redhat-operators-b8h7f" Mar 14 09:07:40 crc kubenswrapper[4886]: I0314 09:07:40.363687 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/362f8a42-8f4c-4fea-89ac-135ec83a93a3-utilities\") pod \"redhat-operators-b8h7f\" (UID: \"362f8a42-8f4c-4fea-89ac-135ec83a93a3\") " pod="openshift-marketplace/redhat-operators-b8h7f" Mar 14 09:07:40 crc kubenswrapper[4886]: I0314 09:07:40.363801 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/362f8a42-8f4c-4fea-89ac-135ec83a93a3-catalog-content\") pod \"redhat-operators-b8h7f\" (UID: \"362f8a42-8f4c-4fea-89ac-135ec83a93a3\") " pod="openshift-marketplace/redhat-operators-b8h7f" Mar 14 09:07:40 crc kubenswrapper[4886]: I0314 09:07:40.363853 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpn2z\" (UniqueName: \"kubernetes.io/projected/362f8a42-8f4c-4fea-89ac-135ec83a93a3-kube-api-access-qpn2z\") pod \"redhat-operators-b8h7f\" (UID: \"362f8a42-8f4c-4fea-89ac-135ec83a93a3\") " pod="openshift-marketplace/redhat-operators-b8h7f" Mar 14 09:07:40 crc kubenswrapper[4886]: I0314 09:07:40.364307 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/362f8a42-8f4c-4fea-89ac-135ec83a93a3-utilities\") pod \"redhat-operators-b8h7f\" (UID: \"362f8a42-8f4c-4fea-89ac-135ec83a93a3\") " pod="openshift-marketplace/redhat-operators-b8h7f" Mar 14 09:07:40 crc kubenswrapper[4886]: I0314 09:07:40.364466 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/362f8a42-8f4c-4fea-89ac-135ec83a93a3-catalog-content\") pod \"redhat-operators-b8h7f\" (UID: \"362f8a42-8f4c-4fea-89ac-135ec83a93a3\") " pod="openshift-marketplace/redhat-operators-b8h7f" Mar 14 09:07:40 crc kubenswrapper[4886]: I0314 09:07:40.384857 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpn2z\" (UniqueName: \"kubernetes.io/projected/362f8a42-8f4c-4fea-89ac-135ec83a93a3-kube-api-access-qpn2z\") pod \"redhat-operators-b8h7f\" (UID: \"362f8a42-8f4c-4fea-89ac-135ec83a93a3\") " pod="openshift-marketplace/redhat-operators-b8h7f" Mar 14 09:07:40 crc kubenswrapper[4886]: I0314 09:07:40.496549 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b8h7f" Mar 14 09:07:41 crc kubenswrapper[4886]: I0314 09:07:40.979843 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b8h7f"] Mar 14 09:07:41 crc kubenswrapper[4886]: I0314 09:07:41.007032 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b8h7f" event={"ID":"362f8a42-8f4c-4fea-89ac-135ec83a93a3","Type":"ContainerStarted","Data":"b544f86a1f6547a20c5def4373c1b67bf18a3fd0badfcb85d1b6cdf05ab89ce9"} Mar 14 09:07:42 crc kubenswrapper[4886]: I0314 09:07:42.021025 4886 generic.go:334] "Generic (PLEG): container finished" podID="362f8a42-8f4c-4fea-89ac-135ec83a93a3" containerID="7b28828eae24ca42b4a57ed640b32cf04b4c34f5b810d639910620a7c36f0d6d" exitCode=0 Mar 14 09:07:42 crc kubenswrapper[4886]: I0314 09:07:42.021207 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b8h7f" event={"ID":"362f8a42-8f4c-4fea-89ac-135ec83a93a3","Type":"ContainerDied","Data":"7b28828eae24ca42b4a57ed640b32cf04b4c34f5b810d639910620a7c36f0d6d"} Mar 14 09:07:43 crc kubenswrapper[4886]: I0314 09:07:43.031382 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b8h7f" event={"ID":"362f8a42-8f4c-4fea-89ac-135ec83a93a3","Type":"ContainerStarted","Data":"b29b69efd208f26ddfb6a2bea18cff881530e661e471f1966efa73bf86813c1e"} Mar 14 09:07:46 crc kubenswrapper[4886]: I0314 09:07:46.068842 4886 generic.go:334] "Generic (PLEG): container finished" podID="362f8a42-8f4c-4fea-89ac-135ec83a93a3" containerID="b29b69efd208f26ddfb6a2bea18cff881530e661e471f1966efa73bf86813c1e" exitCode=0 Mar 14 09:07:46 crc kubenswrapper[4886]: I0314 09:07:46.068914 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b8h7f" 
event={"ID":"362f8a42-8f4c-4fea-89ac-135ec83a93a3","Type":"ContainerDied","Data":"b29b69efd208f26ddfb6a2bea18cff881530e661e471f1966efa73bf86813c1e"} Mar 14 09:07:48 crc kubenswrapper[4886]: I0314 09:07:48.095377 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b8h7f" event={"ID":"362f8a42-8f4c-4fea-89ac-135ec83a93a3","Type":"ContainerStarted","Data":"b8d46138643e1af8f625a5d46820aacc225d81d6ad95a83aee1c7db5c0f3bf7f"} Mar 14 09:07:48 crc kubenswrapper[4886]: I0314 09:07:48.135086 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b8h7f" podStartSLOduration=3.071918476 podStartE2EDuration="8.135059893s" podCreationTimestamp="2026-03-14 09:07:40 +0000 UTC" firstStartedPulling="2026-03-14 09:07:42.024163501 +0000 UTC m=+2397.272615138" lastFinishedPulling="2026-03-14 09:07:47.087304918 +0000 UTC m=+2402.335756555" observedRunningTime="2026-03-14 09:07:48.119964225 +0000 UTC m=+2403.368415912" watchObservedRunningTime="2026-03-14 09:07:48.135059893 +0000 UTC m=+2403.383511540" Mar 14 09:07:48 crc kubenswrapper[4886]: I0314 09:07:48.421146 4886 scope.go:117] "RemoveContainer" containerID="c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551" Mar 14 09:07:48 crc kubenswrapper[4886]: E0314 09:07:48.421895 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:07:50 crc kubenswrapper[4886]: I0314 09:07:50.496934 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b8h7f" Mar 14 09:07:50 crc kubenswrapper[4886]: 
I0314 09:07:50.498899 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b8h7f" Mar 14 09:07:51 crc kubenswrapper[4886]: I0314 09:07:51.555382 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b8h7f" podUID="362f8a42-8f4c-4fea-89ac-135ec83a93a3" containerName="registry-server" probeResult="failure" output=< Mar 14 09:07:51 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s Mar 14 09:07:51 crc kubenswrapper[4886]: > Mar 14 09:08:00 crc kubenswrapper[4886]: I0314 09:08:00.152438 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557988-x7fwn"] Mar 14 09:08:00 crc kubenswrapper[4886]: I0314 09:08:00.154196 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557988-x7fwn" Mar 14 09:08:00 crc kubenswrapper[4886]: I0314 09:08:00.156891 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:08:00 crc kubenswrapper[4886]: I0314 09:08:00.157223 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 09:08:00 crc kubenswrapper[4886]: I0314 09:08:00.158716 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:08:00 crc kubenswrapper[4886]: I0314 09:08:00.163444 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557988-x7fwn"] Mar 14 09:08:00 crc kubenswrapper[4886]: I0314 09:08:00.251010 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtgz5\" (UniqueName: \"kubernetes.io/projected/93af3bcb-09d4-4b5e-9347-68dcfc6cff6c-kube-api-access-dtgz5\") pod \"auto-csr-approver-29557988-x7fwn\" (UID: 
\"93af3bcb-09d4-4b5e-9347-68dcfc6cff6c\") " pod="openshift-infra/auto-csr-approver-29557988-x7fwn" Mar 14 09:08:00 crc kubenswrapper[4886]: I0314 09:08:00.353606 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtgz5\" (UniqueName: \"kubernetes.io/projected/93af3bcb-09d4-4b5e-9347-68dcfc6cff6c-kube-api-access-dtgz5\") pod \"auto-csr-approver-29557988-x7fwn\" (UID: \"93af3bcb-09d4-4b5e-9347-68dcfc6cff6c\") " pod="openshift-infra/auto-csr-approver-29557988-x7fwn" Mar 14 09:08:00 crc kubenswrapper[4886]: I0314 09:08:00.373058 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtgz5\" (UniqueName: \"kubernetes.io/projected/93af3bcb-09d4-4b5e-9347-68dcfc6cff6c-kube-api-access-dtgz5\") pod \"auto-csr-approver-29557988-x7fwn\" (UID: \"93af3bcb-09d4-4b5e-9347-68dcfc6cff6c\") " pod="openshift-infra/auto-csr-approver-29557988-x7fwn" Mar 14 09:08:00 crc kubenswrapper[4886]: I0314 09:08:00.500540 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557988-x7fwn" Mar 14 09:08:00 crc kubenswrapper[4886]: I0314 09:08:00.560180 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b8h7f" Mar 14 09:08:00 crc kubenswrapper[4886]: I0314 09:08:00.630448 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b8h7f" Mar 14 09:08:00 crc kubenswrapper[4886]: I0314 09:08:00.812427 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b8h7f"] Mar 14 09:08:01 crc kubenswrapper[4886]: I0314 09:08:01.007241 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557988-x7fwn"] Mar 14 09:08:01 crc kubenswrapper[4886]: I0314 09:08:01.241387 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557988-x7fwn" event={"ID":"93af3bcb-09d4-4b5e-9347-68dcfc6cff6c","Type":"ContainerStarted","Data":"788b2783706f14e110828f72b3ca01cabc4f308a7fef5d4cea52df71e143648e"} Mar 14 09:08:02 crc kubenswrapper[4886]: I0314 09:08:02.251923 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b8h7f" podUID="362f8a42-8f4c-4fea-89ac-135ec83a93a3" containerName="registry-server" containerID="cri-o://b8d46138643e1af8f625a5d46820aacc225d81d6ad95a83aee1c7db5c0f3bf7f" gracePeriod=2 Mar 14 09:08:02 crc kubenswrapper[4886]: I0314 09:08:02.253282 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557988-x7fwn" event={"ID":"93af3bcb-09d4-4b5e-9347-68dcfc6cff6c","Type":"ContainerStarted","Data":"ce8c6074c0277f4bb45c6dc56204caaed3c3a9ffad3d0f63b9747266036dd356"} Mar 14 09:08:02 crc kubenswrapper[4886]: I0314 09:08:02.281098 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29557988-x7fwn" podStartSLOduration=1.499555729 podStartE2EDuration="2.28106506s" podCreationTimestamp="2026-03-14 09:08:00 +0000 UTC" firstStartedPulling="2026-03-14 09:08:01.019062086 +0000 UTC m=+2416.267513763" lastFinishedPulling="2026-03-14 09:08:01.800571457 +0000 UTC m=+2417.049023094" observedRunningTime="2026-03-14 09:08:02.265817557 +0000 UTC m=+2417.514269204" watchObservedRunningTime="2026-03-14 09:08:02.28106506 +0000 UTC m=+2417.529516747" Mar 14 09:08:02 crc kubenswrapper[4886]: I0314 09:08:02.715963 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b8h7f" Mar 14 09:08:02 crc kubenswrapper[4886]: I0314 09:08:02.805808 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpn2z\" (UniqueName: \"kubernetes.io/projected/362f8a42-8f4c-4fea-89ac-135ec83a93a3-kube-api-access-qpn2z\") pod \"362f8a42-8f4c-4fea-89ac-135ec83a93a3\" (UID: \"362f8a42-8f4c-4fea-89ac-135ec83a93a3\") " Mar 14 09:08:02 crc kubenswrapper[4886]: I0314 09:08:02.805970 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/362f8a42-8f4c-4fea-89ac-135ec83a93a3-utilities\") pod \"362f8a42-8f4c-4fea-89ac-135ec83a93a3\" (UID: \"362f8a42-8f4c-4fea-89ac-135ec83a93a3\") " Mar 14 09:08:02 crc kubenswrapper[4886]: I0314 09:08:02.806062 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/362f8a42-8f4c-4fea-89ac-135ec83a93a3-catalog-content\") pod \"362f8a42-8f4c-4fea-89ac-135ec83a93a3\" (UID: \"362f8a42-8f4c-4fea-89ac-135ec83a93a3\") " Mar 14 09:08:02 crc kubenswrapper[4886]: I0314 09:08:02.810735 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/362f8a42-8f4c-4fea-89ac-135ec83a93a3-kube-api-access-qpn2z" 
(OuterVolumeSpecName: "kube-api-access-qpn2z") pod "362f8a42-8f4c-4fea-89ac-135ec83a93a3" (UID: "362f8a42-8f4c-4fea-89ac-135ec83a93a3"). InnerVolumeSpecName "kube-api-access-qpn2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:08:02 crc kubenswrapper[4886]: I0314 09:08:02.814593 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/362f8a42-8f4c-4fea-89ac-135ec83a93a3-utilities" (OuterVolumeSpecName: "utilities") pod "362f8a42-8f4c-4fea-89ac-135ec83a93a3" (UID: "362f8a42-8f4c-4fea-89ac-135ec83a93a3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:08:02 crc kubenswrapper[4886]: I0314 09:08:02.912718 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpn2z\" (UniqueName: \"kubernetes.io/projected/362f8a42-8f4c-4fea-89ac-135ec83a93a3-kube-api-access-qpn2z\") on node \"crc\" DevicePath \"\"" Mar 14 09:08:02 crc kubenswrapper[4886]: I0314 09:08:02.913033 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/362f8a42-8f4c-4fea-89ac-135ec83a93a3-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:08:02 crc kubenswrapper[4886]: I0314 09:08:02.955294 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/362f8a42-8f4c-4fea-89ac-135ec83a93a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "362f8a42-8f4c-4fea-89ac-135ec83a93a3" (UID: "362f8a42-8f4c-4fea-89ac-135ec83a93a3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:08:03 crc kubenswrapper[4886]: I0314 09:08:03.014863 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/362f8a42-8f4c-4fea-89ac-135ec83a93a3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:08:03 crc kubenswrapper[4886]: I0314 09:08:03.267361 4886 generic.go:334] "Generic (PLEG): container finished" podID="362f8a42-8f4c-4fea-89ac-135ec83a93a3" containerID="b8d46138643e1af8f625a5d46820aacc225d81d6ad95a83aee1c7db5c0f3bf7f" exitCode=0 Mar 14 09:08:03 crc kubenswrapper[4886]: I0314 09:08:03.267444 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b8h7f" event={"ID":"362f8a42-8f4c-4fea-89ac-135ec83a93a3","Type":"ContainerDied","Data":"b8d46138643e1af8f625a5d46820aacc225d81d6ad95a83aee1c7db5c0f3bf7f"} Mar 14 09:08:03 crc kubenswrapper[4886]: I0314 09:08:03.267496 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b8h7f" Mar 14 09:08:03 crc kubenswrapper[4886]: I0314 09:08:03.267508 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b8h7f" event={"ID":"362f8a42-8f4c-4fea-89ac-135ec83a93a3","Type":"ContainerDied","Data":"b544f86a1f6547a20c5def4373c1b67bf18a3fd0badfcb85d1b6cdf05ab89ce9"} Mar 14 09:08:03 crc kubenswrapper[4886]: I0314 09:08:03.267542 4886 scope.go:117] "RemoveContainer" containerID="b8d46138643e1af8f625a5d46820aacc225d81d6ad95a83aee1c7db5c0f3bf7f" Mar 14 09:08:03 crc kubenswrapper[4886]: I0314 09:08:03.271108 4886 generic.go:334] "Generic (PLEG): container finished" podID="93af3bcb-09d4-4b5e-9347-68dcfc6cff6c" containerID="ce8c6074c0277f4bb45c6dc56204caaed3c3a9ffad3d0f63b9747266036dd356" exitCode=0 Mar 14 09:08:03 crc kubenswrapper[4886]: I0314 09:08:03.271191 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557988-x7fwn" event={"ID":"93af3bcb-09d4-4b5e-9347-68dcfc6cff6c","Type":"ContainerDied","Data":"ce8c6074c0277f4bb45c6dc56204caaed3c3a9ffad3d0f63b9747266036dd356"} Mar 14 09:08:03 crc kubenswrapper[4886]: I0314 09:08:03.298903 4886 scope.go:117] "RemoveContainer" containerID="b29b69efd208f26ddfb6a2bea18cff881530e661e471f1966efa73bf86813c1e" Mar 14 09:08:03 crc kubenswrapper[4886]: I0314 09:08:03.334892 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b8h7f"] Mar 14 09:08:03 crc kubenswrapper[4886]: I0314 09:08:03.347676 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b8h7f"] Mar 14 09:08:03 crc kubenswrapper[4886]: I0314 09:08:03.355460 4886 scope.go:117] "RemoveContainer" containerID="7b28828eae24ca42b4a57ed640b32cf04b4c34f5b810d639910620a7c36f0d6d" Mar 14 09:08:03 crc kubenswrapper[4886]: I0314 09:08:03.396244 4886 scope.go:117] "RemoveContainer" 
containerID="b8d46138643e1af8f625a5d46820aacc225d81d6ad95a83aee1c7db5c0f3bf7f" Mar 14 09:08:03 crc kubenswrapper[4886]: E0314 09:08:03.396703 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8d46138643e1af8f625a5d46820aacc225d81d6ad95a83aee1c7db5c0f3bf7f\": container with ID starting with b8d46138643e1af8f625a5d46820aacc225d81d6ad95a83aee1c7db5c0f3bf7f not found: ID does not exist" containerID="b8d46138643e1af8f625a5d46820aacc225d81d6ad95a83aee1c7db5c0f3bf7f" Mar 14 09:08:03 crc kubenswrapper[4886]: I0314 09:08:03.396759 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8d46138643e1af8f625a5d46820aacc225d81d6ad95a83aee1c7db5c0f3bf7f"} err="failed to get container status \"b8d46138643e1af8f625a5d46820aacc225d81d6ad95a83aee1c7db5c0f3bf7f\": rpc error: code = NotFound desc = could not find container \"b8d46138643e1af8f625a5d46820aacc225d81d6ad95a83aee1c7db5c0f3bf7f\": container with ID starting with b8d46138643e1af8f625a5d46820aacc225d81d6ad95a83aee1c7db5c0f3bf7f not found: ID does not exist" Mar 14 09:08:03 crc kubenswrapper[4886]: I0314 09:08:03.396796 4886 scope.go:117] "RemoveContainer" containerID="b29b69efd208f26ddfb6a2bea18cff881530e661e471f1966efa73bf86813c1e" Mar 14 09:08:03 crc kubenswrapper[4886]: E0314 09:08:03.397157 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b29b69efd208f26ddfb6a2bea18cff881530e661e471f1966efa73bf86813c1e\": container with ID starting with b29b69efd208f26ddfb6a2bea18cff881530e661e471f1966efa73bf86813c1e not found: ID does not exist" containerID="b29b69efd208f26ddfb6a2bea18cff881530e661e471f1966efa73bf86813c1e" Mar 14 09:08:03 crc kubenswrapper[4886]: I0314 09:08:03.397236 4886 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b29b69efd208f26ddfb6a2bea18cff881530e661e471f1966efa73bf86813c1e"} err="failed to get container status \"b29b69efd208f26ddfb6a2bea18cff881530e661e471f1966efa73bf86813c1e\": rpc error: code = NotFound desc = could not find container \"b29b69efd208f26ddfb6a2bea18cff881530e661e471f1966efa73bf86813c1e\": container with ID starting with b29b69efd208f26ddfb6a2bea18cff881530e661e471f1966efa73bf86813c1e not found: ID does not exist" Mar 14 09:08:03 crc kubenswrapper[4886]: I0314 09:08:03.397263 4886 scope.go:117] "RemoveContainer" containerID="7b28828eae24ca42b4a57ed640b32cf04b4c34f5b810d639910620a7c36f0d6d" Mar 14 09:08:03 crc kubenswrapper[4886]: E0314 09:08:03.397650 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b28828eae24ca42b4a57ed640b32cf04b4c34f5b810d639910620a7c36f0d6d\": container with ID starting with 7b28828eae24ca42b4a57ed640b32cf04b4c34f5b810d639910620a7c36f0d6d not found: ID does not exist" containerID="7b28828eae24ca42b4a57ed640b32cf04b4c34f5b810d639910620a7c36f0d6d" Mar 14 09:08:03 crc kubenswrapper[4886]: I0314 09:08:03.397714 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b28828eae24ca42b4a57ed640b32cf04b4c34f5b810d639910620a7c36f0d6d"} err="failed to get container status \"7b28828eae24ca42b4a57ed640b32cf04b4c34f5b810d639910620a7c36f0d6d\": rpc error: code = NotFound desc = could not find container \"7b28828eae24ca42b4a57ed640b32cf04b4c34f5b810d639910620a7c36f0d6d\": container with ID starting with 7b28828eae24ca42b4a57ed640b32cf04b4c34f5b810d639910620a7c36f0d6d not found: ID does not exist" Mar 14 09:08:03 crc kubenswrapper[4886]: I0314 09:08:03.421217 4886 scope.go:117] "RemoveContainer" containerID="c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551" Mar 14 09:08:03 crc kubenswrapper[4886]: E0314 09:08:03.421488 4886 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:08:03 crc kubenswrapper[4886]: I0314 09:08:03.432507 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="362f8a42-8f4c-4fea-89ac-135ec83a93a3" path="/var/lib/kubelet/pods/362f8a42-8f4c-4fea-89ac-135ec83a93a3/volumes" Mar 14 09:08:04 crc kubenswrapper[4886]: I0314 09:08:04.743831 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557988-x7fwn" Mar 14 09:08:04 crc kubenswrapper[4886]: I0314 09:08:04.858161 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtgz5\" (UniqueName: \"kubernetes.io/projected/93af3bcb-09d4-4b5e-9347-68dcfc6cff6c-kube-api-access-dtgz5\") pod \"93af3bcb-09d4-4b5e-9347-68dcfc6cff6c\" (UID: \"93af3bcb-09d4-4b5e-9347-68dcfc6cff6c\") " Mar 14 09:08:04 crc kubenswrapper[4886]: I0314 09:08:04.870398 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93af3bcb-09d4-4b5e-9347-68dcfc6cff6c-kube-api-access-dtgz5" (OuterVolumeSpecName: "kube-api-access-dtgz5") pod "93af3bcb-09d4-4b5e-9347-68dcfc6cff6c" (UID: "93af3bcb-09d4-4b5e-9347-68dcfc6cff6c"). InnerVolumeSpecName "kube-api-access-dtgz5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:08:04 crc kubenswrapper[4886]: I0314 09:08:04.961835 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtgz5\" (UniqueName: \"kubernetes.io/projected/93af3bcb-09d4-4b5e-9347-68dcfc6cff6c-kube-api-access-dtgz5\") on node \"crc\" DevicePath \"\"" Mar 14 09:08:05 crc kubenswrapper[4886]: I0314 09:08:05.300579 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557988-x7fwn" event={"ID":"93af3bcb-09d4-4b5e-9347-68dcfc6cff6c","Type":"ContainerDied","Data":"788b2783706f14e110828f72b3ca01cabc4f308a7fef5d4cea52df71e143648e"} Mar 14 09:08:05 crc kubenswrapper[4886]: I0314 09:08:05.300934 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="788b2783706f14e110828f72b3ca01cabc4f308a7fef5d4cea52df71e143648e" Mar 14 09:08:05 crc kubenswrapper[4886]: I0314 09:08:05.300743 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557988-x7fwn" Mar 14 09:08:05 crc kubenswrapper[4886]: I0314 09:08:05.351257 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557982-gxdxv"] Mar 14 09:08:05 crc kubenswrapper[4886]: I0314 09:08:05.360496 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557982-gxdxv"] Mar 14 09:08:05 crc kubenswrapper[4886]: I0314 09:08:05.432174 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb9ef94d-9b0c-4f1d-a487-bcfc42336213" path="/var/lib/kubelet/pods/fb9ef94d-9b0c-4f1d-a487-bcfc42336213/volumes" Mar 14 09:08:18 crc kubenswrapper[4886]: I0314 09:08:18.421049 4886 scope.go:117] "RemoveContainer" containerID="c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551" Mar 14 09:08:18 crc kubenswrapper[4886]: E0314 09:08:18.421875 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:08:30 crc kubenswrapper[4886]: I0314 09:08:30.421647 4886 scope.go:117] "RemoveContainer" containerID="c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551" Mar 14 09:08:30 crc kubenswrapper[4886]: E0314 09:08:30.422737 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:08:34 crc kubenswrapper[4886]: I0314 09:08:34.841306 4886 scope.go:117] "RemoveContainer" containerID="076e3be3787fc0beaf6adeee0ce5ed91176242689c6ff8a1a8f964319f058d76" Mar 14 09:08:44 crc kubenswrapper[4886]: I0314 09:08:44.421872 4886 scope.go:117] "RemoveContainer" containerID="c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551" Mar 14 09:08:44 crc kubenswrapper[4886]: E0314 09:08:44.423202 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:08:57 crc kubenswrapper[4886]: I0314 09:08:57.421337 4886 scope.go:117] "RemoveContainer" 
containerID="c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551" Mar 14 09:08:57 crc kubenswrapper[4886]: E0314 09:08:57.422152 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:09:09 crc kubenswrapper[4886]: I0314 09:09:09.420802 4886 scope.go:117] "RemoveContainer" containerID="c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551" Mar 14 09:09:09 crc kubenswrapper[4886]: E0314 09:09:09.421658 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:09:20 crc kubenswrapper[4886]: I0314 09:09:20.421276 4886 scope.go:117] "RemoveContainer" containerID="c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551" Mar 14 09:09:20 crc kubenswrapper[4886]: E0314 09:09:20.422796 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:09:31 crc kubenswrapper[4886]: I0314 09:09:31.421870 4886 scope.go:117] 
"RemoveContainer" containerID="c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551" Mar 14 09:09:31 crc kubenswrapper[4886]: E0314 09:09:31.422867 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:09:43 crc kubenswrapper[4886]: I0314 09:09:43.421565 4886 scope.go:117] "RemoveContainer" containerID="c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551" Mar 14 09:09:43 crc kubenswrapper[4886]: E0314 09:09:43.422410 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:09:55 crc kubenswrapper[4886]: I0314 09:09:55.596504 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lcn6x"] Mar 14 09:09:55 crc kubenswrapper[4886]: E0314 09:09:55.598255 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93af3bcb-09d4-4b5e-9347-68dcfc6cff6c" containerName="oc" Mar 14 09:09:55 crc kubenswrapper[4886]: I0314 09:09:55.598274 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="93af3bcb-09d4-4b5e-9347-68dcfc6cff6c" containerName="oc" Mar 14 09:09:55 crc kubenswrapper[4886]: E0314 09:09:55.598296 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="362f8a42-8f4c-4fea-89ac-135ec83a93a3" 
containerName="registry-server" Mar 14 09:09:55 crc kubenswrapper[4886]: I0314 09:09:55.598306 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="362f8a42-8f4c-4fea-89ac-135ec83a93a3" containerName="registry-server" Mar 14 09:09:55 crc kubenswrapper[4886]: E0314 09:09:55.598365 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="362f8a42-8f4c-4fea-89ac-135ec83a93a3" containerName="extract-content" Mar 14 09:09:55 crc kubenswrapper[4886]: I0314 09:09:55.598375 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="362f8a42-8f4c-4fea-89ac-135ec83a93a3" containerName="extract-content" Mar 14 09:09:55 crc kubenswrapper[4886]: E0314 09:09:55.598416 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="362f8a42-8f4c-4fea-89ac-135ec83a93a3" containerName="extract-utilities" Mar 14 09:09:55 crc kubenswrapper[4886]: I0314 09:09:55.598428 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="362f8a42-8f4c-4fea-89ac-135ec83a93a3" containerName="extract-utilities" Mar 14 09:09:55 crc kubenswrapper[4886]: I0314 09:09:55.598956 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="362f8a42-8f4c-4fea-89ac-135ec83a93a3" containerName="registry-server" Mar 14 09:09:55 crc kubenswrapper[4886]: I0314 09:09:55.599001 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="93af3bcb-09d4-4b5e-9347-68dcfc6cff6c" containerName="oc" Mar 14 09:09:55 crc kubenswrapper[4886]: I0314 09:09:55.602730 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lcn6x" Mar 14 09:09:55 crc kubenswrapper[4886]: I0314 09:09:55.626313 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lcn6x"] Mar 14 09:09:55 crc kubenswrapper[4886]: I0314 09:09:55.695312 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2e69b15-dd88-46fd-b2d6-042ad81c9862-utilities\") pod \"community-operators-lcn6x\" (UID: \"c2e69b15-dd88-46fd-b2d6-042ad81c9862\") " pod="openshift-marketplace/community-operators-lcn6x" Mar 14 09:09:55 crc kubenswrapper[4886]: I0314 09:09:55.695707 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w2cq\" (UniqueName: \"kubernetes.io/projected/c2e69b15-dd88-46fd-b2d6-042ad81c9862-kube-api-access-6w2cq\") pod \"community-operators-lcn6x\" (UID: \"c2e69b15-dd88-46fd-b2d6-042ad81c9862\") " pod="openshift-marketplace/community-operators-lcn6x" Mar 14 09:09:55 crc kubenswrapper[4886]: I0314 09:09:55.695879 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2e69b15-dd88-46fd-b2d6-042ad81c9862-catalog-content\") pod \"community-operators-lcn6x\" (UID: \"c2e69b15-dd88-46fd-b2d6-042ad81c9862\") " pod="openshift-marketplace/community-operators-lcn6x" Mar 14 09:09:55 crc kubenswrapper[4886]: I0314 09:09:55.798259 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2e69b15-dd88-46fd-b2d6-042ad81c9862-utilities\") pod \"community-operators-lcn6x\" (UID: \"c2e69b15-dd88-46fd-b2d6-042ad81c9862\") " pod="openshift-marketplace/community-operators-lcn6x" Mar 14 09:09:55 crc kubenswrapper[4886]: I0314 09:09:55.798360 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6w2cq\" (UniqueName: \"kubernetes.io/projected/c2e69b15-dd88-46fd-b2d6-042ad81c9862-kube-api-access-6w2cq\") pod \"community-operators-lcn6x\" (UID: \"c2e69b15-dd88-46fd-b2d6-042ad81c9862\") " pod="openshift-marketplace/community-operators-lcn6x" Mar 14 09:09:55 crc kubenswrapper[4886]: I0314 09:09:55.798390 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2e69b15-dd88-46fd-b2d6-042ad81c9862-catalog-content\") pod \"community-operators-lcn6x\" (UID: \"c2e69b15-dd88-46fd-b2d6-042ad81c9862\") " pod="openshift-marketplace/community-operators-lcn6x" Mar 14 09:09:55 crc kubenswrapper[4886]: I0314 09:09:55.798993 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2e69b15-dd88-46fd-b2d6-042ad81c9862-utilities\") pod \"community-operators-lcn6x\" (UID: \"c2e69b15-dd88-46fd-b2d6-042ad81c9862\") " pod="openshift-marketplace/community-operators-lcn6x" Mar 14 09:09:55 crc kubenswrapper[4886]: I0314 09:09:55.799103 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2e69b15-dd88-46fd-b2d6-042ad81c9862-catalog-content\") pod \"community-operators-lcn6x\" (UID: \"c2e69b15-dd88-46fd-b2d6-042ad81c9862\") " pod="openshift-marketplace/community-operators-lcn6x" Mar 14 09:09:55 crc kubenswrapper[4886]: I0314 09:09:55.819738 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w2cq\" (UniqueName: \"kubernetes.io/projected/c2e69b15-dd88-46fd-b2d6-042ad81c9862-kube-api-access-6w2cq\") pod \"community-operators-lcn6x\" (UID: \"c2e69b15-dd88-46fd-b2d6-042ad81c9862\") " pod="openshift-marketplace/community-operators-lcn6x" Mar 14 09:09:55 crc kubenswrapper[4886]: I0314 09:09:55.948468 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lcn6x" Mar 14 09:09:56 crc kubenswrapper[4886]: I0314 09:09:56.422016 4886 scope.go:117] "RemoveContainer" containerID="c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551" Mar 14 09:09:56 crc kubenswrapper[4886]: E0314 09:09:56.422908 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:09:56 crc kubenswrapper[4886]: I0314 09:09:56.541190 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lcn6x"] Mar 14 09:09:56 crc kubenswrapper[4886]: W0314 09:09:56.553511 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2e69b15_dd88_46fd_b2d6_042ad81c9862.slice/crio-4d2ebbdce0ae6c5bae7981d05066ebb51c575634903486982f66eb8f92d12d9c WatchSource:0}: Error finding container 4d2ebbdce0ae6c5bae7981d05066ebb51c575634903486982f66eb8f92d12d9c: Status 404 returned error can't find the container with id 4d2ebbdce0ae6c5bae7981d05066ebb51c575634903486982f66eb8f92d12d9c Mar 14 09:09:56 crc kubenswrapper[4886]: I0314 09:09:56.711331 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lcn6x" event={"ID":"c2e69b15-dd88-46fd-b2d6-042ad81c9862","Type":"ContainerStarted","Data":"4d2ebbdce0ae6c5bae7981d05066ebb51c575634903486982f66eb8f92d12d9c"} Mar 14 09:09:57 crc kubenswrapper[4886]: I0314 09:09:57.725334 4886 generic.go:334] "Generic (PLEG): container finished" podID="c2e69b15-dd88-46fd-b2d6-042ad81c9862" 
containerID="a9068f2643e3a61bf1187820869e07ba8aa6b85514a99d3e7a43f9675554e0ed" exitCode=0 Mar 14 09:09:57 crc kubenswrapper[4886]: I0314 09:09:57.725445 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lcn6x" event={"ID":"c2e69b15-dd88-46fd-b2d6-042ad81c9862","Type":"ContainerDied","Data":"a9068f2643e3a61bf1187820869e07ba8aa6b85514a99d3e7a43f9675554e0ed"} Mar 14 09:09:57 crc kubenswrapper[4886]: I0314 09:09:57.728979 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 09:09:58 crc kubenswrapper[4886]: I0314 09:09:58.737964 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lcn6x" event={"ID":"c2e69b15-dd88-46fd-b2d6-042ad81c9862","Type":"ContainerStarted","Data":"bb5c8d0395b9dde20e1bc417c5bf6c3b3ce16f78aeb531ac8b1caaba30727caf"} Mar 14 09:10:00 crc kubenswrapper[4886]: I0314 09:10:00.143501 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557990-vd8jl"] Mar 14 09:10:00 crc kubenswrapper[4886]: I0314 09:10:00.145355 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557990-vd8jl" Mar 14 09:10:00 crc kubenswrapper[4886]: I0314 09:10:00.149227 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:10:00 crc kubenswrapper[4886]: I0314 09:10:00.149923 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:10:00 crc kubenswrapper[4886]: I0314 09:10:00.150198 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 09:10:00 crc kubenswrapper[4886]: I0314 09:10:00.153358 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557990-vd8jl"] Mar 14 09:10:00 crc kubenswrapper[4886]: I0314 09:10:00.196092 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nhxs\" (UniqueName: \"kubernetes.io/projected/ae09f2e7-cd2e-4c5e-80dc-350251531ac0-kube-api-access-8nhxs\") pod \"auto-csr-approver-29557990-vd8jl\" (UID: \"ae09f2e7-cd2e-4c5e-80dc-350251531ac0\") " pod="openshift-infra/auto-csr-approver-29557990-vd8jl" Mar 14 09:10:00 crc kubenswrapper[4886]: I0314 09:10:00.298680 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nhxs\" (UniqueName: \"kubernetes.io/projected/ae09f2e7-cd2e-4c5e-80dc-350251531ac0-kube-api-access-8nhxs\") pod \"auto-csr-approver-29557990-vd8jl\" (UID: \"ae09f2e7-cd2e-4c5e-80dc-350251531ac0\") " pod="openshift-infra/auto-csr-approver-29557990-vd8jl" Mar 14 09:10:00 crc kubenswrapper[4886]: I0314 09:10:00.317347 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nhxs\" (UniqueName: \"kubernetes.io/projected/ae09f2e7-cd2e-4c5e-80dc-350251531ac0-kube-api-access-8nhxs\") pod \"auto-csr-approver-29557990-vd8jl\" (UID: \"ae09f2e7-cd2e-4c5e-80dc-350251531ac0\") " 
pod="openshift-infra/auto-csr-approver-29557990-vd8jl" Mar 14 09:10:00 crc kubenswrapper[4886]: I0314 09:10:00.468944 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557990-vd8jl" Mar 14 09:10:00 crc kubenswrapper[4886]: I0314 09:10:00.770510 4886 generic.go:334] "Generic (PLEG): container finished" podID="c2e69b15-dd88-46fd-b2d6-042ad81c9862" containerID="bb5c8d0395b9dde20e1bc417c5bf6c3b3ce16f78aeb531ac8b1caaba30727caf" exitCode=0 Mar 14 09:10:00 crc kubenswrapper[4886]: I0314 09:10:00.770852 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lcn6x" event={"ID":"c2e69b15-dd88-46fd-b2d6-042ad81c9862","Type":"ContainerDied","Data":"bb5c8d0395b9dde20e1bc417c5bf6c3b3ce16f78aeb531ac8b1caaba30727caf"} Mar 14 09:10:00 crc kubenswrapper[4886]: I0314 09:10:00.986927 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557990-vd8jl"] Mar 14 09:10:00 crc kubenswrapper[4886]: W0314 09:10:00.992285 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae09f2e7_cd2e_4c5e_80dc_350251531ac0.slice/crio-4f1c7ba5da5c5aba53c1fb0056c32f599e5877f01f79138185aa96105b73e5ad WatchSource:0}: Error finding container 4f1c7ba5da5c5aba53c1fb0056c32f599e5877f01f79138185aa96105b73e5ad: Status 404 returned error can't find the container with id 4f1c7ba5da5c5aba53c1fb0056c32f599e5877f01f79138185aa96105b73e5ad Mar 14 09:10:01 crc kubenswrapper[4886]: I0314 09:10:01.781540 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lcn6x" event={"ID":"c2e69b15-dd88-46fd-b2d6-042ad81c9862","Type":"ContainerStarted","Data":"f567f7261153819e145655e583842b907b539a3691b53281314870e1a11d82c8"} Mar 14 09:10:01 crc kubenswrapper[4886]: I0314 09:10:01.783579 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29557990-vd8jl" event={"ID":"ae09f2e7-cd2e-4c5e-80dc-350251531ac0","Type":"ContainerStarted","Data":"4f1c7ba5da5c5aba53c1fb0056c32f599e5877f01f79138185aa96105b73e5ad"} Mar 14 09:10:01 crc kubenswrapper[4886]: I0314 09:10:01.806474 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lcn6x" podStartSLOduration=3.25079063 podStartE2EDuration="6.806449547s" podCreationTimestamp="2026-03-14 09:09:55 +0000 UTC" firstStartedPulling="2026-03-14 09:09:57.728631855 +0000 UTC m=+2532.977083502" lastFinishedPulling="2026-03-14 09:10:01.284290762 +0000 UTC m=+2536.532742419" observedRunningTime="2026-03-14 09:10:01.800104006 +0000 UTC m=+2537.048555653" watchObservedRunningTime="2026-03-14 09:10:01.806449547 +0000 UTC m=+2537.054901184" Mar 14 09:10:02 crc kubenswrapper[4886]: I0314 09:10:02.797043 4886 generic.go:334] "Generic (PLEG): container finished" podID="ae09f2e7-cd2e-4c5e-80dc-350251531ac0" containerID="0f0fb71d276bd706afd34ce15c971645e3d2c6b27e18289bad791da7d3187ad1" exitCode=0 Mar 14 09:10:02 crc kubenswrapper[4886]: I0314 09:10:02.797257 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557990-vd8jl" event={"ID":"ae09f2e7-cd2e-4c5e-80dc-350251531ac0","Type":"ContainerDied","Data":"0f0fb71d276bd706afd34ce15c971645e3d2c6b27e18289bad791da7d3187ad1"} Mar 14 09:10:04 crc kubenswrapper[4886]: I0314 09:10:04.252525 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557990-vd8jl" Mar 14 09:10:04 crc kubenswrapper[4886]: I0314 09:10:04.344961 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nhxs\" (UniqueName: \"kubernetes.io/projected/ae09f2e7-cd2e-4c5e-80dc-350251531ac0-kube-api-access-8nhxs\") pod \"ae09f2e7-cd2e-4c5e-80dc-350251531ac0\" (UID: \"ae09f2e7-cd2e-4c5e-80dc-350251531ac0\") " Mar 14 09:10:04 crc kubenswrapper[4886]: I0314 09:10:04.351344 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae09f2e7-cd2e-4c5e-80dc-350251531ac0-kube-api-access-8nhxs" (OuterVolumeSpecName: "kube-api-access-8nhxs") pod "ae09f2e7-cd2e-4c5e-80dc-350251531ac0" (UID: "ae09f2e7-cd2e-4c5e-80dc-350251531ac0"). InnerVolumeSpecName "kube-api-access-8nhxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:10:04 crc kubenswrapper[4886]: I0314 09:10:04.447051 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nhxs\" (UniqueName: \"kubernetes.io/projected/ae09f2e7-cd2e-4c5e-80dc-350251531ac0-kube-api-access-8nhxs\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:04 crc kubenswrapper[4886]: I0314 09:10:04.821401 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557990-vd8jl" event={"ID":"ae09f2e7-cd2e-4c5e-80dc-350251531ac0","Type":"ContainerDied","Data":"4f1c7ba5da5c5aba53c1fb0056c32f599e5877f01f79138185aa96105b73e5ad"} Mar 14 09:10:04 crc kubenswrapper[4886]: I0314 09:10:04.821446 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f1c7ba5da5c5aba53c1fb0056c32f599e5877f01f79138185aa96105b73e5ad" Mar 14 09:10:04 crc kubenswrapper[4886]: I0314 09:10:04.821528 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557990-vd8jl" Mar 14 09:10:05 crc kubenswrapper[4886]: I0314 09:10:05.350504 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557984-crnpd"] Mar 14 09:10:05 crc kubenswrapper[4886]: I0314 09:10:05.365073 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557984-crnpd"] Mar 14 09:10:05 crc kubenswrapper[4886]: I0314 09:10:05.439762 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c36cb28-29b5-4c22-8021-6bc4148c11a8" path="/var/lib/kubelet/pods/9c36cb28-29b5-4c22-8021-6bc4148c11a8/volumes" Mar 14 09:10:05 crc kubenswrapper[4886]: I0314 09:10:05.949505 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lcn6x" Mar 14 09:10:05 crc kubenswrapper[4886]: I0314 09:10:05.949574 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lcn6x" Mar 14 09:10:06 crc kubenswrapper[4886]: I0314 09:10:06.012663 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lcn6x" Mar 14 09:10:07 crc kubenswrapper[4886]: I0314 09:10:07.004030 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lcn6x" Mar 14 09:10:07 crc kubenswrapper[4886]: I0314 09:10:07.101247 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lcn6x"] Mar 14 09:10:08 crc kubenswrapper[4886]: I0314 09:10:08.421200 4886 scope.go:117] "RemoveContainer" containerID="c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551" Mar 14 09:10:08 crc kubenswrapper[4886]: E0314 09:10:08.422325 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:10:08 crc kubenswrapper[4886]: I0314 09:10:08.892095 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lcn6x" podUID="c2e69b15-dd88-46fd-b2d6-042ad81c9862" containerName="registry-server" containerID="cri-o://f567f7261153819e145655e583842b907b539a3691b53281314870e1a11d82c8" gracePeriod=2 Mar 14 09:10:09 crc kubenswrapper[4886]: I0314 09:10:09.493196 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lcn6x" Mar 14 09:10:09 crc kubenswrapper[4886]: I0314 09:10:09.673415 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6w2cq\" (UniqueName: \"kubernetes.io/projected/c2e69b15-dd88-46fd-b2d6-042ad81c9862-kube-api-access-6w2cq\") pod \"c2e69b15-dd88-46fd-b2d6-042ad81c9862\" (UID: \"c2e69b15-dd88-46fd-b2d6-042ad81c9862\") " Mar 14 09:10:09 crc kubenswrapper[4886]: I0314 09:10:09.673588 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2e69b15-dd88-46fd-b2d6-042ad81c9862-utilities\") pod \"c2e69b15-dd88-46fd-b2d6-042ad81c9862\" (UID: \"c2e69b15-dd88-46fd-b2d6-042ad81c9862\") " Mar 14 09:10:09 crc kubenswrapper[4886]: I0314 09:10:09.673639 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2e69b15-dd88-46fd-b2d6-042ad81c9862-catalog-content\") pod \"c2e69b15-dd88-46fd-b2d6-042ad81c9862\" (UID: \"c2e69b15-dd88-46fd-b2d6-042ad81c9862\") " Mar 14 09:10:09 crc kubenswrapper[4886]: I0314 09:10:09.674743 4886 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2e69b15-dd88-46fd-b2d6-042ad81c9862-utilities" (OuterVolumeSpecName: "utilities") pod "c2e69b15-dd88-46fd-b2d6-042ad81c9862" (UID: "c2e69b15-dd88-46fd-b2d6-042ad81c9862"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:10:09 crc kubenswrapper[4886]: I0314 09:10:09.684372 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2e69b15-dd88-46fd-b2d6-042ad81c9862-kube-api-access-6w2cq" (OuterVolumeSpecName: "kube-api-access-6w2cq") pod "c2e69b15-dd88-46fd-b2d6-042ad81c9862" (UID: "c2e69b15-dd88-46fd-b2d6-042ad81c9862"). InnerVolumeSpecName "kube-api-access-6w2cq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:10:09 crc kubenswrapper[4886]: I0314 09:10:09.776515 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6w2cq\" (UniqueName: \"kubernetes.io/projected/c2e69b15-dd88-46fd-b2d6-042ad81c9862-kube-api-access-6w2cq\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:09 crc kubenswrapper[4886]: I0314 09:10:09.776548 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2e69b15-dd88-46fd-b2d6-042ad81c9862-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:09 crc kubenswrapper[4886]: I0314 09:10:09.836637 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2e69b15-dd88-46fd-b2d6-042ad81c9862-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2e69b15-dd88-46fd-b2d6-042ad81c9862" (UID: "c2e69b15-dd88-46fd-b2d6-042ad81c9862"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:10:09 crc kubenswrapper[4886]: I0314 09:10:09.879157 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2e69b15-dd88-46fd-b2d6-042ad81c9862-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:10:09 crc kubenswrapper[4886]: I0314 09:10:09.905359 4886 generic.go:334] "Generic (PLEG): container finished" podID="c2e69b15-dd88-46fd-b2d6-042ad81c9862" containerID="f567f7261153819e145655e583842b907b539a3691b53281314870e1a11d82c8" exitCode=0 Mar 14 09:10:09 crc kubenswrapper[4886]: I0314 09:10:09.905410 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lcn6x" Mar 14 09:10:09 crc kubenswrapper[4886]: I0314 09:10:09.905430 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lcn6x" event={"ID":"c2e69b15-dd88-46fd-b2d6-042ad81c9862","Type":"ContainerDied","Data":"f567f7261153819e145655e583842b907b539a3691b53281314870e1a11d82c8"} Mar 14 09:10:09 crc kubenswrapper[4886]: I0314 09:10:09.905668 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lcn6x" event={"ID":"c2e69b15-dd88-46fd-b2d6-042ad81c9862","Type":"ContainerDied","Data":"4d2ebbdce0ae6c5bae7981d05066ebb51c575634903486982f66eb8f92d12d9c"} Mar 14 09:10:09 crc kubenswrapper[4886]: I0314 09:10:09.905699 4886 scope.go:117] "RemoveContainer" containerID="f567f7261153819e145655e583842b907b539a3691b53281314870e1a11d82c8" Mar 14 09:10:09 crc kubenswrapper[4886]: I0314 09:10:09.932934 4886 scope.go:117] "RemoveContainer" containerID="bb5c8d0395b9dde20e1bc417c5bf6c3b3ce16f78aeb531ac8b1caaba30727caf" Mar 14 09:10:09 crc kubenswrapper[4886]: I0314 09:10:09.950186 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lcn6x"] Mar 14 09:10:09 crc kubenswrapper[4886]: 
I0314 09:10:09.957290 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lcn6x"] Mar 14 09:10:09 crc kubenswrapper[4886]: I0314 09:10:09.988042 4886 scope.go:117] "RemoveContainer" containerID="a9068f2643e3a61bf1187820869e07ba8aa6b85514a99d3e7a43f9675554e0ed" Mar 14 09:10:10 crc kubenswrapper[4886]: I0314 09:10:10.016356 4886 scope.go:117] "RemoveContainer" containerID="f567f7261153819e145655e583842b907b539a3691b53281314870e1a11d82c8" Mar 14 09:10:10 crc kubenswrapper[4886]: E0314 09:10:10.017300 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f567f7261153819e145655e583842b907b539a3691b53281314870e1a11d82c8\": container with ID starting with f567f7261153819e145655e583842b907b539a3691b53281314870e1a11d82c8 not found: ID does not exist" containerID="f567f7261153819e145655e583842b907b539a3691b53281314870e1a11d82c8" Mar 14 09:10:10 crc kubenswrapper[4886]: I0314 09:10:10.017351 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f567f7261153819e145655e583842b907b539a3691b53281314870e1a11d82c8"} err="failed to get container status \"f567f7261153819e145655e583842b907b539a3691b53281314870e1a11d82c8\": rpc error: code = NotFound desc = could not find container \"f567f7261153819e145655e583842b907b539a3691b53281314870e1a11d82c8\": container with ID starting with f567f7261153819e145655e583842b907b539a3691b53281314870e1a11d82c8 not found: ID does not exist" Mar 14 09:10:10 crc kubenswrapper[4886]: I0314 09:10:10.017379 4886 scope.go:117] "RemoveContainer" containerID="bb5c8d0395b9dde20e1bc417c5bf6c3b3ce16f78aeb531ac8b1caaba30727caf" Mar 14 09:10:10 crc kubenswrapper[4886]: E0314 09:10:10.017904 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb5c8d0395b9dde20e1bc417c5bf6c3b3ce16f78aeb531ac8b1caaba30727caf\": container 
with ID starting with bb5c8d0395b9dde20e1bc417c5bf6c3b3ce16f78aeb531ac8b1caaba30727caf not found: ID does not exist" containerID="bb5c8d0395b9dde20e1bc417c5bf6c3b3ce16f78aeb531ac8b1caaba30727caf" Mar 14 09:10:10 crc kubenswrapper[4886]: I0314 09:10:10.018004 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb5c8d0395b9dde20e1bc417c5bf6c3b3ce16f78aeb531ac8b1caaba30727caf"} err="failed to get container status \"bb5c8d0395b9dde20e1bc417c5bf6c3b3ce16f78aeb531ac8b1caaba30727caf\": rpc error: code = NotFound desc = could not find container \"bb5c8d0395b9dde20e1bc417c5bf6c3b3ce16f78aeb531ac8b1caaba30727caf\": container with ID starting with bb5c8d0395b9dde20e1bc417c5bf6c3b3ce16f78aeb531ac8b1caaba30727caf not found: ID does not exist" Mar 14 09:10:10 crc kubenswrapper[4886]: I0314 09:10:10.018101 4886 scope.go:117] "RemoveContainer" containerID="a9068f2643e3a61bf1187820869e07ba8aa6b85514a99d3e7a43f9675554e0ed" Mar 14 09:10:10 crc kubenswrapper[4886]: E0314 09:10:10.018619 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9068f2643e3a61bf1187820869e07ba8aa6b85514a99d3e7a43f9675554e0ed\": container with ID starting with a9068f2643e3a61bf1187820869e07ba8aa6b85514a99d3e7a43f9675554e0ed not found: ID does not exist" containerID="a9068f2643e3a61bf1187820869e07ba8aa6b85514a99d3e7a43f9675554e0ed" Mar 14 09:10:10 crc kubenswrapper[4886]: I0314 09:10:10.018722 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9068f2643e3a61bf1187820869e07ba8aa6b85514a99d3e7a43f9675554e0ed"} err="failed to get container status \"a9068f2643e3a61bf1187820869e07ba8aa6b85514a99d3e7a43f9675554e0ed\": rpc error: code = NotFound desc = could not find container \"a9068f2643e3a61bf1187820869e07ba8aa6b85514a99d3e7a43f9675554e0ed\": container with ID starting with a9068f2643e3a61bf1187820869e07ba8aa6b85514a99d3e7a43f9675554e0ed not 
found: ID does not exist" Mar 14 09:10:11 crc kubenswrapper[4886]: I0314 09:10:11.442747 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2e69b15-dd88-46fd-b2d6-042ad81c9862" path="/var/lib/kubelet/pods/c2e69b15-dd88-46fd-b2d6-042ad81c9862/volumes" Mar 14 09:10:23 crc kubenswrapper[4886]: I0314 09:10:23.421149 4886 scope.go:117] "RemoveContainer" containerID="c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551" Mar 14 09:10:23 crc kubenswrapper[4886]: E0314 09:10:23.421965 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:10:34 crc kubenswrapper[4886]: I0314 09:10:34.996540 4886 scope.go:117] "RemoveContainer" containerID="a6f0e5ba8d45d865cf87859862dfbece3b5bbbf921378f3c69d199164f500103" Mar 14 09:10:35 crc kubenswrapper[4886]: I0314 09:10:35.433823 4886 scope.go:117] "RemoveContainer" containerID="c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551" Mar 14 09:10:36 crc kubenswrapper[4886]: I0314 09:10:36.219646 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerStarted","Data":"a2809cdb5d4b2bdb4e85c97861df551fd5362e454f85257e7b9dae7d20dccedf"} Mar 14 09:11:00 crc kubenswrapper[4886]: I0314 09:11:00.486144 4886 generic.go:334] "Generic (PLEG): container finished" podID="abe350ea-5335-4b70-8de1-b33c2c17c876" containerID="2d53af764524ada102029383885df2102915df91d028a2f95dc13b3b313c93c2" exitCode=0 Mar 14 09:11:00 crc kubenswrapper[4886]: I0314 09:11:00.486336 4886 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn426" event={"ID":"abe350ea-5335-4b70-8de1-b33c2c17c876","Type":"ContainerDied","Data":"2d53af764524ada102029383885df2102915df91d028a2f95dc13b3b313c93c2"} Mar 14 09:11:01 crc kubenswrapper[4886]: I0314 09:11:01.933641 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn426" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.040284 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/abe350ea-5335-4b70-8de1-b33c2c17c876-libvirt-secret-0\") pod \"abe350ea-5335-4b70-8de1-b33c2c17c876\" (UID: \"abe350ea-5335-4b70-8de1-b33c2c17c876\") " Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.040348 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zvqw\" (UniqueName: \"kubernetes.io/projected/abe350ea-5335-4b70-8de1-b33c2c17c876-kube-api-access-2zvqw\") pod \"abe350ea-5335-4b70-8de1-b33c2c17c876\" (UID: \"abe350ea-5335-4b70-8de1-b33c2c17c876\") " Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.040479 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe350ea-5335-4b70-8de1-b33c2c17c876-libvirt-combined-ca-bundle\") pod \"abe350ea-5335-4b70-8de1-b33c2c17c876\" (UID: \"abe350ea-5335-4b70-8de1-b33c2c17c876\") " Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.040571 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abe350ea-5335-4b70-8de1-b33c2c17c876-inventory\") pod \"abe350ea-5335-4b70-8de1-b33c2c17c876\" (UID: \"abe350ea-5335-4b70-8de1-b33c2c17c876\") " Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.040677 4886 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abe350ea-5335-4b70-8de1-b33c2c17c876-ssh-key-openstack-edpm-ipam\") pod \"abe350ea-5335-4b70-8de1-b33c2c17c876\" (UID: \"abe350ea-5335-4b70-8de1-b33c2c17c876\") " Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.047159 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abe350ea-5335-4b70-8de1-b33c2c17c876-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "abe350ea-5335-4b70-8de1-b33c2c17c876" (UID: "abe350ea-5335-4b70-8de1-b33c2c17c876"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.047942 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abe350ea-5335-4b70-8de1-b33c2c17c876-kube-api-access-2zvqw" (OuterVolumeSpecName: "kube-api-access-2zvqw") pod "abe350ea-5335-4b70-8de1-b33c2c17c876" (UID: "abe350ea-5335-4b70-8de1-b33c2c17c876"). InnerVolumeSpecName "kube-api-access-2zvqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.072084 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abe350ea-5335-4b70-8de1-b33c2c17c876-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "abe350ea-5335-4b70-8de1-b33c2c17c876" (UID: "abe350ea-5335-4b70-8de1-b33c2c17c876"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.072290 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abe350ea-5335-4b70-8de1-b33c2c17c876-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "abe350ea-5335-4b70-8de1-b33c2c17c876" (UID: "abe350ea-5335-4b70-8de1-b33c2c17c876"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.079307 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abe350ea-5335-4b70-8de1-b33c2c17c876-inventory" (OuterVolumeSpecName: "inventory") pod "abe350ea-5335-4b70-8de1-b33c2c17c876" (UID: "abe350ea-5335-4b70-8de1-b33c2c17c876"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.142541 4886 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/abe350ea-5335-4b70-8de1-b33c2c17c876-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.142584 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zvqw\" (UniqueName: \"kubernetes.io/projected/abe350ea-5335-4b70-8de1-b33c2c17c876-kube-api-access-2zvqw\") on node \"crc\" DevicePath \"\"" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.142601 4886 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe350ea-5335-4b70-8de1-b33c2c17c876-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.142614 4886 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abe350ea-5335-4b70-8de1-b33c2c17c876-inventory\") on node \"crc\" 
DevicePath \"\"" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.142626 4886 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abe350ea-5335-4b70-8de1-b33c2c17c876-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.515023 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn426" event={"ID":"abe350ea-5335-4b70-8de1-b33c2c17c876","Type":"ContainerDied","Data":"e963bba4c62791a1aeb61e90a1728af97379a1980749d58b4010d9c2ceaed96d"} Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.515407 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e963bba4c62791a1aeb61e90a1728af97379a1980749d58b4010d9c2ceaed96d" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.515082 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn426" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.632718 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2"] Mar 14 09:11:02 crc kubenswrapper[4886]: E0314 09:11:02.633245 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2e69b15-dd88-46fd-b2d6-042ad81c9862" containerName="extract-content" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.633268 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2e69b15-dd88-46fd-b2d6-042ad81c9862" containerName="extract-content" Mar 14 09:11:02 crc kubenswrapper[4886]: E0314 09:11:02.633296 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae09f2e7-cd2e-4c5e-80dc-350251531ac0" containerName="oc" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.633306 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae09f2e7-cd2e-4c5e-80dc-350251531ac0" 
containerName="oc" Mar 14 09:11:02 crc kubenswrapper[4886]: E0314 09:11:02.633321 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2e69b15-dd88-46fd-b2d6-042ad81c9862" containerName="extract-utilities" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.633330 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2e69b15-dd88-46fd-b2d6-042ad81c9862" containerName="extract-utilities" Mar 14 09:11:02 crc kubenswrapper[4886]: E0314 09:11:02.633350 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2e69b15-dd88-46fd-b2d6-042ad81c9862" containerName="registry-server" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.633358 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2e69b15-dd88-46fd-b2d6-042ad81c9862" containerName="registry-server" Mar 14 09:11:02 crc kubenswrapper[4886]: E0314 09:11:02.633397 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abe350ea-5335-4b70-8de1-b33c2c17c876" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.633407 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="abe350ea-5335-4b70-8de1-b33c2c17c876" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.633673 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="abe350ea-5335-4b70-8de1-b33c2c17c876" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.633704 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2e69b15-dd88-46fd-b2d6-042ad81c9862" containerName="registry-server" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.633721 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae09f2e7-cd2e-4c5e-80dc-350251531ac0" containerName="oc" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.634619 4886 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.637774 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.638090 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.639959 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftkvj" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.640080 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.639960 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.640274 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.640479 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.656398 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2"] Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.658179 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc 
kubenswrapper[4886]: I0314 09:11:02.658304 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.658413 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.658506 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2wl9\" (UniqueName: \"kubernetes.io/projected/a2fc60c8-e486-4550-8b18-92a57ff62194-kube-api-access-w2wl9\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.658542 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.658565 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.658691 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.658740 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.658780 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.658825 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-cell1-compute-config-1\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.658844 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.761441 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.761555 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.761600 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.761661 4886 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.761694 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.761751 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.761798 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.761834 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-extra-config-0\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.761882 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2wl9\" (UniqueName: \"kubernetes.io/projected/a2fc60c8-e486-4550-8b18-92a57ff62194-kube-api-access-w2wl9\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.761908 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.761929 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.763658 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.767547 4886 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.767707 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.768022 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.768164 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.768478 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: 
\"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.769460 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.769657 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.771106 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.772001 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.785757 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-w2wl9\" (UniqueName: \"kubernetes.io/projected/a2fc60c8-e486-4550-8b18-92a57ff62194-kube-api-access-w2wl9\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dl7z2\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:02 crc kubenswrapper[4886]: I0314 09:11:02.955437 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:11:03 crc kubenswrapper[4886]: I0314 09:11:03.581252 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2"] Mar 14 09:11:04 crc kubenswrapper[4886]: I0314 09:11:04.549738 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" event={"ID":"a2fc60c8-e486-4550-8b18-92a57ff62194","Type":"ContainerStarted","Data":"79f528ee003a5ae0e78f088e2f81d010fc80619e8ba317a846e5616ce3bb7db6"} Mar 14 09:11:04 crc kubenswrapper[4886]: I0314 09:11:04.551353 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" event={"ID":"a2fc60c8-e486-4550-8b18-92a57ff62194","Type":"ContainerStarted","Data":"85c092d3eeebd11aab3542e963c8e93ac9b17f938a6a552837edfaf8cdc3eb68"} Mar 14 09:11:04 crc kubenswrapper[4886]: I0314 09:11:04.595334 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" podStartSLOduration=2.106592613 podStartE2EDuration="2.595295239s" podCreationTimestamp="2026-03-14 09:11:02 +0000 UTC" firstStartedPulling="2026-03-14 09:11:03.594706062 +0000 UTC m=+2598.843157709" lastFinishedPulling="2026-03-14 09:11:04.083408688 +0000 UTC m=+2599.331860335" observedRunningTime="2026-03-14 09:11:04.584919355 +0000 UTC m=+2599.833371012" watchObservedRunningTime="2026-03-14 09:11:04.595295239 +0000 UTC m=+2599.843746886" Mar 14 
09:12:00 crc kubenswrapper[4886]: I0314 09:12:00.149656 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557992-dqcnn"] Mar 14 09:12:00 crc kubenswrapper[4886]: I0314 09:12:00.151661 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557992-dqcnn" Mar 14 09:12:00 crc kubenswrapper[4886]: I0314 09:12:00.157655 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:12:00 crc kubenswrapper[4886]: I0314 09:12:00.157852 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:12:00 crc kubenswrapper[4886]: I0314 09:12:00.157974 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 09:12:00 crc kubenswrapper[4886]: I0314 09:12:00.176544 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557992-dqcnn"] Mar 14 09:12:00 crc kubenswrapper[4886]: I0314 09:12:00.281156 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnh4m\" (UniqueName: \"kubernetes.io/projected/e7e881b5-4abd-4b66-90e3-b364535731ab-kube-api-access-jnh4m\") pod \"auto-csr-approver-29557992-dqcnn\" (UID: \"e7e881b5-4abd-4b66-90e3-b364535731ab\") " pod="openshift-infra/auto-csr-approver-29557992-dqcnn" Mar 14 09:12:00 crc kubenswrapper[4886]: I0314 09:12:00.383630 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnh4m\" (UniqueName: \"kubernetes.io/projected/e7e881b5-4abd-4b66-90e3-b364535731ab-kube-api-access-jnh4m\") pod \"auto-csr-approver-29557992-dqcnn\" (UID: \"e7e881b5-4abd-4b66-90e3-b364535731ab\") " pod="openshift-infra/auto-csr-approver-29557992-dqcnn" Mar 14 09:12:00 crc kubenswrapper[4886]: I0314 09:12:00.408358 4886 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnh4m\" (UniqueName: \"kubernetes.io/projected/e7e881b5-4abd-4b66-90e3-b364535731ab-kube-api-access-jnh4m\") pod \"auto-csr-approver-29557992-dqcnn\" (UID: \"e7e881b5-4abd-4b66-90e3-b364535731ab\") " pod="openshift-infra/auto-csr-approver-29557992-dqcnn" Mar 14 09:12:00 crc kubenswrapper[4886]: I0314 09:12:00.480973 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557992-dqcnn" Mar 14 09:12:00 crc kubenswrapper[4886]: I0314 09:12:00.976540 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557992-dqcnn"] Mar 14 09:12:00 crc kubenswrapper[4886]: W0314 09:12:00.985750 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7e881b5_4abd_4b66_90e3_b364535731ab.slice/crio-a90789502ab8339af288177eaacac5eb87453805252d7b39bf2924865cfbde68 WatchSource:0}: Error finding container a90789502ab8339af288177eaacac5eb87453805252d7b39bf2924865cfbde68: Status 404 returned error can't find the container with id a90789502ab8339af288177eaacac5eb87453805252d7b39bf2924865cfbde68 Mar 14 09:12:01 crc kubenswrapper[4886]: I0314 09:12:01.215829 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557992-dqcnn" event={"ID":"e7e881b5-4abd-4b66-90e3-b364535731ab","Type":"ContainerStarted","Data":"a90789502ab8339af288177eaacac5eb87453805252d7b39bf2924865cfbde68"} Mar 14 09:12:03 crc kubenswrapper[4886]: I0314 09:12:03.249015 4886 generic.go:334] "Generic (PLEG): container finished" podID="e7e881b5-4abd-4b66-90e3-b364535731ab" containerID="1218556cba3388dfd7e86e0c4238c0f5a69a54b0dc921f4c9f1b19431152bfd3" exitCode=0 Mar 14 09:12:03 crc kubenswrapper[4886]: I0314 09:12:03.249288 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29557992-dqcnn" event={"ID":"e7e881b5-4abd-4b66-90e3-b364535731ab","Type":"ContainerDied","Data":"1218556cba3388dfd7e86e0c4238c0f5a69a54b0dc921f4c9f1b19431152bfd3"} Mar 14 09:12:04 crc kubenswrapper[4886]: I0314 09:12:04.655182 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557992-dqcnn" Mar 14 09:12:04 crc kubenswrapper[4886]: I0314 09:12:04.813594 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnh4m\" (UniqueName: \"kubernetes.io/projected/e7e881b5-4abd-4b66-90e3-b364535731ab-kube-api-access-jnh4m\") pod \"e7e881b5-4abd-4b66-90e3-b364535731ab\" (UID: \"e7e881b5-4abd-4b66-90e3-b364535731ab\") " Mar 14 09:12:04 crc kubenswrapper[4886]: I0314 09:12:04.822396 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e881b5-4abd-4b66-90e3-b364535731ab-kube-api-access-jnh4m" (OuterVolumeSpecName: "kube-api-access-jnh4m") pod "e7e881b5-4abd-4b66-90e3-b364535731ab" (UID: "e7e881b5-4abd-4b66-90e3-b364535731ab"). InnerVolumeSpecName "kube-api-access-jnh4m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:12:04 crc kubenswrapper[4886]: I0314 09:12:04.916634 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnh4m\" (UniqueName: \"kubernetes.io/projected/e7e881b5-4abd-4b66-90e3-b364535731ab-kube-api-access-jnh4m\") on node \"crc\" DevicePath \"\"" Mar 14 09:12:05 crc kubenswrapper[4886]: I0314 09:12:05.276240 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557992-dqcnn" event={"ID":"e7e881b5-4abd-4b66-90e3-b364535731ab","Type":"ContainerDied","Data":"a90789502ab8339af288177eaacac5eb87453805252d7b39bf2924865cfbde68"} Mar 14 09:12:05 crc kubenswrapper[4886]: I0314 09:12:05.276307 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a90789502ab8339af288177eaacac5eb87453805252d7b39bf2924865cfbde68" Mar 14 09:12:05 crc kubenswrapper[4886]: I0314 09:12:05.276325 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557992-dqcnn" Mar 14 09:12:05 crc kubenswrapper[4886]: I0314 09:12:05.768085 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557986-6l2sx"] Mar 14 09:12:05 crc kubenswrapper[4886]: I0314 09:12:05.776771 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557986-6l2sx"] Mar 14 09:12:07 crc kubenswrapper[4886]: I0314 09:12:07.434794 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f10a97b1-2c18-4685-b178-d7b8dc0495b5" path="/var/lib/kubelet/pods/f10a97b1-2c18-4685-b178-d7b8dc0495b5/volumes" Mar 14 09:12:35 crc kubenswrapper[4886]: I0314 09:12:35.134161 4886 scope.go:117] "RemoveContainer" containerID="f7b82c0b133a837909b1b6d8e7597b6aa4a99099617f6893f961daf684398b58" Mar 14 09:12:56 crc kubenswrapper[4886]: I0314 09:12:56.067016 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:12:56 crc kubenswrapper[4886]: I0314 09:12:56.067940 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:13:26 crc kubenswrapper[4886]: I0314 09:13:26.066457 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:13:26 crc kubenswrapper[4886]: I0314 09:13:26.067205 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:13:51 crc kubenswrapper[4886]: I0314 09:13:51.505632 4886 generic.go:334] "Generic (PLEG): container finished" podID="a2fc60c8-e486-4550-8b18-92a57ff62194" containerID="79f528ee003a5ae0e78f088e2f81d010fc80619e8ba317a846e5616ce3bb7db6" exitCode=0 Mar 14 09:13:51 crc kubenswrapper[4886]: I0314 09:13:51.505683 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" event={"ID":"a2fc60c8-e486-4550-8b18-92a57ff62194","Type":"ContainerDied","Data":"79f528ee003a5ae0e78f088e2f81d010fc80619e8ba317a846e5616ce3bb7db6"} Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 
09:13:53.063415 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.233912 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-cell1-compute-config-0\") pod \"a2fc60c8-e486-4550-8b18-92a57ff62194\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.234043 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2wl9\" (UniqueName: \"kubernetes.io/projected/a2fc60c8-e486-4550-8b18-92a57ff62194-kube-api-access-w2wl9\") pod \"a2fc60c8-e486-4550-8b18-92a57ff62194\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.234087 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-migration-ssh-key-1\") pod \"a2fc60c8-e486-4550-8b18-92a57ff62194\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.234150 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-combined-ca-bundle\") pod \"a2fc60c8-e486-4550-8b18-92a57ff62194\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.234211 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-cell1-compute-config-1\") pod \"a2fc60c8-e486-4550-8b18-92a57ff62194\" (UID: 
\"a2fc60c8-e486-4550-8b18-92a57ff62194\") " Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.234246 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-cell1-compute-config-2\") pod \"a2fc60c8-e486-4550-8b18-92a57ff62194\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.234278 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-cell1-compute-config-3\") pod \"a2fc60c8-e486-4550-8b18-92a57ff62194\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.234326 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-extra-config-0\") pod \"a2fc60c8-e486-4550-8b18-92a57ff62194\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.234449 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-inventory\") pod \"a2fc60c8-e486-4550-8b18-92a57ff62194\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.234497 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-ssh-key-openstack-edpm-ipam\") pod \"a2fc60c8-e486-4550-8b18-92a57ff62194\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.234532 4886 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-migration-ssh-key-0\") pod \"a2fc60c8-e486-4550-8b18-92a57ff62194\" (UID: \"a2fc60c8-e486-4550-8b18-92a57ff62194\") " Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.244830 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "a2fc60c8-e486-4550-8b18-92a57ff62194" (UID: "a2fc60c8-e486-4550-8b18-92a57ff62194"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.244878 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2fc60c8-e486-4550-8b18-92a57ff62194-kube-api-access-w2wl9" (OuterVolumeSpecName: "kube-api-access-w2wl9") pod "a2fc60c8-e486-4550-8b18-92a57ff62194" (UID: "a2fc60c8-e486-4550-8b18-92a57ff62194"). InnerVolumeSpecName "kube-api-access-w2wl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.263055 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "a2fc60c8-e486-4550-8b18-92a57ff62194" (UID: "a2fc60c8-e486-4550-8b18-92a57ff62194"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.265025 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "a2fc60c8-e486-4550-8b18-92a57ff62194" (UID: "a2fc60c8-e486-4550-8b18-92a57ff62194"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.265724 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "a2fc60c8-e486-4550-8b18-92a57ff62194" (UID: "a2fc60c8-e486-4550-8b18-92a57ff62194"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.266010 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a2fc60c8-e486-4550-8b18-92a57ff62194" (UID: "a2fc60c8-e486-4550-8b18-92a57ff62194"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.266577 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-inventory" (OuterVolumeSpecName: "inventory") pod "a2fc60c8-e486-4550-8b18-92a57ff62194" (UID: "a2fc60c8-e486-4550-8b18-92a57ff62194"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.267300 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "a2fc60c8-e486-4550-8b18-92a57ff62194" (UID: "a2fc60c8-e486-4550-8b18-92a57ff62194"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.284477 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "a2fc60c8-e486-4550-8b18-92a57ff62194" (UID: "a2fc60c8-e486-4550-8b18-92a57ff62194"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.285539 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "a2fc60c8-e486-4550-8b18-92a57ff62194" (UID: "a2fc60c8-e486-4550-8b18-92a57ff62194"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.287734 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "a2fc60c8-e486-4550-8b18-92a57ff62194" (UID: "a2fc60c8-e486-4550-8b18-92a57ff62194"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.337470 4886 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.337506 4886 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.337516 4886 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.337526 4886 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.337535 4886 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.337544 4886 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.337554 4886 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-inventory\") on 
node \"crc\" DevicePath \"\"" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.337565 4886 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.337575 4886 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.337586 4886 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a2fc60c8-e486-4550-8b18-92a57ff62194-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.337594 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2wl9\" (UniqueName: \"kubernetes.io/projected/a2fc60c8-e486-4550-8b18-92a57ff62194-kube-api-access-w2wl9\") on node \"crc\" DevicePath \"\"" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.529352 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" event={"ID":"a2fc60c8-e486-4550-8b18-92a57ff62194","Type":"ContainerDied","Data":"85c092d3eeebd11aab3542e963c8e93ac9b17f938a6a552837edfaf8cdc3eb68"} Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.529411 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85c092d3eeebd11aab3542e963c8e93ac9b17f938a6a552837edfaf8cdc3eb68" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.529426 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dl7z2" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.652375 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42"] Mar 14 09:13:53 crc kubenswrapper[4886]: E0314 09:13:53.652869 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7e881b5-4abd-4b66-90e3-b364535731ab" containerName="oc" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.652892 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7e881b5-4abd-4b66-90e3-b364535731ab" containerName="oc" Mar 14 09:13:53 crc kubenswrapper[4886]: E0314 09:13:53.652928 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2fc60c8-e486-4550-8b18-92a57ff62194" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.652935 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2fc60c8-e486-4550-8b18-92a57ff62194" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.653181 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2fc60c8-e486-4550-8b18-92a57ff62194" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.653206 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7e881b5-4abd-4b66-90e3-b364535731ab" containerName="oc" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.654031 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.656362 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.656482 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ftkvj" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.657332 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.657698 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.662237 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.663143 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42"] Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.849722 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8bd42\" (UID: \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.849957 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-ssh-key-openstack-edpm-ipam\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-8bd42\" (UID: \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.850043 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjttm\" (UniqueName: \"kubernetes.io/projected/e115c62e-eb7f-41a9-a613-7523bcfc2e90-kube-api-access-sjttm\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8bd42\" (UID: \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.850217 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8bd42\" (UID: \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.850287 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8bd42\" (UID: \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.850742 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8bd42\" (UID: \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.851008 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8bd42\" (UID: \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.952618 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8bd42\" (UID: \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.952694 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8bd42\" (UID: \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.952715 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjttm\" (UniqueName: \"kubernetes.io/projected/e115c62e-eb7f-41a9-a613-7523bcfc2e90-kube-api-access-sjttm\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8bd42\" (UID: \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.952755 4886 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8bd42\" (UID: \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.952785 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8bd42\" (UID: \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.952852 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8bd42\" (UID: \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.952914 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8bd42\" (UID: \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.958024 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8bd42\" (UID: \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.958074 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8bd42\" (UID: \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.958432 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8bd42\" (UID: \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.958915 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8bd42\" (UID: \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.959076 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8bd42\" (UID: 
\"e115c62e-eb7f-41a9-a613-7523bcfc2e90\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.970438 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8bd42\" (UID: \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" Mar 14 09:13:53 crc kubenswrapper[4886]: I0314 09:13:53.971194 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjttm\" (UniqueName: \"kubernetes.io/projected/e115c62e-eb7f-41a9-a613-7523bcfc2e90-kube-api-access-sjttm\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8bd42\" (UID: \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" Mar 14 09:13:54 crc kubenswrapper[4886]: I0314 09:13:54.270909 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" Mar 14 09:13:54 crc kubenswrapper[4886]: I0314 09:13:54.858463 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42"] Mar 14 09:13:55 crc kubenswrapper[4886]: I0314 09:13:55.553954 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" event={"ID":"e115c62e-eb7f-41a9-a613-7523bcfc2e90","Type":"ContainerStarted","Data":"32e66d3c718cff8f6c369b8840e9842a10b202aa10c6bb2af382b92296bbf627"} Mar 14 09:13:56 crc kubenswrapper[4886]: I0314 09:13:56.066333 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:13:56 crc kubenswrapper[4886]: I0314 09:13:56.066401 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:13:56 crc kubenswrapper[4886]: I0314 09:13:56.066454 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 09:13:56 crc kubenswrapper[4886]: I0314 09:13:56.067708 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a2809cdb5d4b2bdb4e85c97861df551fd5362e454f85257e7b9dae7d20dccedf"} pod="openshift-machine-config-operator/machine-config-daemon-ddctv" containerMessage="Container machine-config-daemon failed liveness probe, will be 
restarted" Mar 14 09:13:56 crc kubenswrapper[4886]: I0314 09:13:56.067782 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" containerID="cri-o://a2809cdb5d4b2bdb4e85c97861df551fd5362e454f85257e7b9dae7d20dccedf" gracePeriod=600 Mar 14 09:13:56 crc kubenswrapper[4886]: I0314 09:13:56.566828 4886 generic.go:334] "Generic (PLEG): container finished" podID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerID="a2809cdb5d4b2bdb4e85c97861df551fd5362e454f85257e7b9dae7d20dccedf" exitCode=0 Mar 14 09:13:56 crc kubenswrapper[4886]: I0314 09:13:56.567194 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerDied","Data":"a2809cdb5d4b2bdb4e85c97861df551fd5362e454f85257e7b9dae7d20dccedf"} Mar 14 09:13:56 crc kubenswrapper[4886]: I0314 09:13:56.567223 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerStarted","Data":"5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba"} Mar 14 09:13:56 crc kubenswrapper[4886]: I0314 09:13:56.567244 4886 scope.go:117] "RemoveContainer" containerID="c22ab4b0f87f0d3f3d0e8d837e1a3d70ddf1970025af3980c4b9e103255ed551" Mar 14 09:13:56 crc kubenswrapper[4886]: I0314 09:13:56.570306 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" event={"ID":"e115c62e-eb7f-41a9-a613-7523bcfc2e90","Type":"ContainerStarted","Data":"bec9f57d21b19737cebce9b53d9f3d968de6cd4e5f688e5d1d4a3ba3f531bb95"} Mar 14 09:13:56 crc kubenswrapper[4886]: I0314 09:13:56.655350 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" podStartSLOduration=3.035903944 podStartE2EDuration="3.655330898s" podCreationTimestamp="2026-03-14 09:13:53 +0000 UTC" firstStartedPulling="2026-03-14 09:13:54.86167317 +0000 UTC m=+2770.110124807" lastFinishedPulling="2026-03-14 09:13:55.481100124 +0000 UTC m=+2770.729551761" observedRunningTime="2026-03-14 09:13:56.648795912 +0000 UTC m=+2771.897247549" watchObservedRunningTime="2026-03-14 09:13:56.655330898 +0000 UTC m=+2771.903782535" Mar 14 09:14:00 crc kubenswrapper[4886]: I0314 09:14:00.138814 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557994-qtxzf"] Mar 14 09:14:00 crc kubenswrapper[4886]: I0314 09:14:00.141517 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557994-qtxzf" Mar 14 09:14:00 crc kubenswrapper[4886]: I0314 09:14:00.145009 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 09:14:00 crc kubenswrapper[4886]: I0314 09:14:00.145207 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:14:00 crc kubenswrapper[4886]: I0314 09:14:00.145799 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:14:00 crc kubenswrapper[4886]: I0314 09:14:00.165204 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557994-qtxzf"] Mar 14 09:14:00 crc kubenswrapper[4886]: I0314 09:14:00.322102 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxfpc\" (UniqueName: \"kubernetes.io/projected/e9f7a6b6-9185-4d28-9187-dbbd9819ba61-kube-api-access-xxfpc\") pod \"auto-csr-approver-29557994-qtxzf\" (UID: \"e9f7a6b6-9185-4d28-9187-dbbd9819ba61\") " 
pod="openshift-infra/auto-csr-approver-29557994-qtxzf" Mar 14 09:14:00 crc kubenswrapper[4886]: I0314 09:14:00.424835 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxfpc\" (UniqueName: \"kubernetes.io/projected/e9f7a6b6-9185-4d28-9187-dbbd9819ba61-kube-api-access-xxfpc\") pod \"auto-csr-approver-29557994-qtxzf\" (UID: \"e9f7a6b6-9185-4d28-9187-dbbd9819ba61\") " pod="openshift-infra/auto-csr-approver-29557994-qtxzf" Mar 14 09:14:00 crc kubenswrapper[4886]: I0314 09:14:00.445634 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxfpc\" (UniqueName: \"kubernetes.io/projected/e9f7a6b6-9185-4d28-9187-dbbd9819ba61-kube-api-access-xxfpc\") pod \"auto-csr-approver-29557994-qtxzf\" (UID: \"e9f7a6b6-9185-4d28-9187-dbbd9819ba61\") " pod="openshift-infra/auto-csr-approver-29557994-qtxzf" Mar 14 09:14:00 crc kubenswrapper[4886]: I0314 09:14:00.473394 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557994-qtxzf" Mar 14 09:14:00 crc kubenswrapper[4886]: I0314 09:14:00.963678 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557994-qtxzf"] Mar 14 09:14:00 crc kubenswrapper[4886]: W0314 09:14:00.982673 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9f7a6b6_9185_4d28_9187_dbbd9819ba61.slice/crio-bbd5846445302b33c0bc0ff9fc8cfab99a46affc4501d5ec60ee7074d0075d2c WatchSource:0}: Error finding container bbd5846445302b33c0bc0ff9fc8cfab99a46affc4501d5ec60ee7074d0075d2c: Status 404 returned error can't find the container with id bbd5846445302b33c0bc0ff9fc8cfab99a46affc4501d5ec60ee7074d0075d2c Mar 14 09:14:01 crc kubenswrapper[4886]: I0314 09:14:01.629607 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557994-qtxzf" 
event={"ID":"e9f7a6b6-9185-4d28-9187-dbbd9819ba61","Type":"ContainerStarted","Data":"bbd5846445302b33c0bc0ff9fc8cfab99a46affc4501d5ec60ee7074d0075d2c"} Mar 14 09:14:02 crc kubenswrapper[4886]: I0314 09:14:02.655673 4886 generic.go:334] "Generic (PLEG): container finished" podID="e9f7a6b6-9185-4d28-9187-dbbd9819ba61" containerID="5d19ac2d4696fcc4a3165800bdfcd53c9ff24e366c7e18ca1ccbd4cde1119358" exitCode=0 Mar 14 09:14:02 crc kubenswrapper[4886]: I0314 09:14:02.655754 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557994-qtxzf" event={"ID":"e9f7a6b6-9185-4d28-9187-dbbd9819ba61","Type":"ContainerDied","Data":"5d19ac2d4696fcc4a3165800bdfcd53c9ff24e366c7e18ca1ccbd4cde1119358"} Mar 14 09:14:04 crc kubenswrapper[4886]: I0314 09:14:04.089169 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557994-qtxzf" Mar 14 09:14:04 crc kubenswrapper[4886]: I0314 09:14:04.217956 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxfpc\" (UniqueName: \"kubernetes.io/projected/e9f7a6b6-9185-4d28-9187-dbbd9819ba61-kube-api-access-xxfpc\") pod \"e9f7a6b6-9185-4d28-9187-dbbd9819ba61\" (UID: \"e9f7a6b6-9185-4d28-9187-dbbd9819ba61\") " Mar 14 09:14:04 crc kubenswrapper[4886]: I0314 09:14:04.225460 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9f7a6b6-9185-4d28-9187-dbbd9819ba61-kube-api-access-xxfpc" (OuterVolumeSpecName: "kube-api-access-xxfpc") pod "e9f7a6b6-9185-4d28-9187-dbbd9819ba61" (UID: "e9f7a6b6-9185-4d28-9187-dbbd9819ba61"). InnerVolumeSpecName "kube-api-access-xxfpc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:14:04 crc kubenswrapper[4886]: I0314 09:14:04.320396 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxfpc\" (UniqueName: \"kubernetes.io/projected/e9f7a6b6-9185-4d28-9187-dbbd9819ba61-kube-api-access-xxfpc\") on node \"crc\" DevicePath \"\"" Mar 14 09:14:04 crc kubenswrapper[4886]: I0314 09:14:04.681408 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557994-qtxzf" event={"ID":"e9f7a6b6-9185-4d28-9187-dbbd9819ba61","Type":"ContainerDied","Data":"bbd5846445302b33c0bc0ff9fc8cfab99a46affc4501d5ec60ee7074d0075d2c"} Mar 14 09:14:04 crc kubenswrapper[4886]: I0314 09:14:04.681470 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbd5846445302b33c0bc0ff9fc8cfab99a46affc4501d5ec60ee7074d0075d2c" Mar 14 09:14:04 crc kubenswrapper[4886]: I0314 09:14:04.681521 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557994-qtxzf" Mar 14 09:14:05 crc kubenswrapper[4886]: I0314 09:14:05.248234 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557988-x7fwn"] Mar 14 09:14:05 crc kubenswrapper[4886]: I0314 09:14:05.255647 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557988-x7fwn"] Mar 14 09:14:05 crc kubenswrapper[4886]: I0314 09:14:05.432913 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93af3bcb-09d4-4b5e-9347-68dcfc6cff6c" path="/var/lib/kubelet/pods/93af3bcb-09d4-4b5e-9347-68dcfc6cff6c/volumes" Mar 14 09:14:35 crc kubenswrapper[4886]: I0314 09:14:35.276710 4886 scope.go:117] "RemoveContainer" containerID="ce8c6074c0277f4bb45c6dc56204caaed3c3a9ffad3d0f63b9747266036dd356" Mar 14 09:15:00 crc kubenswrapper[4886]: I0314 09:15:00.161694 4886 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29557995-9qdck"] Mar 14 09:15:00 crc kubenswrapper[4886]: E0314 09:15:00.163618 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9f7a6b6-9185-4d28-9187-dbbd9819ba61" containerName="oc" Mar 14 09:15:00 crc kubenswrapper[4886]: I0314 09:15:00.163646 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9f7a6b6-9185-4d28-9187-dbbd9819ba61" containerName="oc" Mar 14 09:15:00 crc kubenswrapper[4886]: I0314 09:15:00.164170 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9f7a6b6-9185-4d28-9187-dbbd9819ba61" containerName="oc" Mar 14 09:15:00 crc kubenswrapper[4886]: I0314 09:15:00.165580 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-9qdck" Mar 14 09:15:00 crc kubenswrapper[4886]: I0314 09:15:00.168003 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 09:15:00 crc kubenswrapper[4886]: I0314 09:15:00.168698 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 09:15:00 crc kubenswrapper[4886]: I0314 09:15:00.180342 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557995-9qdck"] Mar 14 09:15:00 crc kubenswrapper[4886]: I0314 09:15:00.236210 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b93d286d-e146-4189-8377-0b64be26ca42-config-volume\") pod \"collect-profiles-29557995-9qdck\" (UID: \"b93d286d-e146-4189-8377-0b64be26ca42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-9qdck" Mar 14 09:15:00 crc kubenswrapper[4886]: I0314 09:15:00.236258 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b93d286d-e146-4189-8377-0b64be26ca42-secret-volume\") pod \"collect-profiles-29557995-9qdck\" (UID: \"b93d286d-e146-4189-8377-0b64be26ca42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-9qdck" Mar 14 09:15:00 crc kubenswrapper[4886]: I0314 09:15:00.236398 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvws8\" (UniqueName: \"kubernetes.io/projected/b93d286d-e146-4189-8377-0b64be26ca42-kube-api-access-bvws8\") pod \"collect-profiles-29557995-9qdck\" (UID: \"b93d286d-e146-4189-8377-0b64be26ca42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-9qdck" Mar 14 09:15:00 crc kubenswrapper[4886]: I0314 09:15:00.339135 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b93d286d-e146-4189-8377-0b64be26ca42-config-volume\") pod \"collect-profiles-29557995-9qdck\" (UID: \"b93d286d-e146-4189-8377-0b64be26ca42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-9qdck" Mar 14 09:15:00 crc kubenswrapper[4886]: I0314 09:15:00.339208 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b93d286d-e146-4189-8377-0b64be26ca42-secret-volume\") pod \"collect-profiles-29557995-9qdck\" (UID: \"b93d286d-e146-4189-8377-0b64be26ca42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-9qdck" Mar 14 09:15:00 crc kubenswrapper[4886]: I0314 09:15:00.339331 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvws8\" (UniqueName: \"kubernetes.io/projected/b93d286d-e146-4189-8377-0b64be26ca42-kube-api-access-bvws8\") pod \"collect-profiles-29557995-9qdck\" (UID: \"b93d286d-e146-4189-8377-0b64be26ca42\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-9qdck" Mar 14 09:15:00 crc kubenswrapper[4886]: I0314 09:15:00.340241 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b93d286d-e146-4189-8377-0b64be26ca42-config-volume\") pod \"collect-profiles-29557995-9qdck\" (UID: \"b93d286d-e146-4189-8377-0b64be26ca42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-9qdck" Mar 14 09:15:00 crc kubenswrapper[4886]: I0314 09:15:00.348426 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b93d286d-e146-4189-8377-0b64be26ca42-secret-volume\") pod \"collect-profiles-29557995-9qdck\" (UID: \"b93d286d-e146-4189-8377-0b64be26ca42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-9qdck" Mar 14 09:15:00 crc kubenswrapper[4886]: I0314 09:15:00.359471 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvws8\" (UniqueName: \"kubernetes.io/projected/b93d286d-e146-4189-8377-0b64be26ca42-kube-api-access-bvws8\") pod \"collect-profiles-29557995-9qdck\" (UID: \"b93d286d-e146-4189-8377-0b64be26ca42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-9qdck" Mar 14 09:15:00 crc kubenswrapper[4886]: I0314 09:15:00.519264 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-9qdck" Mar 14 09:15:01 crc kubenswrapper[4886]: I0314 09:15:01.016836 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557995-9qdck"] Mar 14 09:15:01 crc kubenswrapper[4886]: I0314 09:15:01.276663 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-9qdck" event={"ID":"b93d286d-e146-4189-8377-0b64be26ca42","Type":"ContainerStarted","Data":"45062aa1b54c19c086b403646a340495978d3aa9b8b4fc5552eba675d34a5210"} Mar 14 09:15:01 crc kubenswrapper[4886]: I0314 09:15:01.277131 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-9qdck" event={"ID":"b93d286d-e146-4189-8377-0b64be26ca42","Type":"ContainerStarted","Data":"e990cb329be338de09f06154870338022d8653c02174c4fef1b8e12b8497a691"} Mar 14 09:15:01 crc kubenswrapper[4886]: I0314 09:15:01.335412 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-9qdck" podStartSLOduration=1.335389437 podStartE2EDuration="1.335389437s" podCreationTimestamp="2026-03-14 09:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:15:01.311433137 +0000 UTC m=+2836.559884774" watchObservedRunningTime="2026-03-14 09:15:01.335389437 +0000 UTC m=+2836.583841074" Mar 14 09:15:02 crc kubenswrapper[4886]: I0314 09:15:02.291378 4886 generic.go:334] "Generic (PLEG): container finished" podID="b93d286d-e146-4189-8377-0b64be26ca42" containerID="45062aa1b54c19c086b403646a340495978d3aa9b8b4fc5552eba675d34a5210" exitCode=0 Mar 14 09:15:02 crc kubenswrapper[4886]: I0314 09:15:02.291435 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-9qdck" event={"ID":"b93d286d-e146-4189-8377-0b64be26ca42","Type":"ContainerDied","Data":"45062aa1b54c19c086b403646a340495978d3aa9b8b4fc5552eba675d34a5210"} Mar 14 09:15:03 crc kubenswrapper[4886]: I0314 09:15:03.643272 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-9qdck" Mar 14 09:15:03 crc kubenswrapper[4886]: I0314 09:15:03.762300 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvws8\" (UniqueName: \"kubernetes.io/projected/b93d286d-e146-4189-8377-0b64be26ca42-kube-api-access-bvws8\") pod \"b93d286d-e146-4189-8377-0b64be26ca42\" (UID: \"b93d286d-e146-4189-8377-0b64be26ca42\") " Mar 14 09:15:03 crc kubenswrapper[4886]: I0314 09:15:03.762479 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b93d286d-e146-4189-8377-0b64be26ca42-secret-volume\") pod \"b93d286d-e146-4189-8377-0b64be26ca42\" (UID: \"b93d286d-e146-4189-8377-0b64be26ca42\") " Mar 14 09:15:03 crc kubenswrapper[4886]: I0314 09:15:03.762625 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b93d286d-e146-4189-8377-0b64be26ca42-config-volume\") pod \"b93d286d-e146-4189-8377-0b64be26ca42\" (UID: \"b93d286d-e146-4189-8377-0b64be26ca42\") " Mar 14 09:15:03 crc kubenswrapper[4886]: I0314 09:15:03.763344 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b93d286d-e146-4189-8377-0b64be26ca42-config-volume" (OuterVolumeSpecName: "config-volume") pod "b93d286d-e146-4189-8377-0b64be26ca42" (UID: "b93d286d-e146-4189-8377-0b64be26ca42"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:15:03 crc kubenswrapper[4886]: I0314 09:15:03.768766 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b93d286d-e146-4189-8377-0b64be26ca42-kube-api-access-bvws8" (OuterVolumeSpecName: "kube-api-access-bvws8") pod "b93d286d-e146-4189-8377-0b64be26ca42" (UID: "b93d286d-e146-4189-8377-0b64be26ca42"). InnerVolumeSpecName "kube-api-access-bvws8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:15:03 crc kubenswrapper[4886]: I0314 09:15:03.768905 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b93d286d-e146-4189-8377-0b64be26ca42-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b93d286d-e146-4189-8377-0b64be26ca42" (UID: "b93d286d-e146-4189-8377-0b64be26ca42"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:15:03 crc kubenswrapper[4886]: I0314 09:15:03.865046 4886 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b93d286d-e146-4189-8377-0b64be26ca42-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 09:15:03 crc kubenswrapper[4886]: I0314 09:15:03.865086 4886 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b93d286d-e146-4189-8377-0b64be26ca42-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 09:15:03 crc kubenswrapper[4886]: I0314 09:15:03.865100 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvws8\" (UniqueName: \"kubernetes.io/projected/b93d286d-e146-4189-8377-0b64be26ca42-kube-api-access-bvws8\") on node \"crc\" DevicePath \"\"" Mar 14 09:15:04 crc kubenswrapper[4886]: I0314 09:15:04.312242 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-9qdck" 
event={"ID":"b93d286d-e146-4189-8377-0b64be26ca42","Type":"ContainerDied","Data":"e990cb329be338de09f06154870338022d8653c02174c4fef1b8e12b8497a691"} Mar 14 09:15:04 crc kubenswrapper[4886]: I0314 09:15:04.312289 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e990cb329be338de09f06154870338022d8653c02174c4fef1b8e12b8497a691" Mar 14 09:15:04 crc kubenswrapper[4886]: I0314 09:15:04.312350 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557995-9qdck" Mar 14 09:15:04 crc kubenswrapper[4886]: I0314 09:15:04.733440 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557950-j78sv"] Mar 14 09:15:04 crc kubenswrapper[4886]: I0314 09:15:04.743843 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557950-j78sv"] Mar 14 09:15:05 crc kubenswrapper[4886]: I0314 09:15:05.436084 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7ad8d4f-d958-43b7-b84d-c8672642d21b" path="/var/lib/kubelet/pods/d7ad8d4f-d958-43b7-b84d-c8672642d21b/volumes" Mar 14 09:15:32 crc kubenswrapper[4886]: I0314 09:15:32.010019 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5c5nf"] Mar 14 09:15:32 crc kubenswrapper[4886]: E0314 09:15:32.011641 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b93d286d-e146-4189-8377-0b64be26ca42" containerName="collect-profiles" Mar 14 09:15:32 crc kubenswrapper[4886]: I0314 09:15:32.011663 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="b93d286d-e146-4189-8377-0b64be26ca42" containerName="collect-profiles" Mar 14 09:15:32 crc kubenswrapper[4886]: I0314 09:15:32.011954 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="b93d286d-e146-4189-8377-0b64be26ca42" containerName="collect-profiles" 
Mar 14 09:15:32 crc kubenswrapper[4886]: I0314 09:15:32.014260 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5c5nf" Mar 14 09:15:32 crc kubenswrapper[4886]: I0314 09:15:32.034985 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5c5nf"] Mar 14 09:15:32 crc kubenswrapper[4886]: I0314 09:15:32.151378 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d8b28b4-fe86-4c26-82fe-f14d62140e4c-utilities\") pod \"certified-operators-5c5nf\" (UID: \"3d8b28b4-fe86-4c26-82fe-f14d62140e4c\") " pod="openshift-marketplace/certified-operators-5c5nf" Mar 14 09:15:32 crc kubenswrapper[4886]: I0314 09:15:32.151457 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwzg6\" (UniqueName: \"kubernetes.io/projected/3d8b28b4-fe86-4c26-82fe-f14d62140e4c-kube-api-access-hwzg6\") pod \"certified-operators-5c5nf\" (UID: \"3d8b28b4-fe86-4c26-82fe-f14d62140e4c\") " pod="openshift-marketplace/certified-operators-5c5nf" Mar 14 09:15:32 crc kubenswrapper[4886]: I0314 09:15:32.151520 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d8b28b4-fe86-4c26-82fe-f14d62140e4c-catalog-content\") pod \"certified-operators-5c5nf\" (UID: \"3d8b28b4-fe86-4c26-82fe-f14d62140e4c\") " pod="openshift-marketplace/certified-operators-5c5nf" Mar 14 09:15:32 crc kubenswrapper[4886]: I0314 09:15:32.254417 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d8b28b4-fe86-4c26-82fe-f14d62140e4c-utilities\") pod \"certified-operators-5c5nf\" (UID: \"3d8b28b4-fe86-4c26-82fe-f14d62140e4c\") " pod="openshift-marketplace/certified-operators-5c5nf" 
Mar 14 09:15:32 crc kubenswrapper[4886]: I0314 09:15:32.254486 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwzg6\" (UniqueName: \"kubernetes.io/projected/3d8b28b4-fe86-4c26-82fe-f14d62140e4c-kube-api-access-hwzg6\") pod \"certified-operators-5c5nf\" (UID: \"3d8b28b4-fe86-4c26-82fe-f14d62140e4c\") " pod="openshift-marketplace/certified-operators-5c5nf" Mar 14 09:15:32 crc kubenswrapper[4886]: I0314 09:15:32.254520 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d8b28b4-fe86-4c26-82fe-f14d62140e4c-catalog-content\") pod \"certified-operators-5c5nf\" (UID: \"3d8b28b4-fe86-4c26-82fe-f14d62140e4c\") " pod="openshift-marketplace/certified-operators-5c5nf" Mar 14 09:15:32 crc kubenswrapper[4886]: I0314 09:15:32.255027 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d8b28b4-fe86-4c26-82fe-f14d62140e4c-utilities\") pod \"certified-operators-5c5nf\" (UID: \"3d8b28b4-fe86-4c26-82fe-f14d62140e4c\") " pod="openshift-marketplace/certified-operators-5c5nf" Mar 14 09:15:32 crc kubenswrapper[4886]: I0314 09:15:32.255040 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d8b28b4-fe86-4c26-82fe-f14d62140e4c-catalog-content\") pod \"certified-operators-5c5nf\" (UID: \"3d8b28b4-fe86-4c26-82fe-f14d62140e4c\") " pod="openshift-marketplace/certified-operators-5c5nf" Mar 14 09:15:32 crc kubenswrapper[4886]: I0314 09:15:32.278026 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwzg6\" (UniqueName: \"kubernetes.io/projected/3d8b28b4-fe86-4c26-82fe-f14d62140e4c-kube-api-access-hwzg6\") pod \"certified-operators-5c5nf\" (UID: \"3d8b28b4-fe86-4c26-82fe-f14d62140e4c\") " pod="openshift-marketplace/certified-operators-5c5nf" Mar 14 09:15:32 crc 
kubenswrapper[4886]: I0314 09:15:32.340593 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5c5nf" Mar 14 09:15:32 crc kubenswrapper[4886]: I0314 09:15:32.903239 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5c5nf"] Mar 14 09:15:33 crc kubenswrapper[4886]: I0314 09:15:33.661284 4886 generic.go:334] "Generic (PLEG): container finished" podID="3d8b28b4-fe86-4c26-82fe-f14d62140e4c" containerID="f17656520881642a3eefbf315ddbb0772ab3c137e266fc3066d34bb8e4c17589" exitCode=0 Mar 14 09:15:33 crc kubenswrapper[4886]: I0314 09:15:33.661362 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5c5nf" event={"ID":"3d8b28b4-fe86-4c26-82fe-f14d62140e4c","Type":"ContainerDied","Data":"f17656520881642a3eefbf315ddbb0772ab3c137e266fc3066d34bb8e4c17589"} Mar 14 09:15:33 crc kubenswrapper[4886]: I0314 09:15:33.661589 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5c5nf" event={"ID":"3d8b28b4-fe86-4c26-82fe-f14d62140e4c","Type":"ContainerStarted","Data":"8537e52ba48058c88b5c55be78a03b16a6dfcc3ac23d74e2f512fd349495a1ca"} Mar 14 09:15:33 crc kubenswrapper[4886]: I0314 09:15:33.664895 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 09:15:34 crc kubenswrapper[4886]: I0314 09:15:34.672495 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5c5nf" event={"ID":"3d8b28b4-fe86-4c26-82fe-f14d62140e4c","Type":"ContainerStarted","Data":"382cd2129a933ce1c01136961c7fdc623f770c0503cd57d1954cd8920efec7b0"} Mar 14 09:15:35 crc kubenswrapper[4886]: I0314 09:15:35.410263 4886 scope.go:117] "RemoveContainer" containerID="39570c129f1650fa07b216c5e683d1c44ac48e12801ca1f49ef7eb2a7fe8a733" Mar 14 09:15:35 crc kubenswrapper[4886]: I0314 09:15:35.686355 4886 generic.go:334] 
"Generic (PLEG): container finished" podID="3d8b28b4-fe86-4c26-82fe-f14d62140e4c" containerID="382cd2129a933ce1c01136961c7fdc623f770c0503cd57d1954cd8920efec7b0" exitCode=0 Mar 14 09:15:35 crc kubenswrapper[4886]: I0314 09:15:35.686403 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5c5nf" event={"ID":"3d8b28b4-fe86-4c26-82fe-f14d62140e4c","Type":"ContainerDied","Data":"382cd2129a933ce1c01136961c7fdc623f770c0503cd57d1954cd8920efec7b0"} Mar 14 09:15:36 crc kubenswrapper[4886]: I0314 09:15:36.702275 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5c5nf" event={"ID":"3d8b28b4-fe86-4c26-82fe-f14d62140e4c","Type":"ContainerStarted","Data":"f809d0be2f4bbf076f19df8b0640bcaec1e028012d98473e59b6c59617b076a4"} Mar 14 09:15:36 crc kubenswrapper[4886]: I0314 09:15:36.741571 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5c5nf" podStartSLOduration=3.294582612 podStartE2EDuration="5.741538235s" podCreationTimestamp="2026-03-14 09:15:31 +0000 UTC" firstStartedPulling="2026-03-14 09:15:33.664651641 +0000 UTC m=+2868.913103278" lastFinishedPulling="2026-03-14 09:15:36.111607264 +0000 UTC m=+2871.360058901" observedRunningTime="2026-03-14 09:15:36.729868004 +0000 UTC m=+2871.978319661" watchObservedRunningTime="2026-03-14 09:15:36.741538235 +0000 UTC m=+2871.989989882" Mar 14 09:15:42 crc kubenswrapper[4886]: I0314 09:15:42.341521 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5c5nf" Mar 14 09:15:42 crc kubenswrapper[4886]: I0314 09:15:42.342073 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5c5nf" Mar 14 09:15:42 crc kubenswrapper[4886]: I0314 09:15:42.408610 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-5c5nf" Mar 14 09:15:42 crc kubenswrapper[4886]: I0314 09:15:42.817288 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5c5nf" Mar 14 09:15:42 crc kubenswrapper[4886]: I0314 09:15:42.869090 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5c5nf"] Mar 14 09:15:44 crc kubenswrapper[4886]: I0314 09:15:44.790449 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5c5nf" podUID="3d8b28b4-fe86-4c26-82fe-f14d62140e4c" containerName="registry-server" containerID="cri-o://f809d0be2f4bbf076f19df8b0640bcaec1e028012d98473e59b6c59617b076a4" gracePeriod=2 Mar 14 09:15:45 crc kubenswrapper[4886]: I0314 09:15:45.405152 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5c5nf" Mar 14 09:15:45 crc kubenswrapper[4886]: I0314 09:15:45.573793 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d8b28b4-fe86-4c26-82fe-f14d62140e4c-utilities\") pod \"3d8b28b4-fe86-4c26-82fe-f14d62140e4c\" (UID: \"3d8b28b4-fe86-4c26-82fe-f14d62140e4c\") " Mar 14 09:15:45 crc kubenswrapper[4886]: I0314 09:15:45.573952 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwzg6\" (UniqueName: \"kubernetes.io/projected/3d8b28b4-fe86-4c26-82fe-f14d62140e4c-kube-api-access-hwzg6\") pod \"3d8b28b4-fe86-4c26-82fe-f14d62140e4c\" (UID: \"3d8b28b4-fe86-4c26-82fe-f14d62140e4c\") " Mar 14 09:15:45 crc kubenswrapper[4886]: I0314 09:15:45.574166 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d8b28b4-fe86-4c26-82fe-f14d62140e4c-catalog-content\") pod 
\"3d8b28b4-fe86-4c26-82fe-f14d62140e4c\" (UID: \"3d8b28b4-fe86-4c26-82fe-f14d62140e4c\") " Mar 14 09:15:45 crc kubenswrapper[4886]: I0314 09:15:45.575340 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d8b28b4-fe86-4c26-82fe-f14d62140e4c-utilities" (OuterVolumeSpecName: "utilities") pod "3d8b28b4-fe86-4c26-82fe-f14d62140e4c" (UID: "3d8b28b4-fe86-4c26-82fe-f14d62140e4c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:15:45 crc kubenswrapper[4886]: I0314 09:15:45.575777 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d8b28b4-fe86-4c26-82fe-f14d62140e4c-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:15:45 crc kubenswrapper[4886]: I0314 09:15:45.583379 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d8b28b4-fe86-4c26-82fe-f14d62140e4c-kube-api-access-hwzg6" (OuterVolumeSpecName: "kube-api-access-hwzg6") pod "3d8b28b4-fe86-4c26-82fe-f14d62140e4c" (UID: "3d8b28b4-fe86-4c26-82fe-f14d62140e4c"). InnerVolumeSpecName "kube-api-access-hwzg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:15:45 crc kubenswrapper[4886]: I0314 09:15:45.639243 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d8b28b4-fe86-4c26-82fe-f14d62140e4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d8b28b4-fe86-4c26-82fe-f14d62140e4c" (UID: "3d8b28b4-fe86-4c26-82fe-f14d62140e4c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:15:45 crc kubenswrapper[4886]: I0314 09:15:45.678296 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwzg6\" (UniqueName: \"kubernetes.io/projected/3d8b28b4-fe86-4c26-82fe-f14d62140e4c-kube-api-access-hwzg6\") on node \"crc\" DevicePath \"\"" Mar 14 09:15:45 crc kubenswrapper[4886]: I0314 09:15:45.678352 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d8b28b4-fe86-4c26-82fe-f14d62140e4c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:15:45 crc kubenswrapper[4886]: I0314 09:15:45.803862 4886 generic.go:334] "Generic (PLEG): container finished" podID="3d8b28b4-fe86-4c26-82fe-f14d62140e4c" containerID="f809d0be2f4bbf076f19df8b0640bcaec1e028012d98473e59b6c59617b076a4" exitCode=0 Mar 14 09:15:45 crc kubenswrapper[4886]: I0314 09:15:45.803906 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5c5nf" event={"ID":"3d8b28b4-fe86-4c26-82fe-f14d62140e4c","Type":"ContainerDied","Data":"f809d0be2f4bbf076f19df8b0640bcaec1e028012d98473e59b6c59617b076a4"} Mar 14 09:15:45 crc kubenswrapper[4886]: I0314 09:15:45.803932 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5c5nf" event={"ID":"3d8b28b4-fe86-4c26-82fe-f14d62140e4c","Type":"ContainerDied","Data":"8537e52ba48058c88b5c55be78a03b16a6dfcc3ac23d74e2f512fd349495a1ca"} Mar 14 09:15:45 crc kubenswrapper[4886]: I0314 09:15:45.803937 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5c5nf" Mar 14 09:15:45 crc kubenswrapper[4886]: I0314 09:15:45.803956 4886 scope.go:117] "RemoveContainer" containerID="f809d0be2f4bbf076f19df8b0640bcaec1e028012d98473e59b6c59617b076a4" Mar 14 09:15:45 crc kubenswrapper[4886]: I0314 09:15:45.842557 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5c5nf"] Mar 14 09:15:45 crc kubenswrapper[4886]: I0314 09:15:45.842699 4886 scope.go:117] "RemoveContainer" containerID="382cd2129a933ce1c01136961c7fdc623f770c0503cd57d1954cd8920efec7b0" Mar 14 09:15:45 crc kubenswrapper[4886]: I0314 09:15:45.854164 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5c5nf"] Mar 14 09:15:45 crc kubenswrapper[4886]: I0314 09:15:45.870106 4886 scope.go:117] "RemoveContainer" containerID="f17656520881642a3eefbf315ddbb0772ab3c137e266fc3066d34bb8e4c17589" Mar 14 09:15:45 crc kubenswrapper[4886]: I0314 09:15:45.920784 4886 scope.go:117] "RemoveContainer" containerID="f809d0be2f4bbf076f19df8b0640bcaec1e028012d98473e59b6c59617b076a4" Mar 14 09:15:45 crc kubenswrapper[4886]: E0314 09:15:45.921271 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f809d0be2f4bbf076f19df8b0640bcaec1e028012d98473e59b6c59617b076a4\": container with ID starting with f809d0be2f4bbf076f19df8b0640bcaec1e028012d98473e59b6c59617b076a4 not found: ID does not exist" containerID="f809d0be2f4bbf076f19df8b0640bcaec1e028012d98473e59b6c59617b076a4" Mar 14 09:15:45 crc kubenswrapper[4886]: I0314 09:15:45.921316 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f809d0be2f4bbf076f19df8b0640bcaec1e028012d98473e59b6c59617b076a4"} err="failed to get container status \"f809d0be2f4bbf076f19df8b0640bcaec1e028012d98473e59b6c59617b076a4\": rpc error: code = NotFound desc = could not find 
container \"f809d0be2f4bbf076f19df8b0640bcaec1e028012d98473e59b6c59617b076a4\": container with ID starting with f809d0be2f4bbf076f19df8b0640bcaec1e028012d98473e59b6c59617b076a4 not found: ID does not exist" Mar 14 09:15:45 crc kubenswrapper[4886]: I0314 09:15:45.921339 4886 scope.go:117] "RemoveContainer" containerID="382cd2129a933ce1c01136961c7fdc623f770c0503cd57d1954cd8920efec7b0" Mar 14 09:15:45 crc kubenswrapper[4886]: E0314 09:15:45.921650 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"382cd2129a933ce1c01136961c7fdc623f770c0503cd57d1954cd8920efec7b0\": container with ID starting with 382cd2129a933ce1c01136961c7fdc623f770c0503cd57d1954cd8920efec7b0 not found: ID does not exist" containerID="382cd2129a933ce1c01136961c7fdc623f770c0503cd57d1954cd8920efec7b0" Mar 14 09:15:45 crc kubenswrapper[4886]: I0314 09:15:45.921735 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"382cd2129a933ce1c01136961c7fdc623f770c0503cd57d1954cd8920efec7b0"} err="failed to get container status \"382cd2129a933ce1c01136961c7fdc623f770c0503cd57d1954cd8920efec7b0\": rpc error: code = NotFound desc = could not find container \"382cd2129a933ce1c01136961c7fdc623f770c0503cd57d1954cd8920efec7b0\": container with ID starting with 382cd2129a933ce1c01136961c7fdc623f770c0503cd57d1954cd8920efec7b0 not found: ID does not exist" Mar 14 09:15:45 crc kubenswrapper[4886]: I0314 09:15:45.921787 4886 scope.go:117] "RemoveContainer" containerID="f17656520881642a3eefbf315ddbb0772ab3c137e266fc3066d34bb8e4c17589" Mar 14 09:15:45 crc kubenswrapper[4886]: E0314 09:15:45.922080 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f17656520881642a3eefbf315ddbb0772ab3c137e266fc3066d34bb8e4c17589\": container with ID starting with f17656520881642a3eefbf315ddbb0772ab3c137e266fc3066d34bb8e4c17589 not found: ID does 
not exist" containerID="f17656520881642a3eefbf315ddbb0772ab3c137e266fc3066d34bb8e4c17589" Mar 14 09:15:45 crc kubenswrapper[4886]: I0314 09:15:45.922105 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f17656520881642a3eefbf315ddbb0772ab3c137e266fc3066d34bb8e4c17589"} err="failed to get container status \"f17656520881642a3eefbf315ddbb0772ab3c137e266fc3066d34bb8e4c17589\": rpc error: code = NotFound desc = could not find container \"f17656520881642a3eefbf315ddbb0772ab3c137e266fc3066d34bb8e4c17589\": container with ID starting with f17656520881642a3eefbf315ddbb0772ab3c137e266fc3066d34bb8e4c17589 not found: ID does not exist" Mar 14 09:15:47 crc kubenswrapper[4886]: I0314 09:15:47.434216 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d8b28b4-fe86-4c26-82fe-f14d62140e4c" path="/var/lib/kubelet/pods/3d8b28b4-fe86-4c26-82fe-f14d62140e4c/volumes" Mar 14 09:15:56 crc kubenswrapper[4886]: I0314 09:15:56.066226 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:15:56 crc kubenswrapper[4886]: I0314 09:15:56.066987 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:16:00 crc kubenswrapper[4886]: I0314 09:16:00.150303 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557996-7cp2k"] Mar 14 09:16:00 crc kubenswrapper[4886]: E0314 09:16:00.151231 4886 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3d8b28b4-fe86-4c26-82fe-f14d62140e4c" containerName="extract-content" Mar 14 09:16:00 crc kubenswrapper[4886]: I0314 09:16:00.151244 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d8b28b4-fe86-4c26-82fe-f14d62140e4c" containerName="extract-content" Mar 14 09:16:00 crc kubenswrapper[4886]: E0314 09:16:00.151276 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d8b28b4-fe86-4c26-82fe-f14d62140e4c" containerName="extract-utilities" Mar 14 09:16:00 crc kubenswrapper[4886]: I0314 09:16:00.151282 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d8b28b4-fe86-4c26-82fe-f14d62140e4c" containerName="extract-utilities" Mar 14 09:16:00 crc kubenswrapper[4886]: E0314 09:16:00.151307 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d8b28b4-fe86-4c26-82fe-f14d62140e4c" containerName="registry-server" Mar 14 09:16:00 crc kubenswrapper[4886]: I0314 09:16:00.151315 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d8b28b4-fe86-4c26-82fe-f14d62140e4c" containerName="registry-server" Mar 14 09:16:00 crc kubenswrapper[4886]: I0314 09:16:00.151525 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d8b28b4-fe86-4c26-82fe-f14d62140e4c" containerName="registry-server" Mar 14 09:16:00 crc kubenswrapper[4886]: I0314 09:16:00.152319 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557996-7cp2k" Mar 14 09:16:00 crc kubenswrapper[4886]: I0314 09:16:00.155350 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:16:00 crc kubenswrapper[4886]: I0314 09:16:00.155614 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:16:00 crc kubenswrapper[4886]: I0314 09:16:00.159647 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557996-7cp2k"] Mar 14 09:16:00 crc kubenswrapper[4886]: I0314 09:16:00.161141 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 09:16:00 crc kubenswrapper[4886]: I0314 09:16:00.178819 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgcrb\" (UniqueName: \"kubernetes.io/projected/fd34e59d-c7f3-48bd-bbc0-d71723e4f1ae-kube-api-access-mgcrb\") pod \"auto-csr-approver-29557996-7cp2k\" (UID: \"fd34e59d-c7f3-48bd-bbc0-d71723e4f1ae\") " pod="openshift-infra/auto-csr-approver-29557996-7cp2k" Mar 14 09:16:00 crc kubenswrapper[4886]: I0314 09:16:00.280755 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgcrb\" (UniqueName: \"kubernetes.io/projected/fd34e59d-c7f3-48bd-bbc0-d71723e4f1ae-kube-api-access-mgcrb\") pod \"auto-csr-approver-29557996-7cp2k\" (UID: \"fd34e59d-c7f3-48bd-bbc0-d71723e4f1ae\") " pod="openshift-infra/auto-csr-approver-29557996-7cp2k" Mar 14 09:16:00 crc kubenswrapper[4886]: I0314 09:16:00.303136 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgcrb\" (UniqueName: \"kubernetes.io/projected/fd34e59d-c7f3-48bd-bbc0-d71723e4f1ae-kube-api-access-mgcrb\") pod \"auto-csr-approver-29557996-7cp2k\" (UID: \"fd34e59d-c7f3-48bd-bbc0-d71723e4f1ae\") " 
pod="openshift-infra/auto-csr-approver-29557996-7cp2k" Mar 14 09:16:00 crc kubenswrapper[4886]: I0314 09:16:00.469256 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557996-7cp2k" Mar 14 09:16:00 crc kubenswrapper[4886]: I0314 09:16:00.924907 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557996-7cp2k"] Mar 14 09:16:00 crc kubenswrapper[4886]: I0314 09:16:00.959853 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557996-7cp2k" event={"ID":"fd34e59d-c7f3-48bd-bbc0-d71723e4f1ae","Type":"ContainerStarted","Data":"4f14b4985271145c68a79a4816ac71c79123e5d60aacd6c8283774ad47d45f00"} Mar 14 09:16:02 crc kubenswrapper[4886]: I0314 09:16:02.984789 4886 generic.go:334] "Generic (PLEG): container finished" podID="fd34e59d-c7f3-48bd-bbc0-d71723e4f1ae" containerID="a3f27af38b9cecc66d079900902015352c8657e68c43fd7eceba26903b7b9d6f" exitCode=0 Mar 14 09:16:02 crc kubenswrapper[4886]: I0314 09:16:02.984905 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557996-7cp2k" event={"ID":"fd34e59d-c7f3-48bd-bbc0-d71723e4f1ae","Type":"ContainerDied","Data":"a3f27af38b9cecc66d079900902015352c8657e68c43fd7eceba26903b7b9d6f"} Mar 14 09:16:04 crc kubenswrapper[4886]: I0314 09:16:04.427721 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557996-7cp2k" Mar 14 09:16:04 crc kubenswrapper[4886]: I0314 09:16:04.591297 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgcrb\" (UniqueName: \"kubernetes.io/projected/fd34e59d-c7f3-48bd-bbc0-d71723e4f1ae-kube-api-access-mgcrb\") pod \"fd34e59d-c7f3-48bd-bbc0-d71723e4f1ae\" (UID: \"fd34e59d-c7f3-48bd-bbc0-d71723e4f1ae\") " Mar 14 09:16:04 crc kubenswrapper[4886]: I0314 09:16:04.598912 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd34e59d-c7f3-48bd-bbc0-d71723e4f1ae-kube-api-access-mgcrb" (OuterVolumeSpecName: "kube-api-access-mgcrb") pod "fd34e59d-c7f3-48bd-bbc0-d71723e4f1ae" (UID: "fd34e59d-c7f3-48bd-bbc0-d71723e4f1ae"). InnerVolumeSpecName "kube-api-access-mgcrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:16:04 crc kubenswrapper[4886]: I0314 09:16:04.693808 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgcrb\" (UniqueName: \"kubernetes.io/projected/fd34e59d-c7f3-48bd-bbc0-d71723e4f1ae-kube-api-access-mgcrb\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:05 crc kubenswrapper[4886]: I0314 09:16:05.015261 4886 generic.go:334] "Generic (PLEG): container finished" podID="e115c62e-eb7f-41a9-a613-7523bcfc2e90" containerID="bec9f57d21b19737cebce9b53d9f3d968de6cd4e5f688e5d1d4a3ba3f531bb95" exitCode=0 Mar 14 09:16:05 crc kubenswrapper[4886]: I0314 09:16:05.015354 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" event={"ID":"e115c62e-eb7f-41a9-a613-7523bcfc2e90","Type":"ContainerDied","Data":"bec9f57d21b19737cebce9b53d9f3d968de6cd4e5f688e5d1d4a3ba3f531bb95"} Mar 14 09:16:05 crc kubenswrapper[4886]: I0314 09:16:05.018507 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557996-7cp2k" 
event={"ID":"fd34e59d-c7f3-48bd-bbc0-d71723e4f1ae","Type":"ContainerDied","Data":"4f14b4985271145c68a79a4816ac71c79123e5d60aacd6c8283774ad47d45f00"} Mar 14 09:16:05 crc kubenswrapper[4886]: I0314 09:16:05.018545 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f14b4985271145c68a79a4816ac71c79123e5d60aacd6c8283774ad47d45f00" Mar 14 09:16:05 crc kubenswrapper[4886]: I0314 09:16:05.018553 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557996-7cp2k" Mar 14 09:16:05 crc kubenswrapper[4886]: I0314 09:16:05.510786 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557990-vd8jl"] Mar 14 09:16:05 crc kubenswrapper[4886]: I0314 09:16:05.521448 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557990-vd8jl"] Mar 14 09:16:06 crc kubenswrapper[4886]: I0314 09:16:06.474061 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" Mar 14 09:16:06 crc kubenswrapper[4886]: I0314 09:16:06.539041 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-ceilometer-compute-config-data-2\") pod \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\" (UID: \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\") " Mar 14 09:16:06 crc kubenswrapper[4886]: I0314 09:16:06.569327 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "e115c62e-eb7f-41a9-a613-7523bcfc2e90" (UID: "e115c62e-eb7f-41a9-a613-7523bcfc2e90"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:16:06 crc kubenswrapper[4886]: I0314 09:16:06.641126 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-ceilometer-compute-config-data-1\") pod \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\" (UID: \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\") " Mar 14 09:16:06 crc kubenswrapper[4886]: I0314 09:16:06.641634 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-inventory\") pod \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\" (UID: \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\") " Mar 14 09:16:06 crc kubenswrapper[4886]: I0314 09:16:06.642184 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjttm\" (UniqueName: \"kubernetes.io/projected/e115c62e-eb7f-41a9-a613-7523bcfc2e90-kube-api-access-sjttm\") pod \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\" (UID: \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\") " Mar 14 09:16:06 crc kubenswrapper[4886]: I0314 09:16:06.642414 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-ceilometer-compute-config-data-0\") pod \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\" (UID: \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\") " Mar 14 09:16:06 crc kubenswrapper[4886]: I0314 09:16:06.642640 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-telemetry-combined-ca-bundle\") pod \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\" (UID: \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\") " Mar 14 09:16:06 crc kubenswrapper[4886]: I0314 09:16:06.642752 4886 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-ssh-key-openstack-edpm-ipam\") pod \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\" (UID: \"e115c62e-eb7f-41a9-a613-7523bcfc2e90\") " Mar 14 09:16:06 crc kubenswrapper[4886]: I0314 09:16:06.643803 4886 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:06 crc kubenswrapper[4886]: I0314 09:16:06.645306 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e115c62e-eb7f-41a9-a613-7523bcfc2e90-kube-api-access-sjttm" (OuterVolumeSpecName: "kube-api-access-sjttm") pod "e115c62e-eb7f-41a9-a613-7523bcfc2e90" (UID: "e115c62e-eb7f-41a9-a613-7523bcfc2e90"). InnerVolumeSpecName "kube-api-access-sjttm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:16:06 crc kubenswrapper[4886]: I0314 09:16:06.646070 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "e115c62e-eb7f-41a9-a613-7523bcfc2e90" (UID: "e115c62e-eb7f-41a9-a613-7523bcfc2e90"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:16:06 crc kubenswrapper[4886]: I0314 09:16:06.668403 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "e115c62e-eb7f-41a9-a613-7523bcfc2e90" (UID: "e115c62e-eb7f-41a9-a613-7523bcfc2e90"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:16:06 crc kubenswrapper[4886]: I0314 09:16:06.670960 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e115c62e-eb7f-41a9-a613-7523bcfc2e90" (UID: "e115c62e-eb7f-41a9-a613-7523bcfc2e90"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:16:06 crc kubenswrapper[4886]: I0314 09:16:06.674677 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-inventory" (OuterVolumeSpecName: "inventory") pod "e115c62e-eb7f-41a9-a613-7523bcfc2e90" (UID: "e115c62e-eb7f-41a9-a613-7523bcfc2e90"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:16:06 crc kubenswrapper[4886]: I0314 09:16:06.679471 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "e115c62e-eb7f-41a9-a613-7523bcfc2e90" (UID: "e115c62e-eb7f-41a9-a613-7523bcfc2e90"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:16:06 crc kubenswrapper[4886]: I0314 09:16:06.745282 4886 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-inventory\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:06 crc kubenswrapper[4886]: I0314 09:16:06.745314 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjttm\" (UniqueName: \"kubernetes.io/projected/e115c62e-eb7f-41a9-a613-7523bcfc2e90-kube-api-access-sjttm\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:06 crc kubenswrapper[4886]: I0314 09:16:06.745328 4886 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:06 crc kubenswrapper[4886]: I0314 09:16:06.745339 4886 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:06 crc kubenswrapper[4886]: I0314 09:16:06.745349 4886 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:06 crc kubenswrapper[4886]: I0314 09:16:06.745358 4886 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e115c62e-eb7f-41a9-a613-7523bcfc2e90-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:07 crc kubenswrapper[4886]: I0314 09:16:07.041798 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" 
event={"ID":"e115c62e-eb7f-41a9-a613-7523bcfc2e90","Type":"ContainerDied","Data":"32e66d3c718cff8f6c369b8840e9842a10b202aa10c6bb2af382b92296bbf627"} Mar 14 09:16:07 crc kubenswrapper[4886]: I0314 09:16:07.041848 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32e66d3c718cff8f6c369b8840e9842a10b202aa10c6bb2af382b92296bbf627" Mar 14 09:16:07 crc kubenswrapper[4886]: I0314 09:16:07.041916 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8bd42" Mar 14 09:16:07 crc kubenswrapper[4886]: I0314 09:16:07.434544 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae09f2e7-cd2e-4c5e-80dc-350251531ac0" path="/var/lib/kubelet/pods/ae09f2e7-cd2e-4c5e-80dc-350251531ac0/volumes" Mar 14 09:16:26 crc kubenswrapper[4886]: I0314 09:16:26.067112 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:16:26 crc kubenswrapper[4886]: I0314 09:16:26.067798 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:16:35 crc kubenswrapper[4886]: I0314 09:16:35.533313 4886 scope.go:117] "RemoveContainer" containerID="0f0fb71d276bd706afd34ce15c971645e3d2c6b27e18289bad791da7d3187ad1" Mar 14 09:16:48 crc kubenswrapper[4886]: I0314 09:16:48.825547 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 14 09:16:48 crc kubenswrapper[4886]: I0314 09:16:48.826324 4886 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="cf4e51e9-c5ec-41ee-83f5-3b031c20c877" containerName="prometheus" containerID="cri-o://38573258377a509e0a387421c64b2d3afe8291ce8182cc5132e20d24ba301499" gracePeriod=600 Mar 14 09:16:48 crc kubenswrapper[4886]: I0314 09:16:48.826440 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="cf4e51e9-c5ec-41ee-83f5-3b031c20c877" containerName="thanos-sidecar" containerID="cri-o://7395d1173216948ed510ba033503e5c5f55b3f699fdca45e95a301c0f6e67729" gracePeriod=600 Mar 14 09:16:48 crc kubenswrapper[4886]: I0314 09:16:48.826484 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="cf4e51e9-c5ec-41ee-83f5-3b031c20c877" containerName="config-reloader" containerID="cri-o://bb678afc7bfbc641254b64f4aa04b60a3365d4a86fac8399cf5e5da745578fdb" gracePeriod=600 Mar 14 09:16:49 crc kubenswrapper[4886]: I0314 09:16:49.529420 4886 generic.go:334] "Generic (PLEG): container finished" podID="cf4e51e9-c5ec-41ee-83f5-3b031c20c877" containerID="7395d1173216948ed510ba033503e5c5f55b3f699fdca45e95a301c0f6e67729" exitCode=0 Mar 14 09:16:49 crc kubenswrapper[4886]: I0314 09:16:49.529933 4886 generic.go:334] "Generic (PLEG): container finished" podID="cf4e51e9-c5ec-41ee-83f5-3b031c20c877" containerID="bb678afc7bfbc641254b64f4aa04b60a3365d4a86fac8399cf5e5da745578fdb" exitCode=0 Mar 14 09:16:49 crc kubenswrapper[4886]: I0314 09:16:49.529947 4886 generic.go:334] "Generic (PLEG): container finished" podID="cf4e51e9-c5ec-41ee-83f5-3b031c20c877" containerID="38573258377a509e0a387421c64b2d3afe8291ce8182cc5132e20d24ba301499" exitCode=0 Mar 14 09:16:49 crc kubenswrapper[4886]: I0314 09:16:49.529537 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"cf4e51e9-c5ec-41ee-83f5-3b031c20c877","Type":"ContainerDied","Data":"7395d1173216948ed510ba033503e5c5f55b3f699fdca45e95a301c0f6e67729"} Mar 14 09:16:49 crc kubenswrapper[4886]: I0314 09:16:49.529988 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf4e51e9-c5ec-41ee-83f5-3b031c20c877","Type":"ContainerDied","Data":"bb678afc7bfbc641254b64f4aa04b60a3365d4a86fac8399cf5e5da745578fdb"} Mar 14 09:16:49 crc kubenswrapper[4886]: I0314 09:16:49.530005 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf4e51e9-c5ec-41ee-83f5-3b031c20c877","Type":"ContainerDied","Data":"38573258377a509e0a387421c64b2d3afe8291ce8182cc5132e20d24ba301499"} Mar 14 09:16:49 crc kubenswrapper[4886]: I0314 09:16:49.934580 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.000555 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-tls-assets\") pod \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.000675 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.000802 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.000836 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-config-out\") pod \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.000867 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-thanos-prometheus-http-client-file\") pod \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.000897 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-prometheus-metric-storage-rulefiles-2\") pod \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.000918 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-secret-combined-ca-bundle\") pod \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.000947 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-config\") pod 
\"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.001096 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\") pod \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.001233 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-web-config\") pod \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.001307 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-prometheus-metric-storage-rulefiles-1\") pod \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.001348 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6tkf\" (UniqueName: \"kubernetes.io/projected/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-kube-api-access-n6tkf\") pod \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.001383 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-prometheus-metric-storage-rulefiles-0\") pod \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\" (UID: \"cf4e51e9-c5ec-41ee-83f5-3b031c20c877\") " Mar 14 
09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.001846 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "cf4e51e9-c5ec-41ee-83f5-3b031c20c877" (UID: "cf4e51e9-c5ec-41ee-83f5-3b031c20c877"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.002195 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "cf4e51e9-c5ec-41ee-83f5-3b031c20c877" (UID: "cf4e51e9-c5ec-41ee-83f5-3b031c20c877"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.002348 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "cf4e51e9-c5ec-41ee-83f5-3b031c20c877" (UID: "cf4e51e9-c5ec-41ee-83f5-3b031c20c877"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.007555 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-config-out" (OuterVolumeSpecName: "config-out") pod "cf4e51e9-c5ec-41ee-83f5-3b031c20c877" (UID: "cf4e51e9-c5ec-41ee-83f5-3b031c20c877"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.007621 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "cf4e51e9-c5ec-41ee-83f5-3b031c20c877" (UID: "cf4e51e9-c5ec-41ee-83f5-3b031c20c877"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.007652 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-kube-api-access-n6tkf" (OuterVolumeSpecName: "kube-api-access-n6tkf") pod "cf4e51e9-c5ec-41ee-83f5-3b031c20c877" (UID: "cf4e51e9-c5ec-41ee-83f5-3b031c20c877"). InnerVolumeSpecName "kube-api-access-n6tkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.007654 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-config" (OuterVolumeSpecName: "config") pod "cf4e51e9-c5ec-41ee-83f5-3b031c20c877" (UID: "cf4e51e9-c5ec-41ee-83f5-3b031c20c877"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.009275 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "cf4e51e9-c5ec-41ee-83f5-3b031c20c877" (UID: "cf4e51e9-c5ec-41ee-83f5-3b031c20c877"). InnerVolumeSpecName "secret-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.011281 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "cf4e51e9-c5ec-41ee-83f5-3b031c20c877" (UID: "cf4e51e9-c5ec-41ee-83f5-3b031c20c877"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.013495 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "cf4e51e9-c5ec-41ee-83f5-3b031c20c877" (UID: "cf4e51e9-c5ec-41ee-83f5-3b031c20c877"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.026415 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "cf4e51e9-c5ec-41ee-83f5-3b031c20c877" (UID: "cf4e51e9-c5ec-41ee-83f5-3b031c20c877"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.042568 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "cf4e51e9-c5ec-41ee-83f5-3b031c20c877" (UID: "cf4e51e9-c5ec-41ee-83f5-3b031c20c877"). 
InnerVolumeSpecName "pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.094307 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-web-config" (OuterVolumeSpecName: "web-config") pod "cf4e51e9-c5ec-41ee-83f5-3b031c20c877" (UID: "cf4e51e9-c5ec-41ee-83f5-3b031c20c877"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.104500 4886 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.104547 4886 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.104562 4886 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.104575 4886 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.104623 4886 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\") on 
node \"crc\" " Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.104640 4886 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-web-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.104655 4886 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.104668 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6tkf\" (UniqueName: \"kubernetes.io/projected/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-kube-api-access-n6tkf\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.104679 4886 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.104693 4886 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-tls-assets\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.104706 4886 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.104720 4886 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" 
(UniqueName: \"kubernetes.io/secret/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.104733 4886 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cf4e51e9-c5ec-41ee-83f5-3b031c20c877-config-out\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.128397 4886 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.128537 4886 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069") on node "crc" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.206444 4886 reconciler_common.go:293] "Volume detached for volume \"pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\") on node \"crc\" DevicePath \"\"" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.542869 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf4e51e9-c5ec-41ee-83f5-3b031c20c877","Type":"ContainerDied","Data":"ae2fe54ce8b01bf34b65779aef018ce89a7d8c3f33690c1e8a216d329447c737"} Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.543340 4886 scope.go:117] "RemoveContainer" containerID="7395d1173216948ed510ba033503e5c5f55b3f699fdca45e95a301c0f6e67729" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.543007 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.570992 4886 scope.go:117] "RemoveContainer" containerID="bb678afc7bfbc641254b64f4aa04b60a3365d4a86fac8399cf5e5da745578fdb" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.596929 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.603226 4886 scope.go:117] "RemoveContainer" containerID="38573258377a509e0a387421c64b2d3afe8291ce8182cc5132e20d24ba301499" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.608650 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.630533 4886 scope.go:117] "RemoveContainer" containerID="ca53866a18c29a479218672cd44e44157f683bc6a1b3b954cf62e976e738cfd3" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.636336 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 14 09:16:50 crc kubenswrapper[4886]: E0314 09:16:50.638705 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd34e59d-c7f3-48bd-bbc0-d71723e4f1ae" containerName="oc" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.638734 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd34e59d-c7f3-48bd-bbc0-d71723e4f1ae" containerName="oc" Mar 14 09:16:50 crc kubenswrapper[4886]: E0314 09:16:50.638750 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf4e51e9-c5ec-41ee-83f5-3b031c20c877" containerName="init-config-reloader" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.638758 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf4e51e9-c5ec-41ee-83f5-3b031c20c877" containerName="init-config-reloader" Mar 14 09:16:50 crc kubenswrapper[4886]: E0314 09:16:50.638769 4886 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e115c62e-eb7f-41a9-a613-7523bcfc2e90" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.638779 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="e115c62e-eb7f-41a9-a613-7523bcfc2e90" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 14 09:16:50 crc kubenswrapper[4886]: E0314 09:16:50.638809 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf4e51e9-c5ec-41ee-83f5-3b031c20c877" containerName="config-reloader" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.638817 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf4e51e9-c5ec-41ee-83f5-3b031c20c877" containerName="config-reloader" Mar 14 09:16:50 crc kubenswrapper[4886]: E0314 09:16:50.638834 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf4e51e9-c5ec-41ee-83f5-3b031c20c877" containerName="thanos-sidecar" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.638841 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf4e51e9-c5ec-41ee-83f5-3b031c20c877" containerName="thanos-sidecar" Mar 14 09:16:50 crc kubenswrapper[4886]: E0314 09:16:50.638876 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf4e51e9-c5ec-41ee-83f5-3b031c20c877" containerName="prometheus" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.638882 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf4e51e9-c5ec-41ee-83f5-3b031c20c877" containerName="prometheus" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.639784 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="e115c62e-eb7f-41a9-a613-7523bcfc2e90" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.639808 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd34e59d-c7f3-48bd-bbc0-d71723e4f1ae" containerName="oc" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.639837 4886 
memory_manager.go:354] "RemoveStaleState removing state" podUID="cf4e51e9-c5ec-41ee-83f5-3b031c20c877" containerName="config-reloader" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.639872 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf4e51e9-c5ec-41ee-83f5-3b031c20c877" containerName="prometheus" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.639890 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf4e51e9-c5ec-41ee-83f5-3b031c20c877" containerName="thanos-sidecar" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.642639 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.647704 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.647952 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.648085 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.648318 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-nz9cv" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.648464 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.648956 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.650059 4886 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"prometheus-metric-storage-web-config" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.650748 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.668671 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.716697 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6d632d59-7754-43d9-9a6a-1e818a26a715-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.716757 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6d632d59-7754-43d9-9a6a-1e818a26a715-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.716785 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6d632d59-7754-43d9-9a6a-1e818a26a715-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.716812 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6d632d59-7754-43d9-9a6a-1e818a26a715-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " 
pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.716860 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6d632d59-7754-43d9-9a6a-1e818a26a715-config\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.716880 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/6d632d59-7754-43d9-9a6a-1e818a26a715-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.716932 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d632d59-7754-43d9-9a6a-1e818a26a715-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.716950 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/6d632d59-7754-43d9-9a6a-1e818a26a715-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.716973 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" 
(UniqueName: \"kubernetes.io/secret/6d632d59-7754-43d9-9a6a-1e818a26a715-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.717003 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mxbb\" (UniqueName: \"kubernetes.io/projected/6d632d59-7754-43d9-9a6a-1e818a26a715-kube-api-access-4mxbb\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.717043 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6d632d59-7754-43d9-9a6a-1e818a26a715-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.717072 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.717103 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/6d632d59-7754-43d9-9a6a-1e818a26a715-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" 
(UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.819043 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mxbb\" (UniqueName: \"kubernetes.io/projected/6d632d59-7754-43d9-9a6a-1e818a26a715-kube-api-access-4mxbb\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.819218 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6d632d59-7754-43d9-9a6a-1e818a26a715-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.819291 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.819345 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/6d632d59-7754-43d9-9a6a-1e818a26a715-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.819454 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" 
(UniqueName: \"kubernetes.io/projected/6d632d59-7754-43d9-9a6a-1e818a26a715-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.819500 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6d632d59-7754-43d9-9a6a-1e818a26a715-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.819530 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6d632d59-7754-43d9-9a6a-1e818a26a715-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.819555 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6d632d59-7754-43d9-9a6a-1e818a26a715-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.819655 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6d632d59-7754-43d9-9a6a-1e818a26a715-config\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.819685 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: 
\"kubernetes.io/configmap/6d632d59-7754-43d9-9a6a-1e818a26a715-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.819750 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/6d632d59-7754-43d9-9a6a-1e818a26a715-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.819780 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d632d59-7754-43d9-9a6a-1e818a26a715-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.819832 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/6d632d59-7754-43d9-9a6a-1e818a26a715-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.819953 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6d632d59-7754-43d9-9a6a-1e818a26a715-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc 
kubenswrapper[4886]: I0314 09:16:50.820760 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/6d632d59-7754-43d9-9a6a-1e818a26a715-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.821867 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/6d632d59-7754-43d9-9a6a-1e818a26a715-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.823707 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/6d632d59-7754-43d9-9a6a-1e818a26a715-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.824017 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6d632d59-7754-43d9-9a6a-1e818a26a715-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.824558 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6d632d59-7754-43d9-9a6a-1e818a26a715-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" 
(UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.824854 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6d632d59-7754-43d9-9a6a-1e818a26a715-config\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.825188 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6d632d59-7754-43d9-9a6a-1e818a26a715-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.830206 4886 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.830527 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/915b9721c137ba3e3acc5e7d0fcf048ab5161bf9eea8563b49a63a650ee09ff7/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.830204 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/6d632d59-7754-43d9-9a6a-1e818a26a715-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.830604 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6d632d59-7754-43d9-9a6a-1e818a26a715-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.830436 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d632d59-7754-43d9-9a6a-1e818a26a715-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.838116 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4mxbb\" (UniqueName: \"kubernetes.io/projected/6d632d59-7754-43d9-9a6a-1e818a26a715-kube-api-access-4mxbb\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:50 crc kubenswrapper[4886]: I0314 09:16:50.866201 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a860c072-6b7b-4ee3-b876-fe8f34e70069\") pod \"prometheus-metric-storage-0\" (UID: \"6d632d59-7754-43d9-9a6a-1e818a26a715\") " pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:51 crc kubenswrapper[4886]: I0314 09:16:51.049175 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 14 09:16:51 crc kubenswrapper[4886]: I0314 09:16:51.438530 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf4e51e9-c5ec-41ee-83f5-3b031c20c877" path="/var/lib/kubelet/pods/cf4e51e9-c5ec-41ee-83f5-3b031c20c877/volumes" Mar 14 09:16:51 crc kubenswrapper[4886]: I0314 09:16:51.532109 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 14 09:16:51 crc kubenswrapper[4886]: I0314 09:16:51.551164 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6d632d59-7754-43d9-9a6a-1e818a26a715","Type":"ContainerStarted","Data":"bb2c8a3bce71861dcf1bcc70f4b1e475cf92f7a49cbf9c9efee6aff7316d8220"} Mar 14 09:16:52 crc kubenswrapper[4886]: I0314 09:16:52.527926 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s8vk7"] Mar 14 09:16:52 crc kubenswrapper[4886]: I0314 09:16:52.530432 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s8vk7" Mar 14 09:16:52 crc kubenswrapper[4886]: I0314 09:16:52.552774 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s8vk7"] Mar 14 09:16:52 crc kubenswrapper[4886]: I0314 09:16:52.564358 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx89v\" (UniqueName: \"kubernetes.io/projected/3c5a14e8-e418-4da9-b1be-ab8f4e69716c-kube-api-access-jx89v\") pod \"redhat-marketplace-s8vk7\" (UID: \"3c5a14e8-e418-4da9-b1be-ab8f4e69716c\") " pod="openshift-marketplace/redhat-marketplace-s8vk7" Mar 14 09:16:52 crc kubenswrapper[4886]: I0314 09:16:52.564748 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c5a14e8-e418-4da9-b1be-ab8f4e69716c-catalog-content\") pod \"redhat-marketplace-s8vk7\" (UID: \"3c5a14e8-e418-4da9-b1be-ab8f4e69716c\") " pod="openshift-marketplace/redhat-marketplace-s8vk7" Mar 14 09:16:52 crc kubenswrapper[4886]: I0314 09:16:52.564848 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c5a14e8-e418-4da9-b1be-ab8f4e69716c-utilities\") pod \"redhat-marketplace-s8vk7\" (UID: \"3c5a14e8-e418-4da9-b1be-ab8f4e69716c\") " pod="openshift-marketplace/redhat-marketplace-s8vk7" Mar 14 09:16:52 crc kubenswrapper[4886]: I0314 09:16:52.667230 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx89v\" (UniqueName: \"kubernetes.io/projected/3c5a14e8-e418-4da9-b1be-ab8f4e69716c-kube-api-access-jx89v\") pod \"redhat-marketplace-s8vk7\" (UID: \"3c5a14e8-e418-4da9-b1be-ab8f4e69716c\") " pod="openshift-marketplace/redhat-marketplace-s8vk7" Mar 14 09:16:52 crc kubenswrapper[4886]: I0314 09:16:52.667281 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c5a14e8-e418-4da9-b1be-ab8f4e69716c-catalog-content\") pod \"redhat-marketplace-s8vk7\" (UID: \"3c5a14e8-e418-4da9-b1be-ab8f4e69716c\") " pod="openshift-marketplace/redhat-marketplace-s8vk7" Mar 14 09:16:52 crc kubenswrapper[4886]: I0314 09:16:52.667316 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c5a14e8-e418-4da9-b1be-ab8f4e69716c-utilities\") pod \"redhat-marketplace-s8vk7\" (UID: \"3c5a14e8-e418-4da9-b1be-ab8f4e69716c\") " pod="openshift-marketplace/redhat-marketplace-s8vk7" Mar 14 09:16:52 crc kubenswrapper[4886]: I0314 09:16:52.667891 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c5a14e8-e418-4da9-b1be-ab8f4e69716c-catalog-content\") pod \"redhat-marketplace-s8vk7\" (UID: \"3c5a14e8-e418-4da9-b1be-ab8f4e69716c\") " pod="openshift-marketplace/redhat-marketplace-s8vk7" Mar 14 09:16:52 crc kubenswrapper[4886]: I0314 09:16:52.668573 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c5a14e8-e418-4da9-b1be-ab8f4e69716c-utilities\") pod \"redhat-marketplace-s8vk7\" (UID: \"3c5a14e8-e418-4da9-b1be-ab8f4e69716c\") " pod="openshift-marketplace/redhat-marketplace-s8vk7" Mar 14 09:16:52 crc kubenswrapper[4886]: I0314 09:16:52.685488 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx89v\" (UniqueName: \"kubernetes.io/projected/3c5a14e8-e418-4da9-b1be-ab8f4e69716c-kube-api-access-jx89v\") pod \"redhat-marketplace-s8vk7\" (UID: \"3c5a14e8-e418-4da9-b1be-ab8f4e69716c\") " pod="openshift-marketplace/redhat-marketplace-s8vk7" Mar 14 09:16:52 crc kubenswrapper[4886]: I0314 09:16:52.854042 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s8vk7" Mar 14 09:16:53 crc kubenswrapper[4886]: I0314 09:16:53.416510 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s8vk7"] Mar 14 09:16:53 crc kubenswrapper[4886]: W0314 09:16:53.480420 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c5a14e8_e418_4da9_b1be_ab8f4e69716c.slice/crio-810dc1e20df06901f80955dd7230fcb5cc5dbfdf9cb4d5dc7710a57a05f37e91 WatchSource:0}: Error finding container 810dc1e20df06901f80955dd7230fcb5cc5dbfdf9cb4d5dc7710a57a05f37e91: Status 404 returned error can't find the container with id 810dc1e20df06901f80955dd7230fcb5cc5dbfdf9cb4d5dc7710a57a05f37e91 Mar 14 09:16:53 crc kubenswrapper[4886]: I0314 09:16:53.578190 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8vk7" event={"ID":"3c5a14e8-e418-4da9-b1be-ab8f4e69716c","Type":"ContainerStarted","Data":"810dc1e20df06901f80955dd7230fcb5cc5dbfdf9cb4d5dc7710a57a05f37e91"} Mar 14 09:16:54 crc kubenswrapper[4886]: I0314 09:16:54.593437 4886 generic.go:334] "Generic (PLEG): container finished" podID="3c5a14e8-e418-4da9-b1be-ab8f4e69716c" containerID="ff147e249f46c2787fa16308592ead512ae5441b0894a504d384f36711838949" exitCode=0 Mar 14 09:16:54 crc kubenswrapper[4886]: I0314 09:16:54.593501 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8vk7" event={"ID":"3c5a14e8-e418-4da9-b1be-ab8f4e69716c","Type":"ContainerDied","Data":"ff147e249f46c2787fa16308592ead512ae5441b0894a504d384f36711838949"} Mar 14 09:16:55 crc kubenswrapper[4886]: I0314 09:16:55.604914 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"6d632d59-7754-43d9-9a6a-1e818a26a715","Type":"ContainerStarted","Data":"dc27f665560b1e724e6787f33a902a29995b0c9d38beb53d0c0bb6b50aad849c"} Mar 14 09:16:55 crc kubenswrapper[4886]: I0314 09:16:55.607497 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8vk7" event={"ID":"3c5a14e8-e418-4da9-b1be-ab8f4e69716c","Type":"ContainerStarted","Data":"ed2ec8015b5cc1e9393ae078a8ae63f314793b10f88891788cf08d627b483b28"} Mar 14 09:16:56 crc kubenswrapper[4886]: I0314 09:16:56.066729 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:16:56 crc kubenswrapper[4886]: I0314 09:16:56.066800 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:16:56 crc kubenswrapper[4886]: I0314 09:16:56.066849 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 09:16:56 crc kubenswrapper[4886]: I0314 09:16:56.067711 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba"} pod="openshift-machine-config-operator/machine-config-daemon-ddctv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 09:16:56 crc kubenswrapper[4886]: I0314 09:16:56.067793 4886 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" containerID="cri-o://5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba" gracePeriod=600 Mar 14 09:16:56 crc kubenswrapper[4886]: E0314 09:16:56.191101 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:16:56 crc kubenswrapper[4886]: I0314 09:16:56.622355 4886 generic.go:334] "Generic (PLEG): container finished" podID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerID="5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba" exitCode=0 Mar 14 09:16:56 crc kubenswrapper[4886]: I0314 09:16:56.622434 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerDied","Data":"5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba"} Mar 14 09:16:56 crc kubenswrapper[4886]: I0314 09:16:56.622727 4886 scope.go:117] "RemoveContainer" containerID="a2809cdb5d4b2bdb4e85c97861df551fd5362e454f85257e7b9dae7d20dccedf" Mar 14 09:16:56 crc kubenswrapper[4886]: I0314 09:16:56.623552 4886 scope.go:117] "RemoveContainer" containerID="5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba" Mar 14 09:16:56 crc kubenswrapper[4886]: E0314 09:16:56.623849 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:16:56 crc kubenswrapper[4886]: I0314 09:16:56.626901 4886 generic.go:334] "Generic (PLEG): container finished" podID="3c5a14e8-e418-4da9-b1be-ab8f4e69716c" containerID="ed2ec8015b5cc1e9393ae078a8ae63f314793b10f88891788cf08d627b483b28" exitCode=0 Mar 14 09:16:56 crc kubenswrapper[4886]: I0314 09:16:56.626965 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8vk7" event={"ID":"3c5a14e8-e418-4da9-b1be-ab8f4e69716c","Type":"ContainerDied","Data":"ed2ec8015b5cc1e9393ae078a8ae63f314793b10f88891788cf08d627b483b28"} Mar 14 09:16:57 crc kubenswrapper[4886]: I0314 09:16:57.639076 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8vk7" event={"ID":"3c5a14e8-e418-4da9-b1be-ab8f4e69716c","Type":"ContainerStarted","Data":"8a13a0b3c2e2ecac03f889abcb6f13f21fb105c205057cd5cf1777e081a09573"} Mar 14 09:16:57 crc kubenswrapper[4886]: I0314 09:16:57.664501 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s8vk7" podStartSLOduration=3.188037646 podStartE2EDuration="5.66447185s" podCreationTimestamp="2026-03-14 09:16:52 +0000 UTC" firstStartedPulling="2026-03-14 09:16:54.595395518 +0000 UTC m=+2949.843847175" lastFinishedPulling="2026-03-14 09:16:57.071829742 +0000 UTC m=+2952.320281379" observedRunningTime="2026-03-14 09:16:57.656800642 +0000 UTC m=+2952.905252289" watchObservedRunningTime="2026-03-14 09:16:57.66447185 +0000 UTC m=+2952.912923487" Mar 14 09:17:02 crc kubenswrapper[4886]: I0314 09:17:02.854215 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s8vk7" Mar 14 09:17:02 crc kubenswrapper[4886]: I0314 09:17:02.854768 
4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s8vk7" Mar 14 09:17:02 crc kubenswrapper[4886]: I0314 09:17:02.905918 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s8vk7" Mar 14 09:17:03 crc kubenswrapper[4886]: I0314 09:17:03.706770 4886 generic.go:334] "Generic (PLEG): container finished" podID="6d632d59-7754-43d9-9a6a-1e818a26a715" containerID="dc27f665560b1e724e6787f33a902a29995b0c9d38beb53d0c0bb6b50aad849c" exitCode=0 Mar 14 09:17:03 crc kubenswrapper[4886]: I0314 09:17:03.706884 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6d632d59-7754-43d9-9a6a-1e818a26a715","Type":"ContainerDied","Data":"dc27f665560b1e724e6787f33a902a29995b0c9d38beb53d0c0bb6b50aad849c"} Mar 14 09:17:03 crc kubenswrapper[4886]: I0314 09:17:03.786751 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s8vk7" Mar 14 09:17:03 crc kubenswrapper[4886]: I0314 09:17:03.889539 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s8vk7"] Mar 14 09:17:04 crc kubenswrapper[4886]: I0314 09:17:04.727175 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6d632d59-7754-43d9-9a6a-1e818a26a715","Type":"ContainerStarted","Data":"8283b41de23377d0ac26049684bf88b13d36bb50b395028552702f3463e385df"} Mar 14 09:17:05 crc kubenswrapper[4886]: I0314 09:17:05.739657 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-s8vk7" podUID="3c5a14e8-e418-4da9-b1be-ab8f4e69716c" containerName="registry-server" containerID="cri-o://8a13a0b3c2e2ecac03f889abcb6f13f21fb105c205057cd5cf1777e081a09573" gracePeriod=2 Mar 14 09:17:06 crc kubenswrapper[4886]: I0314 09:17:06.545138 4886 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s8vk7" Mar 14 09:17:06 crc kubenswrapper[4886]: I0314 09:17:06.634396 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c5a14e8-e418-4da9-b1be-ab8f4e69716c-utilities\") pod \"3c5a14e8-e418-4da9-b1be-ab8f4e69716c\" (UID: \"3c5a14e8-e418-4da9-b1be-ab8f4e69716c\") " Mar 14 09:17:06 crc kubenswrapper[4886]: I0314 09:17:06.634838 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c5a14e8-e418-4da9-b1be-ab8f4e69716c-catalog-content\") pod \"3c5a14e8-e418-4da9-b1be-ab8f4e69716c\" (UID: \"3c5a14e8-e418-4da9-b1be-ab8f4e69716c\") " Mar 14 09:17:06 crc kubenswrapper[4886]: I0314 09:17:06.635010 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx89v\" (UniqueName: \"kubernetes.io/projected/3c5a14e8-e418-4da9-b1be-ab8f4e69716c-kube-api-access-jx89v\") pod \"3c5a14e8-e418-4da9-b1be-ab8f4e69716c\" (UID: \"3c5a14e8-e418-4da9-b1be-ab8f4e69716c\") " Mar 14 09:17:06 crc kubenswrapper[4886]: I0314 09:17:06.635377 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c5a14e8-e418-4da9-b1be-ab8f4e69716c-utilities" (OuterVolumeSpecName: "utilities") pod "3c5a14e8-e418-4da9-b1be-ab8f4e69716c" (UID: "3c5a14e8-e418-4da9-b1be-ab8f4e69716c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:17:06 crc kubenswrapper[4886]: I0314 09:17:06.635838 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c5a14e8-e418-4da9-b1be-ab8f4e69716c-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:17:06 crc kubenswrapper[4886]: I0314 09:17:06.681787 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c5a14e8-e418-4da9-b1be-ab8f4e69716c-kube-api-access-jx89v" (OuterVolumeSpecName: "kube-api-access-jx89v") pod "3c5a14e8-e418-4da9-b1be-ab8f4e69716c" (UID: "3c5a14e8-e418-4da9-b1be-ab8f4e69716c"). InnerVolumeSpecName "kube-api-access-jx89v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:17:06 crc kubenswrapper[4886]: I0314 09:17:06.682776 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c5a14e8-e418-4da9-b1be-ab8f4e69716c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3c5a14e8-e418-4da9-b1be-ab8f4e69716c" (UID: "3c5a14e8-e418-4da9-b1be-ab8f4e69716c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:17:06 crc kubenswrapper[4886]: I0314 09:17:06.738157 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c5a14e8-e418-4da9-b1be-ab8f4e69716c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:17:06 crc kubenswrapper[4886]: I0314 09:17:06.738438 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx89v\" (UniqueName: \"kubernetes.io/projected/3c5a14e8-e418-4da9-b1be-ab8f4e69716c-kube-api-access-jx89v\") on node \"crc\" DevicePath \"\"" Mar 14 09:17:06 crc kubenswrapper[4886]: I0314 09:17:06.753295 4886 generic.go:334] "Generic (PLEG): container finished" podID="3c5a14e8-e418-4da9-b1be-ab8f4e69716c" containerID="8a13a0b3c2e2ecac03f889abcb6f13f21fb105c205057cd5cf1777e081a09573" exitCode=0 Mar 14 09:17:06 crc kubenswrapper[4886]: I0314 09:17:06.753348 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8vk7" event={"ID":"3c5a14e8-e418-4da9-b1be-ab8f4e69716c","Type":"ContainerDied","Data":"8a13a0b3c2e2ecac03f889abcb6f13f21fb105c205057cd5cf1777e081a09573"} Mar 14 09:17:06 crc kubenswrapper[4886]: I0314 09:17:06.753378 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8vk7" event={"ID":"3c5a14e8-e418-4da9-b1be-ab8f4e69716c","Type":"ContainerDied","Data":"810dc1e20df06901f80955dd7230fcb5cc5dbfdf9cb4d5dc7710a57a05f37e91"} Mar 14 09:17:06 crc kubenswrapper[4886]: I0314 09:17:06.753391 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s8vk7" Mar 14 09:17:06 crc kubenswrapper[4886]: I0314 09:17:06.753401 4886 scope.go:117] "RemoveContainer" containerID="8a13a0b3c2e2ecac03f889abcb6f13f21fb105c205057cd5cf1777e081a09573" Mar 14 09:17:06 crc kubenswrapper[4886]: I0314 09:17:06.781865 4886 scope.go:117] "RemoveContainer" containerID="ed2ec8015b5cc1e9393ae078a8ae63f314793b10f88891788cf08d627b483b28" Mar 14 09:17:06 crc kubenswrapper[4886]: I0314 09:17:06.800031 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s8vk7"] Mar 14 09:17:06 crc kubenswrapper[4886]: I0314 09:17:06.811734 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s8vk7"] Mar 14 09:17:06 crc kubenswrapper[4886]: I0314 09:17:06.813032 4886 scope.go:117] "RemoveContainer" containerID="ff147e249f46c2787fa16308592ead512ae5441b0894a504d384f36711838949" Mar 14 09:17:06 crc kubenswrapper[4886]: I0314 09:17:06.855617 4886 scope.go:117] "RemoveContainer" containerID="8a13a0b3c2e2ecac03f889abcb6f13f21fb105c205057cd5cf1777e081a09573" Mar 14 09:17:06 crc kubenswrapper[4886]: E0314 09:17:06.856232 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a13a0b3c2e2ecac03f889abcb6f13f21fb105c205057cd5cf1777e081a09573\": container with ID starting with 8a13a0b3c2e2ecac03f889abcb6f13f21fb105c205057cd5cf1777e081a09573 not found: ID does not exist" containerID="8a13a0b3c2e2ecac03f889abcb6f13f21fb105c205057cd5cf1777e081a09573" Mar 14 09:17:06 crc kubenswrapper[4886]: I0314 09:17:06.856375 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a13a0b3c2e2ecac03f889abcb6f13f21fb105c205057cd5cf1777e081a09573"} err="failed to get container status \"8a13a0b3c2e2ecac03f889abcb6f13f21fb105c205057cd5cf1777e081a09573\": rpc error: code = NotFound desc = could not find container 
\"8a13a0b3c2e2ecac03f889abcb6f13f21fb105c205057cd5cf1777e081a09573\": container with ID starting with 8a13a0b3c2e2ecac03f889abcb6f13f21fb105c205057cd5cf1777e081a09573 not found: ID does not exist" Mar 14 09:17:06 crc kubenswrapper[4886]: I0314 09:17:06.856477 4886 scope.go:117] "RemoveContainer" containerID="ed2ec8015b5cc1e9393ae078a8ae63f314793b10f88891788cf08d627b483b28" Mar 14 09:17:06 crc kubenswrapper[4886]: E0314 09:17:06.857215 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed2ec8015b5cc1e9393ae078a8ae63f314793b10f88891788cf08d627b483b28\": container with ID starting with ed2ec8015b5cc1e9393ae078a8ae63f314793b10f88891788cf08d627b483b28 not found: ID does not exist" containerID="ed2ec8015b5cc1e9393ae078a8ae63f314793b10f88891788cf08d627b483b28" Mar 14 09:17:06 crc kubenswrapper[4886]: I0314 09:17:06.857252 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed2ec8015b5cc1e9393ae078a8ae63f314793b10f88891788cf08d627b483b28"} err="failed to get container status \"ed2ec8015b5cc1e9393ae078a8ae63f314793b10f88891788cf08d627b483b28\": rpc error: code = NotFound desc = could not find container \"ed2ec8015b5cc1e9393ae078a8ae63f314793b10f88891788cf08d627b483b28\": container with ID starting with ed2ec8015b5cc1e9393ae078a8ae63f314793b10f88891788cf08d627b483b28 not found: ID does not exist" Mar 14 09:17:06 crc kubenswrapper[4886]: I0314 09:17:06.857279 4886 scope.go:117] "RemoveContainer" containerID="ff147e249f46c2787fa16308592ead512ae5441b0894a504d384f36711838949" Mar 14 09:17:06 crc kubenswrapper[4886]: E0314 09:17:06.857724 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff147e249f46c2787fa16308592ead512ae5441b0894a504d384f36711838949\": container with ID starting with ff147e249f46c2787fa16308592ead512ae5441b0894a504d384f36711838949 not found: ID does not exist" 
containerID="ff147e249f46c2787fa16308592ead512ae5441b0894a504d384f36711838949" Mar 14 09:17:06 crc kubenswrapper[4886]: I0314 09:17:06.857754 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff147e249f46c2787fa16308592ead512ae5441b0894a504d384f36711838949"} err="failed to get container status \"ff147e249f46c2787fa16308592ead512ae5441b0894a504d384f36711838949\": rpc error: code = NotFound desc = could not find container \"ff147e249f46c2787fa16308592ead512ae5441b0894a504d384f36711838949\": container with ID starting with ff147e249f46c2787fa16308592ead512ae5441b0894a504d384f36711838949 not found: ID does not exist" Mar 14 09:17:07 crc kubenswrapper[4886]: I0314 09:17:07.436622 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c5a14e8-e418-4da9-b1be-ab8f4e69716c" path="/var/lib/kubelet/pods/3c5a14e8-e418-4da9-b1be-ab8f4e69716c/volumes" Mar 14 09:17:08 crc kubenswrapper[4886]: I0314 09:17:08.779609 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6d632d59-7754-43d9-9a6a-1e818a26a715","Type":"ContainerStarted","Data":"e3047a66490d06f61423d5b34f66f2823a18ed68df1f7a0e063985a5e74900ce"} Mar 14 09:17:08 crc kubenswrapper[4886]: I0314 09:17:08.781075 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6d632d59-7754-43d9-9a6a-1e818a26a715","Type":"ContainerStarted","Data":"e1642ea87b093e954cac0539eb0e8e7760a5c1fd97d47c9d2d210b252f949a42"} Mar 14 09:17:08 crc kubenswrapper[4886]: I0314 09:17:08.808758 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=18.808740422 podStartE2EDuration="18.808740422s" podCreationTimestamp="2026-03-14 09:16:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 
09:17:08.802139294 +0000 UTC m=+2964.050590931" watchObservedRunningTime="2026-03-14 09:17:08.808740422 +0000 UTC m=+2964.057192059" Mar 14 09:17:11 crc kubenswrapper[4886]: I0314 09:17:11.050064 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 14 09:17:11 crc kubenswrapper[4886]: I0314 09:17:11.421618 4886 scope.go:117] "RemoveContainer" containerID="5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba" Mar 14 09:17:11 crc kubenswrapper[4886]: E0314 09:17:11.422075 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:17:21 crc kubenswrapper[4886]: I0314 09:17:21.049975 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 14 09:17:21 crc kubenswrapper[4886]: I0314 09:17:21.055481 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 14 09:17:21 crc kubenswrapper[4886]: I0314 09:17:21.945045 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 14 09:17:26 crc kubenswrapper[4886]: I0314 09:17:26.441898 4886 scope.go:117] "RemoveContainer" containerID="5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba" Mar 14 09:17:26 crc kubenswrapper[4886]: E0314 09:17:26.442673 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:17:37 crc kubenswrapper[4886]: I0314 09:17:37.420229 4886 scope.go:117] "RemoveContainer" containerID="5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba" Mar 14 09:17:37 crc kubenswrapper[4886]: E0314 09:17:37.421796 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.682899 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 14 09:17:41 crc kubenswrapper[4886]: E0314 09:17:41.684018 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c5a14e8-e418-4da9-b1be-ab8f4e69716c" containerName="extract-content" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.684037 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c5a14e8-e418-4da9-b1be-ab8f4e69716c" containerName="extract-content" Mar 14 09:17:41 crc kubenswrapper[4886]: E0314 09:17:41.684060 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c5a14e8-e418-4da9-b1be-ab8f4e69716c" containerName="registry-server" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.684070 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c5a14e8-e418-4da9-b1be-ab8f4e69716c" containerName="registry-server" Mar 14 09:17:41 crc kubenswrapper[4886]: E0314 09:17:41.684103 4886 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3c5a14e8-e418-4da9-b1be-ab8f4e69716c" containerName="extract-utilities" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.684140 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c5a14e8-e418-4da9-b1be-ab8f4e69716c" containerName="extract-utilities" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.684570 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c5a14e8-e418-4da9-b1be-ab8f4e69716c" containerName="registry-server" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.685594 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.689461 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.689723 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.689934 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-482rf" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.690149 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.694744 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.810884 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c4cedac0-b804-4ea3-b548-f2871b24d70a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " pod="openstack/tempest-tests-tempest" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.810928 4886 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wctpv\" (UniqueName: \"kubernetes.io/projected/c4cedac0-b804-4ea3-b548-f2871b24d70a-kube-api-access-wctpv\") pod \"tempest-tests-tempest\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " pod="openstack/tempest-tests-tempest" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.810953 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c4cedac0-b804-4ea3-b548-f2871b24d70a-config-data\") pod \"tempest-tests-tempest\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " pod="openstack/tempest-tests-tempest" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.811259 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c4cedac0-b804-4ea3-b548-f2871b24d70a-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " pod="openstack/tempest-tests-tempest" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.811409 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c4cedac0-b804-4ea3-b548-f2871b24d70a-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " pod="openstack/tempest-tests-tempest" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.811455 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4cedac0-b804-4ea3-b548-f2871b24d70a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " pod="openstack/tempest-tests-tempest" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.811542 4886 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " pod="openstack/tempest-tests-tempest" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.811593 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c4cedac0-b804-4ea3-b548-f2871b24d70a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " pod="openstack/tempest-tests-tempest" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.811654 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c4cedac0-b804-4ea3-b548-f2871b24d70a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " pod="openstack/tempest-tests-tempest" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.914710 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c4cedac0-b804-4ea3-b548-f2871b24d70a-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " pod="openstack/tempest-tests-tempest" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.914765 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4cedac0-b804-4ea3-b548-f2871b24d70a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " pod="openstack/tempest-tests-tempest" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.914806 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " pod="openstack/tempest-tests-tempest" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.914837 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c4cedac0-b804-4ea3-b548-f2871b24d70a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " pod="openstack/tempest-tests-tempest" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.914937 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c4cedac0-b804-4ea3-b548-f2871b24d70a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " pod="openstack/tempest-tests-tempest" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.915062 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c4cedac0-b804-4ea3-b548-f2871b24d70a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " pod="openstack/tempest-tests-tempest" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.915093 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wctpv\" (UniqueName: \"kubernetes.io/projected/c4cedac0-b804-4ea3-b548-f2871b24d70a-kube-api-access-wctpv\") pod \"tempest-tests-tempest\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " pod="openstack/tempest-tests-tempest" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.915151 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/c4cedac0-b804-4ea3-b548-f2871b24d70a-config-data\") pod \"tempest-tests-tempest\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " pod="openstack/tempest-tests-tempest" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.915202 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c4cedac0-b804-4ea3-b548-f2871b24d70a-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " pod="openstack/tempest-tests-tempest" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.917839 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/tempest-tests-tempest" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.918094 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c4cedac0-b804-4ea3-b548-f2871b24d70a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " pod="openstack/tempest-tests-tempest" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.918447 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c4cedac0-b804-4ea3-b548-f2871b24d70a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " pod="openstack/tempest-tests-tempest" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.919023 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/c4cedac0-b804-4ea3-b548-f2871b24d70a-config-data\") pod \"tempest-tests-tempest\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " pod="openstack/tempest-tests-tempest" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.919253 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c4cedac0-b804-4ea3-b548-f2871b24d70a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " pod="openstack/tempest-tests-tempest" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.922203 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c4cedac0-b804-4ea3-b548-f2871b24d70a-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " pod="openstack/tempest-tests-tempest" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.922787 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c4cedac0-b804-4ea3-b548-f2871b24d70a-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " pod="openstack/tempest-tests-tempest" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.929678 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4cedac0-b804-4ea3-b548-f2871b24d70a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " pod="openstack/tempest-tests-tempest" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.943978 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wctpv\" (UniqueName: \"kubernetes.io/projected/c4cedac0-b804-4ea3-b548-f2871b24d70a-kube-api-access-wctpv\") pod \"tempest-tests-tempest\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " 
pod="openstack/tempest-tests-tempest" Mar 14 09:17:41 crc kubenswrapper[4886]: I0314 09:17:41.950909 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " pod="openstack/tempest-tests-tempest" Mar 14 09:17:42 crc kubenswrapper[4886]: I0314 09:17:42.011550 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 14 09:17:42 crc kubenswrapper[4886]: I0314 09:17:42.472808 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 14 09:17:43 crc kubenswrapper[4886]: I0314 09:17:43.164068 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"c4cedac0-b804-4ea3-b548-f2871b24d70a","Type":"ContainerStarted","Data":"c650eabf9ea79bf5035b4105778bd01aa0ce219c69691ec3196ca4f7cb09f4c5"} Mar 14 09:17:51 crc kubenswrapper[4886]: I0314 09:17:51.420605 4886 scope.go:117] "RemoveContainer" containerID="5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba" Mar 14 09:17:51 crc kubenswrapper[4886]: E0314 09:17:51.421487 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:17:52 crc kubenswrapper[4886]: I0314 09:17:52.092467 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 14 09:17:53 crc kubenswrapper[4886]: I0314 09:17:53.294115 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tempest-tests-tempest" event={"ID":"c4cedac0-b804-4ea3-b548-f2871b24d70a","Type":"ContainerStarted","Data":"7bee4e63bd472f1fe2446c03706d643787848d50a8b39038a6e614a5cf0639a6"} Mar 14 09:17:53 crc kubenswrapper[4886]: I0314 09:17:53.315274 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.70451349 podStartE2EDuration="13.315254934s" podCreationTimestamp="2026-03-14 09:17:40 +0000 UTC" firstStartedPulling="2026-03-14 09:17:42.478778449 +0000 UTC m=+2997.727230086" lastFinishedPulling="2026-03-14 09:17:52.089519893 +0000 UTC m=+3007.337971530" observedRunningTime="2026-03-14 09:17:53.309682215 +0000 UTC m=+3008.558133862" watchObservedRunningTime="2026-03-14 09:17:53.315254934 +0000 UTC m=+3008.563706571" Mar 14 09:18:00 crc kubenswrapper[4886]: I0314 09:18:00.171541 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557998-68vzq"] Mar 14 09:18:00 crc kubenswrapper[4886]: I0314 09:18:00.175136 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557998-68vzq" Mar 14 09:18:00 crc kubenswrapper[4886]: I0314 09:18:00.178666 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:18:00 crc kubenswrapper[4886]: I0314 09:18:00.179467 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:18:00 crc kubenswrapper[4886]: I0314 09:18:00.180754 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 09:18:00 crc kubenswrapper[4886]: I0314 09:18:00.200688 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557998-68vzq"] Mar 14 09:18:00 crc kubenswrapper[4886]: I0314 09:18:00.330423 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj6nb\" (UniqueName: \"kubernetes.io/projected/bf1dc598-0f17-412b-a931-b2b36433ac86-kube-api-access-rj6nb\") pod \"auto-csr-approver-29557998-68vzq\" (UID: \"bf1dc598-0f17-412b-a931-b2b36433ac86\") " pod="openshift-infra/auto-csr-approver-29557998-68vzq" Mar 14 09:18:00 crc kubenswrapper[4886]: I0314 09:18:00.433280 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj6nb\" (UniqueName: \"kubernetes.io/projected/bf1dc598-0f17-412b-a931-b2b36433ac86-kube-api-access-rj6nb\") pod \"auto-csr-approver-29557998-68vzq\" (UID: \"bf1dc598-0f17-412b-a931-b2b36433ac86\") " pod="openshift-infra/auto-csr-approver-29557998-68vzq" Mar 14 09:18:00 crc kubenswrapper[4886]: I0314 09:18:00.481056 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj6nb\" (UniqueName: \"kubernetes.io/projected/bf1dc598-0f17-412b-a931-b2b36433ac86-kube-api-access-rj6nb\") pod \"auto-csr-approver-29557998-68vzq\" (UID: \"bf1dc598-0f17-412b-a931-b2b36433ac86\") " 
pod="openshift-infra/auto-csr-approver-29557998-68vzq" Mar 14 09:18:00 crc kubenswrapper[4886]: I0314 09:18:00.498085 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557998-68vzq" Mar 14 09:18:00 crc kubenswrapper[4886]: I0314 09:18:00.998909 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557998-68vzq"] Mar 14 09:18:01 crc kubenswrapper[4886]: I0314 09:18:01.380527 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557998-68vzq" event={"ID":"bf1dc598-0f17-412b-a931-b2b36433ac86","Type":"ContainerStarted","Data":"0c721e507d5fdfa2d86238adbfc948ddadf9bf8090696fb8e1aeef18a9d58091"} Mar 14 09:18:02 crc kubenswrapper[4886]: I0314 09:18:02.392700 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557998-68vzq" event={"ID":"bf1dc598-0f17-412b-a931-b2b36433ac86","Type":"ContainerStarted","Data":"8249cfb299bb9e98c88d880fef326340ba11702d8a0217605e5c4c0669df38bb"} Mar 14 09:18:02 crc kubenswrapper[4886]: I0314 09:18:02.409789 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557998-68vzq" podStartSLOduration=1.542746725 podStartE2EDuration="2.409770448s" podCreationTimestamp="2026-03-14 09:18:00 +0000 UTC" firstStartedPulling="2026-03-14 09:18:01.001610985 +0000 UTC m=+3016.250062632" lastFinishedPulling="2026-03-14 09:18:01.868634728 +0000 UTC m=+3017.117086355" observedRunningTime="2026-03-14 09:18:02.407069382 +0000 UTC m=+3017.655521029" watchObservedRunningTime="2026-03-14 09:18:02.409770448 +0000 UTC m=+3017.658222085" Mar 14 09:18:03 crc kubenswrapper[4886]: I0314 09:18:03.407520 4886 generic.go:334] "Generic (PLEG): container finished" podID="bf1dc598-0f17-412b-a931-b2b36433ac86" containerID="8249cfb299bb9e98c88d880fef326340ba11702d8a0217605e5c4c0669df38bb" exitCode=0 Mar 14 09:18:03 crc 
kubenswrapper[4886]: I0314 09:18:03.407663 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557998-68vzq" event={"ID":"bf1dc598-0f17-412b-a931-b2b36433ac86","Type":"ContainerDied","Data":"8249cfb299bb9e98c88d880fef326340ba11702d8a0217605e5c4c0669df38bb"} Mar 14 09:18:04 crc kubenswrapper[4886]: I0314 09:18:04.426174 4886 scope.go:117] "RemoveContainer" containerID="5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba" Mar 14 09:18:04 crc kubenswrapper[4886]: E0314 09:18:04.426836 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:18:04 crc kubenswrapper[4886]: I0314 09:18:04.917414 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557998-68vzq" Mar 14 09:18:04 crc kubenswrapper[4886]: I0314 09:18:04.997252 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rj6nb\" (UniqueName: \"kubernetes.io/projected/bf1dc598-0f17-412b-a931-b2b36433ac86-kube-api-access-rj6nb\") pod \"bf1dc598-0f17-412b-a931-b2b36433ac86\" (UID: \"bf1dc598-0f17-412b-a931-b2b36433ac86\") " Mar 14 09:18:05 crc kubenswrapper[4886]: I0314 09:18:05.002035 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf1dc598-0f17-412b-a931-b2b36433ac86-kube-api-access-rj6nb" (OuterVolumeSpecName: "kube-api-access-rj6nb") pod "bf1dc598-0f17-412b-a931-b2b36433ac86" (UID: "bf1dc598-0f17-412b-a931-b2b36433ac86"). InnerVolumeSpecName "kube-api-access-rj6nb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:18:05 crc kubenswrapper[4886]: I0314 09:18:05.100950 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rj6nb\" (UniqueName: \"kubernetes.io/projected/bf1dc598-0f17-412b-a931-b2b36433ac86-kube-api-access-rj6nb\") on node \"crc\" DevicePath \"\"" Mar 14 09:18:05 crc kubenswrapper[4886]: I0314 09:18:05.436084 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557998-68vzq" Mar 14 09:18:05 crc kubenswrapper[4886]: I0314 09:18:05.447294 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557998-68vzq" event={"ID":"bf1dc598-0f17-412b-a931-b2b36433ac86","Type":"ContainerDied","Data":"0c721e507d5fdfa2d86238adbfc948ddadf9bf8090696fb8e1aeef18a9d58091"} Mar 14 09:18:05 crc kubenswrapper[4886]: I0314 09:18:05.447336 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c721e507d5fdfa2d86238adbfc948ddadf9bf8090696fb8e1aeef18a9d58091" Mar 14 09:18:05 crc kubenswrapper[4886]: I0314 09:18:05.481493 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557992-dqcnn"] Mar 14 09:18:05 crc kubenswrapper[4886]: I0314 09:18:05.494752 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557992-dqcnn"] Mar 14 09:18:07 crc kubenswrapper[4886]: I0314 09:18:07.435726 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e881b5-4abd-4b66-90e3-b364535731ab" path="/var/lib/kubelet/pods/e7e881b5-4abd-4b66-90e3-b364535731ab/volumes" Mar 14 09:18:15 crc kubenswrapper[4886]: I0314 09:18:15.428718 4886 scope.go:117] "RemoveContainer" containerID="5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba" Mar 14 09:18:15 crc kubenswrapper[4886]: E0314 09:18:15.429535 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:18:28 crc kubenswrapper[4886]: I0314 09:18:28.421389 4886 scope.go:117] "RemoveContainer" containerID="5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba" Mar 14 09:18:28 crc kubenswrapper[4886]: E0314 09:18:28.422094 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:18:35 crc kubenswrapper[4886]: I0314 09:18:35.735589 4886 scope.go:117] "RemoveContainer" containerID="1218556cba3388dfd7e86e0c4238c0f5a69a54b0dc921f4c9f1b19431152bfd3" Mar 14 09:18:40 crc kubenswrapper[4886]: I0314 09:18:40.422321 4886 scope.go:117] "RemoveContainer" containerID="5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba" Mar 14 09:18:40 crc kubenswrapper[4886]: E0314 09:18:40.424186 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:18:54 crc kubenswrapper[4886]: I0314 09:18:54.421201 4886 scope.go:117] "RemoveContainer" 
containerID="5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba" Mar 14 09:18:54 crc kubenswrapper[4886]: E0314 09:18:54.421928 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:18:56 crc kubenswrapper[4886]: I0314 09:18:56.018575 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fqsw8"] Mar 14 09:18:56 crc kubenswrapper[4886]: E0314 09:18:56.019824 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf1dc598-0f17-412b-a931-b2b36433ac86" containerName="oc" Mar 14 09:18:56 crc kubenswrapper[4886]: I0314 09:18:56.019842 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf1dc598-0f17-412b-a931-b2b36433ac86" containerName="oc" Mar 14 09:18:56 crc kubenswrapper[4886]: I0314 09:18:56.020486 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf1dc598-0f17-412b-a931-b2b36433ac86" containerName="oc" Mar 14 09:18:56 crc kubenswrapper[4886]: I0314 09:18:56.022078 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fqsw8" Mar 14 09:18:56 crc kubenswrapper[4886]: I0314 09:18:56.035812 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fqsw8"] Mar 14 09:18:56 crc kubenswrapper[4886]: I0314 09:18:56.166996 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e0bad0-1e0c-4a21-9db6-1ec834974820-utilities\") pod \"redhat-operators-fqsw8\" (UID: \"a5e0bad0-1e0c-4a21-9db6-1ec834974820\") " pod="openshift-marketplace/redhat-operators-fqsw8" Mar 14 09:18:56 crc kubenswrapper[4886]: I0314 09:18:56.167092 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9zfn\" (UniqueName: \"kubernetes.io/projected/a5e0bad0-1e0c-4a21-9db6-1ec834974820-kube-api-access-b9zfn\") pod \"redhat-operators-fqsw8\" (UID: \"a5e0bad0-1e0c-4a21-9db6-1ec834974820\") " pod="openshift-marketplace/redhat-operators-fqsw8" Mar 14 09:18:56 crc kubenswrapper[4886]: I0314 09:18:56.167237 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e0bad0-1e0c-4a21-9db6-1ec834974820-catalog-content\") pod \"redhat-operators-fqsw8\" (UID: \"a5e0bad0-1e0c-4a21-9db6-1ec834974820\") " pod="openshift-marketplace/redhat-operators-fqsw8" Mar 14 09:18:56 crc kubenswrapper[4886]: I0314 09:18:56.269158 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e0bad0-1e0c-4a21-9db6-1ec834974820-utilities\") pod \"redhat-operators-fqsw8\" (UID: \"a5e0bad0-1e0c-4a21-9db6-1ec834974820\") " pod="openshift-marketplace/redhat-operators-fqsw8" Mar 14 09:18:56 crc kubenswrapper[4886]: I0314 09:18:56.269234 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-b9zfn\" (UniqueName: \"kubernetes.io/projected/a5e0bad0-1e0c-4a21-9db6-1ec834974820-kube-api-access-b9zfn\") pod \"redhat-operators-fqsw8\" (UID: \"a5e0bad0-1e0c-4a21-9db6-1ec834974820\") " pod="openshift-marketplace/redhat-operators-fqsw8" Mar 14 09:18:56 crc kubenswrapper[4886]: I0314 09:18:56.269300 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e0bad0-1e0c-4a21-9db6-1ec834974820-catalog-content\") pod \"redhat-operators-fqsw8\" (UID: \"a5e0bad0-1e0c-4a21-9db6-1ec834974820\") " pod="openshift-marketplace/redhat-operators-fqsw8" Mar 14 09:18:56 crc kubenswrapper[4886]: I0314 09:18:56.269884 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e0bad0-1e0c-4a21-9db6-1ec834974820-catalog-content\") pod \"redhat-operators-fqsw8\" (UID: \"a5e0bad0-1e0c-4a21-9db6-1ec834974820\") " pod="openshift-marketplace/redhat-operators-fqsw8" Mar 14 09:18:56 crc kubenswrapper[4886]: I0314 09:18:56.270035 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e0bad0-1e0c-4a21-9db6-1ec834974820-utilities\") pod \"redhat-operators-fqsw8\" (UID: \"a5e0bad0-1e0c-4a21-9db6-1ec834974820\") " pod="openshift-marketplace/redhat-operators-fqsw8" Mar 14 09:18:56 crc kubenswrapper[4886]: I0314 09:18:56.292065 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9zfn\" (UniqueName: \"kubernetes.io/projected/a5e0bad0-1e0c-4a21-9db6-1ec834974820-kube-api-access-b9zfn\") pod \"redhat-operators-fqsw8\" (UID: \"a5e0bad0-1e0c-4a21-9db6-1ec834974820\") " pod="openshift-marketplace/redhat-operators-fqsw8" Mar 14 09:18:56 crc kubenswrapper[4886]: I0314 09:18:56.362481 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fqsw8" Mar 14 09:18:56 crc kubenswrapper[4886]: I0314 09:18:56.864223 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fqsw8"] Mar 14 09:18:56 crc kubenswrapper[4886]: I0314 09:18:56.992189 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqsw8" event={"ID":"a5e0bad0-1e0c-4a21-9db6-1ec834974820","Type":"ContainerStarted","Data":"46297a1d0e07df3961a3fc72821d49999d0f7fdee72ad277806f839d4b7a9d78"} Mar 14 09:18:58 crc kubenswrapper[4886]: I0314 09:18:58.004614 4886 generic.go:334] "Generic (PLEG): container finished" podID="a5e0bad0-1e0c-4a21-9db6-1ec834974820" containerID="86b9a68ae27e974b79a26c67e0fe6a2d0801a4210dcd85f31bd303e785afd079" exitCode=0 Mar 14 09:18:58 crc kubenswrapper[4886]: I0314 09:18:58.004703 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqsw8" event={"ID":"a5e0bad0-1e0c-4a21-9db6-1ec834974820","Type":"ContainerDied","Data":"86b9a68ae27e974b79a26c67e0fe6a2d0801a4210dcd85f31bd303e785afd079"} Mar 14 09:18:59 crc kubenswrapper[4886]: I0314 09:18:59.018439 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqsw8" event={"ID":"a5e0bad0-1e0c-4a21-9db6-1ec834974820","Type":"ContainerStarted","Data":"aaf0f84b842b0bcf4413c8a15c69e45a17e36353b5e9542f2a343e71c1157de1"} Mar 14 09:19:04 crc kubenswrapper[4886]: I0314 09:19:04.073293 4886 generic.go:334] "Generic (PLEG): container finished" podID="a5e0bad0-1e0c-4a21-9db6-1ec834974820" containerID="aaf0f84b842b0bcf4413c8a15c69e45a17e36353b5e9542f2a343e71c1157de1" exitCode=0 Mar 14 09:19:04 crc kubenswrapper[4886]: I0314 09:19:04.073575 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqsw8" 
event={"ID":"a5e0bad0-1e0c-4a21-9db6-1ec834974820","Type":"ContainerDied","Data":"aaf0f84b842b0bcf4413c8a15c69e45a17e36353b5e9542f2a343e71c1157de1"} Mar 14 09:19:05 crc kubenswrapper[4886]: I0314 09:19:05.087424 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqsw8" event={"ID":"a5e0bad0-1e0c-4a21-9db6-1ec834974820","Type":"ContainerStarted","Data":"99c7548cc91532fab73ff858e33e41cf6822ea4fc695169b164324bd578cb6c3"} Mar 14 09:19:05 crc kubenswrapper[4886]: I0314 09:19:05.111625 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fqsw8" podStartSLOduration=3.6054505690000003 podStartE2EDuration="10.111610968s" podCreationTimestamp="2026-03-14 09:18:55 +0000 UTC" firstStartedPulling="2026-03-14 09:18:58.006898452 +0000 UTC m=+3073.255350099" lastFinishedPulling="2026-03-14 09:19:04.513058861 +0000 UTC m=+3079.761510498" observedRunningTime="2026-03-14 09:19:05.108579102 +0000 UTC m=+3080.357030739" watchObservedRunningTime="2026-03-14 09:19:05.111610968 +0000 UTC m=+3080.360062605" Mar 14 09:19:06 crc kubenswrapper[4886]: I0314 09:19:06.363552 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fqsw8" Mar 14 09:19:06 crc kubenswrapper[4886]: I0314 09:19:06.363935 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fqsw8" Mar 14 09:19:07 crc kubenswrapper[4886]: I0314 09:19:07.409583 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fqsw8" podUID="a5e0bad0-1e0c-4a21-9db6-1ec834974820" containerName="registry-server" probeResult="failure" output=< Mar 14 09:19:07 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s Mar 14 09:19:07 crc kubenswrapper[4886]: > Mar 14 09:19:08 crc kubenswrapper[4886]: I0314 09:19:08.420428 4886 scope.go:117] "RemoveContainer" 
containerID="5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba" Mar 14 09:19:08 crc kubenswrapper[4886]: E0314 09:19:08.420702 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:19:17 crc kubenswrapper[4886]: I0314 09:19:17.424317 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fqsw8" podUID="a5e0bad0-1e0c-4a21-9db6-1ec834974820" containerName="registry-server" probeResult="failure" output=< Mar 14 09:19:17 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s Mar 14 09:19:17 crc kubenswrapper[4886]: > Mar 14 09:19:23 crc kubenswrapper[4886]: I0314 09:19:23.420931 4886 scope.go:117] "RemoveContainer" containerID="5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba" Mar 14 09:19:23 crc kubenswrapper[4886]: E0314 09:19:23.421717 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:19:27 crc kubenswrapper[4886]: I0314 09:19:27.436270 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fqsw8" podUID="a5e0bad0-1e0c-4a21-9db6-1ec834974820" containerName="registry-server" probeResult="failure" output=< Mar 14 09:19:27 crc kubenswrapper[4886]: timeout: failed to 
connect service ":50051" within 1s Mar 14 09:19:27 crc kubenswrapper[4886]: > Mar 14 09:19:35 crc kubenswrapper[4886]: I0314 09:19:35.434047 4886 scope.go:117] "RemoveContainer" containerID="5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba" Mar 14 09:19:35 crc kubenswrapper[4886]: E0314 09:19:35.434979 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:19:36 crc kubenswrapper[4886]: I0314 09:19:36.432443 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fqsw8" Mar 14 09:19:36 crc kubenswrapper[4886]: I0314 09:19:36.489247 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fqsw8" Mar 14 09:19:36 crc kubenswrapper[4886]: I0314 09:19:36.682891 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fqsw8"] Mar 14 09:19:38 crc kubenswrapper[4886]: I0314 09:19:38.461204 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fqsw8" podUID="a5e0bad0-1e0c-4a21-9db6-1ec834974820" containerName="registry-server" containerID="cri-o://99c7548cc91532fab73ff858e33e41cf6822ea4fc695169b164324bd578cb6c3" gracePeriod=2 Mar 14 09:19:38 crc kubenswrapper[4886]: I0314 09:19:38.920365 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fqsw8" Mar 14 09:19:38 crc kubenswrapper[4886]: I0314 09:19:38.973238 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e0bad0-1e0c-4a21-9db6-1ec834974820-catalog-content\") pod \"a5e0bad0-1e0c-4a21-9db6-1ec834974820\" (UID: \"a5e0bad0-1e0c-4a21-9db6-1ec834974820\") " Mar 14 09:19:38 crc kubenswrapper[4886]: I0314 09:19:38.973485 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e0bad0-1e0c-4a21-9db6-1ec834974820-utilities\") pod \"a5e0bad0-1e0c-4a21-9db6-1ec834974820\" (UID: \"a5e0bad0-1e0c-4a21-9db6-1ec834974820\") " Mar 14 09:19:38 crc kubenswrapper[4886]: I0314 09:19:38.973628 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9zfn\" (UniqueName: \"kubernetes.io/projected/a5e0bad0-1e0c-4a21-9db6-1ec834974820-kube-api-access-b9zfn\") pod \"a5e0bad0-1e0c-4a21-9db6-1ec834974820\" (UID: \"a5e0bad0-1e0c-4a21-9db6-1ec834974820\") " Mar 14 09:19:38 crc kubenswrapper[4886]: I0314 09:19:38.974376 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5e0bad0-1e0c-4a21-9db6-1ec834974820-utilities" (OuterVolumeSpecName: "utilities") pod "a5e0bad0-1e0c-4a21-9db6-1ec834974820" (UID: "a5e0bad0-1e0c-4a21-9db6-1ec834974820"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:19:38 crc kubenswrapper[4886]: I0314 09:19:38.979830 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5e0bad0-1e0c-4a21-9db6-1ec834974820-kube-api-access-b9zfn" (OuterVolumeSpecName: "kube-api-access-b9zfn") pod "a5e0bad0-1e0c-4a21-9db6-1ec834974820" (UID: "a5e0bad0-1e0c-4a21-9db6-1ec834974820"). InnerVolumeSpecName "kube-api-access-b9zfn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:19:39 crc kubenswrapper[4886]: I0314 09:19:39.076823 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e0bad0-1e0c-4a21-9db6-1ec834974820-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:19:39 crc kubenswrapper[4886]: I0314 09:19:39.076877 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9zfn\" (UniqueName: \"kubernetes.io/projected/a5e0bad0-1e0c-4a21-9db6-1ec834974820-kube-api-access-b9zfn\") on node \"crc\" DevicePath \"\"" Mar 14 09:19:39 crc kubenswrapper[4886]: I0314 09:19:39.103619 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5e0bad0-1e0c-4a21-9db6-1ec834974820-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5e0bad0-1e0c-4a21-9db6-1ec834974820" (UID: "a5e0bad0-1e0c-4a21-9db6-1ec834974820"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:19:39 crc kubenswrapper[4886]: I0314 09:19:39.179265 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e0bad0-1e0c-4a21-9db6-1ec834974820-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:19:39 crc kubenswrapper[4886]: I0314 09:19:39.476744 4886 generic.go:334] "Generic (PLEG): container finished" podID="a5e0bad0-1e0c-4a21-9db6-1ec834974820" containerID="99c7548cc91532fab73ff858e33e41cf6822ea4fc695169b164324bd578cb6c3" exitCode=0 Mar 14 09:19:39 crc kubenswrapper[4886]: I0314 09:19:39.476791 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqsw8" event={"ID":"a5e0bad0-1e0c-4a21-9db6-1ec834974820","Type":"ContainerDied","Data":"99c7548cc91532fab73ff858e33e41cf6822ea4fc695169b164324bd578cb6c3"} Mar 14 09:19:39 crc kubenswrapper[4886]: I0314 09:19:39.476833 4886 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fqsw8" Mar 14 09:19:39 crc kubenswrapper[4886]: I0314 09:19:39.476841 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqsw8" event={"ID":"a5e0bad0-1e0c-4a21-9db6-1ec834974820","Type":"ContainerDied","Data":"46297a1d0e07df3961a3fc72821d49999d0f7fdee72ad277806f839d4b7a9d78"} Mar 14 09:19:39 crc kubenswrapper[4886]: I0314 09:19:39.476882 4886 scope.go:117] "RemoveContainer" containerID="99c7548cc91532fab73ff858e33e41cf6822ea4fc695169b164324bd578cb6c3" Mar 14 09:19:39 crc kubenswrapper[4886]: I0314 09:19:39.508169 4886 scope.go:117] "RemoveContainer" containerID="aaf0f84b842b0bcf4413c8a15c69e45a17e36353b5e9542f2a343e71c1157de1" Mar 14 09:19:39 crc kubenswrapper[4886]: I0314 09:19:39.519320 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fqsw8"] Mar 14 09:19:39 crc kubenswrapper[4886]: I0314 09:19:39.537031 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fqsw8"] Mar 14 09:19:39 crc kubenswrapper[4886]: I0314 09:19:39.541299 4886 scope.go:117] "RemoveContainer" containerID="86b9a68ae27e974b79a26c67e0fe6a2d0801a4210dcd85f31bd303e785afd079" Mar 14 09:19:39 crc kubenswrapper[4886]: I0314 09:19:39.599820 4886 scope.go:117] "RemoveContainer" containerID="99c7548cc91532fab73ff858e33e41cf6822ea4fc695169b164324bd578cb6c3" Mar 14 09:19:39 crc kubenswrapper[4886]: E0314 09:19:39.600259 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99c7548cc91532fab73ff858e33e41cf6822ea4fc695169b164324bd578cb6c3\": container with ID starting with 99c7548cc91532fab73ff858e33e41cf6822ea4fc695169b164324bd578cb6c3 not found: ID does not exist" containerID="99c7548cc91532fab73ff858e33e41cf6822ea4fc695169b164324bd578cb6c3" Mar 14 09:19:39 crc kubenswrapper[4886]: I0314 09:19:39.600290 4886 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99c7548cc91532fab73ff858e33e41cf6822ea4fc695169b164324bd578cb6c3"} err="failed to get container status \"99c7548cc91532fab73ff858e33e41cf6822ea4fc695169b164324bd578cb6c3\": rpc error: code = NotFound desc = could not find container \"99c7548cc91532fab73ff858e33e41cf6822ea4fc695169b164324bd578cb6c3\": container with ID starting with 99c7548cc91532fab73ff858e33e41cf6822ea4fc695169b164324bd578cb6c3 not found: ID does not exist" Mar 14 09:19:39 crc kubenswrapper[4886]: I0314 09:19:39.600312 4886 scope.go:117] "RemoveContainer" containerID="aaf0f84b842b0bcf4413c8a15c69e45a17e36353b5e9542f2a343e71c1157de1" Mar 14 09:19:39 crc kubenswrapper[4886]: E0314 09:19:39.600681 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaf0f84b842b0bcf4413c8a15c69e45a17e36353b5e9542f2a343e71c1157de1\": container with ID starting with aaf0f84b842b0bcf4413c8a15c69e45a17e36353b5e9542f2a343e71c1157de1 not found: ID does not exist" containerID="aaf0f84b842b0bcf4413c8a15c69e45a17e36353b5e9542f2a343e71c1157de1" Mar 14 09:19:39 crc kubenswrapper[4886]: I0314 09:19:39.600836 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaf0f84b842b0bcf4413c8a15c69e45a17e36353b5e9542f2a343e71c1157de1"} err="failed to get container status \"aaf0f84b842b0bcf4413c8a15c69e45a17e36353b5e9542f2a343e71c1157de1\": rpc error: code = NotFound desc = could not find container \"aaf0f84b842b0bcf4413c8a15c69e45a17e36353b5e9542f2a343e71c1157de1\": container with ID starting with aaf0f84b842b0bcf4413c8a15c69e45a17e36353b5e9542f2a343e71c1157de1 not found: ID does not exist" Mar 14 09:19:39 crc kubenswrapper[4886]: I0314 09:19:39.600973 4886 scope.go:117] "RemoveContainer" containerID="86b9a68ae27e974b79a26c67e0fe6a2d0801a4210dcd85f31bd303e785afd079" Mar 14 09:19:39 crc kubenswrapper[4886]: E0314 
09:19:39.601420 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86b9a68ae27e974b79a26c67e0fe6a2d0801a4210dcd85f31bd303e785afd079\": container with ID starting with 86b9a68ae27e974b79a26c67e0fe6a2d0801a4210dcd85f31bd303e785afd079 not found: ID does not exist" containerID="86b9a68ae27e974b79a26c67e0fe6a2d0801a4210dcd85f31bd303e785afd079" Mar 14 09:19:39 crc kubenswrapper[4886]: I0314 09:19:39.601449 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86b9a68ae27e974b79a26c67e0fe6a2d0801a4210dcd85f31bd303e785afd079"} err="failed to get container status \"86b9a68ae27e974b79a26c67e0fe6a2d0801a4210dcd85f31bd303e785afd079\": rpc error: code = NotFound desc = could not find container \"86b9a68ae27e974b79a26c67e0fe6a2d0801a4210dcd85f31bd303e785afd079\": container with ID starting with 86b9a68ae27e974b79a26c67e0fe6a2d0801a4210dcd85f31bd303e785afd079 not found: ID does not exist" Mar 14 09:19:41 crc kubenswrapper[4886]: I0314 09:19:41.472694 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5e0bad0-1e0c-4a21-9db6-1ec834974820" path="/var/lib/kubelet/pods/a5e0bad0-1e0c-4a21-9db6-1ec834974820/volumes" Mar 14 09:19:50 crc kubenswrapper[4886]: I0314 09:19:50.421392 4886 scope.go:117] "RemoveContainer" containerID="5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba" Mar 14 09:19:50 crc kubenswrapper[4886]: E0314 09:19:50.423349 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:20:00 crc kubenswrapper[4886]: I0314 09:20:00.171002 
4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558000-ck7f2"] Mar 14 09:20:00 crc kubenswrapper[4886]: E0314 09:20:00.173771 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e0bad0-1e0c-4a21-9db6-1ec834974820" containerName="registry-server" Mar 14 09:20:00 crc kubenswrapper[4886]: I0314 09:20:00.173810 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e0bad0-1e0c-4a21-9db6-1ec834974820" containerName="registry-server" Mar 14 09:20:00 crc kubenswrapper[4886]: E0314 09:20:00.173834 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e0bad0-1e0c-4a21-9db6-1ec834974820" containerName="extract-content" Mar 14 09:20:00 crc kubenswrapper[4886]: I0314 09:20:00.173847 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e0bad0-1e0c-4a21-9db6-1ec834974820" containerName="extract-content" Mar 14 09:20:00 crc kubenswrapper[4886]: E0314 09:20:00.173905 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e0bad0-1e0c-4a21-9db6-1ec834974820" containerName="extract-utilities" Mar 14 09:20:00 crc kubenswrapper[4886]: I0314 09:20:00.173917 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e0bad0-1e0c-4a21-9db6-1ec834974820" containerName="extract-utilities" Mar 14 09:20:00 crc kubenswrapper[4886]: I0314 09:20:00.174304 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5e0bad0-1e0c-4a21-9db6-1ec834974820" containerName="registry-server" Mar 14 09:20:00 crc kubenswrapper[4886]: I0314 09:20:00.175894 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558000-ck7f2" Mar 14 09:20:00 crc kubenswrapper[4886]: I0314 09:20:00.179508 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 09:20:00 crc kubenswrapper[4886]: I0314 09:20:00.179612 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:20:00 crc kubenswrapper[4886]: I0314 09:20:00.179637 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:20:00 crc kubenswrapper[4886]: I0314 09:20:00.188421 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558000-ck7f2"] Mar 14 09:20:00 crc kubenswrapper[4886]: I0314 09:20:00.266053 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdq5z\" (UniqueName: \"kubernetes.io/projected/21cfad98-0ad0-471a-bad2-23a600a2f434-kube-api-access-pdq5z\") pod \"auto-csr-approver-29558000-ck7f2\" (UID: \"21cfad98-0ad0-471a-bad2-23a600a2f434\") " pod="openshift-infra/auto-csr-approver-29558000-ck7f2" Mar 14 09:20:00 crc kubenswrapper[4886]: I0314 09:20:00.368577 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdq5z\" (UniqueName: \"kubernetes.io/projected/21cfad98-0ad0-471a-bad2-23a600a2f434-kube-api-access-pdq5z\") pod \"auto-csr-approver-29558000-ck7f2\" (UID: \"21cfad98-0ad0-471a-bad2-23a600a2f434\") " pod="openshift-infra/auto-csr-approver-29558000-ck7f2" Mar 14 09:20:00 crc kubenswrapper[4886]: I0314 09:20:00.406810 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdq5z\" (UniqueName: \"kubernetes.io/projected/21cfad98-0ad0-471a-bad2-23a600a2f434-kube-api-access-pdq5z\") pod \"auto-csr-approver-29558000-ck7f2\" (UID: \"21cfad98-0ad0-471a-bad2-23a600a2f434\") " 
pod="openshift-infra/auto-csr-approver-29558000-ck7f2" Mar 14 09:20:00 crc kubenswrapper[4886]: I0314 09:20:00.500410 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558000-ck7f2" Mar 14 09:20:01 crc kubenswrapper[4886]: I0314 09:20:01.002107 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558000-ck7f2"] Mar 14 09:20:01 crc kubenswrapper[4886]: I0314 09:20:01.420754 4886 scope.go:117] "RemoveContainer" containerID="5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba" Mar 14 09:20:01 crc kubenswrapper[4886]: E0314 09:20:01.421039 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:20:01 crc kubenswrapper[4886]: I0314 09:20:01.721098 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558000-ck7f2" event={"ID":"21cfad98-0ad0-471a-bad2-23a600a2f434","Type":"ContainerStarted","Data":"3cdf8e4efeccf2d155b5a3d8a7fe102e945caba7e68f4431f1be9cdd54bd6aa5"} Mar 14 09:20:02 crc kubenswrapper[4886]: I0314 09:20:02.733678 4886 generic.go:334] "Generic (PLEG): container finished" podID="21cfad98-0ad0-471a-bad2-23a600a2f434" containerID="fe451428c85793e7b9a29a42c1ab49c8425d6d77813a5287b0b77a2fd9e55c5b" exitCode=0 Mar 14 09:20:02 crc kubenswrapper[4886]: I0314 09:20:02.733765 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558000-ck7f2" event={"ID":"21cfad98-0ad0-471a-bad2-23a600a2f434","Type":"ContainerDied","Data":"fe451428c85793e7b9a29a42c1ab49c8425d6d77813a5287b0b77a2fd9e55c5b"} 
Mar 14 09:20:04 crc kubenswrapper[4886]: I0314 09:20:04.142794 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558000-ck7f2" Mar 14 09:20:04 crc kubenswrapper[4886]: I0314 09:20:04.286965 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdq5z\" (UniqueName: \"kubernetes.io/projected/21cfad98-0ad0-471a-bad2-23a600a2f434-kube-api-access-pdq5z\") pod \"21cfad98-0ad0-471a-bad2-23a600a2f434\" (UID: \"21cfad98-0ad0-471a-bad2-23a600a2f434\") " Mar 14 09:20:04 crc kubenswrapper[4886]: I0314 09:20:04.294440 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21cfad98-0ad0-471a-bad2-23a600a2f434-kube-api-access-pdq5z" (OuterVolumeSpecName: "kube-api-access-pdq5z") pod "21cfad98-0ad0-471a-bad2-23a600a2f434" (UID: "21cfad98-0ad0-471a-bad2-23a600a2f434"). InnerVolumeSpecName "kube-api-access-pdq5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:20:04 crc kubenswrapper[4886]: I0314 09:20:04.389483 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdq5z\" (UniqueName: \"kubernetes.io/projected/21cfad98-0ad0-471a-bad2-23a600a2f434-kube-api-access-pdq5z\") on node \"crc\" DevicePath \"\"" Mar 14 09:20:04 crc kubenswrapper[4886]: I0314 09:20:04.657497 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gm7d2"] Mar 14 09:20:04 crc kubenswrapper[4886]: E0314 09:20:04.658310 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21cfad98-0ad0-471a-bad2-23a600a2f434" containerName="oc" Mar 14 09:20:04 crc kubenswrapper[4886]: I0314 09:20:04.658343 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="21cfad98-0ad0-471a-bad2-23a600a2f434" containerName="oc" Mar 14 09:20:04 crc kubenswrapper[4886]: I0314 09:20:04.658704 4886 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="21cfad98-0ad0-471a-bad2-23a600a2f434" containerName="oc" Mar 14 09:20:04 crc kubenswrapper[4886]: I0314 09:20:04.661255 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gm7d2" Mar 14 09:20:04 crc kubenswrapper[4886]: I0314 09:20:04.691542 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gm7d2"] Mar 14 09:20:04 crc kubenswrapper[4886]: I0314 09:20:04.779480 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558000-ck7f2" event={"ID":"21cfad98-0ad0-471a-bad2-23a600a2f434","Type":"ContainerDied","Data":"3cdf8e4efeccf2d155b5a3d8a7fe102e945caba7e68f4431f1be9cdd54bd6aa5"} Mar 14 09:20:04 crc kubenswrapper[4886]: I0314 09:20:04.779850 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cdf8e4efeccf2d155b5a3d8a7fe102e945caba7e68f4431f1be9cdd54bd6aa5" Mar 14 09:20:04 crc kubenswrapper[4886]: I0314 09:20:04.779557 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558000-ck7f2" Mar 14 09:20:04 crc kubenswrapper[4886]: I0314 09:20:04.797087 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb994644-eecf-44bf-928d-7196a6724903-utilities\") pod \"community-operators-gm7d2\" (UID: \"cb994644-eecf-44bf-928d-7196a6724903\") " pod="openshift-marketplace/community-operators-gm7d2" Mar 14 09:20:04 crc kubenswrapper[4886]: I0314 09:20:04.797264 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb994644-eecf-44bf-928d-7196a6724903-catalog-content\") pod \"community-operators-gm7d2\" (UID: \"cb994644-eecf-44bf-928d-7196a6724903\") " pod="openshift-marketplace/community-operators-gm7d2" Mar 14 09:20:04 crc kubenswrapper[4886]: I0314 09:20:04.797400 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8khkp\" (UniqueName: \"kubernetes.io/projected/cb994644-eecf-44bf-928d-7196a6724903-kube-api-access-8khkp\") pod \"community-operators-gm7d2\" (UID: \"cb994644-eecf-44bf-928d-7196a6724903\") " pod="openshift-marketplace/community-operators-gm7d2" Mar 14 09:20:04 crc kubenswrapper[4886]: I0314 09:20:04.899558 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8khkp\" (UniqueName: \"kubernetes.io/projected/cb994644-eecf-44bf-928d-7196a6724903-kube-api-access-8khkp\") pod \"community-operators-gm7d2\" (UID: \"cb994644-eecf-44bf-928d-7196a6724903\") " pod="openshift-marketplace/community-operators-gm7d2" Mar 14 09:20:04 crc kubenswrapper[4886]: I0314 09:20:04.899631 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb994644-eecf-44bf-928d-7196a6724903-utilities\") pod 
\"community-operators-gm7d2\" (UID: \"cb994644-eecf-44bf-928d-7196a6724903\") " pod="openshift-marketplace/community-operators-gm7d2" Mar 14 09:20:04 crc kubenswrapper[4886]: I0314 09:20:04.899746 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb994644-eecf-44bf-928d-7196a6724903-catalog-content\") pod \"community-operators-gm7d2\" (UID: \"cb994644-eecf-44bf-928d-7196a6724903\") " pod="openshift-marketplace/community-operators-gm7d2" Mar 14 09:20:04 crc kubenswrapper[4886]: I0314 09:20:04.900361 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb994644-eecf-44bf-928d-7196a6724903-utilities\") pod \"community-operators-gm7d2\" (UID: \"cb994644-eecf-44bf-928d-7196a6724903\") " pod="openshift-marketplace/community-operators-gm7d2" Mar 14 09:20:04 crc kubenswrapper[4886]: I0314 09:20:04.900457 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb994644-eecf-44bf-928d-7196a6724903-catalog-content\") pod \"community-operators-gm7d2\" (UID: \"cb994644-eecf-44bf-928d-7196a6724903\") " pod="openshift-marketplace/community-operators-gm7d2" Mar 14 09:20:04 crc kubenswrapper[4886]: I0314 09:20:04.924458 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8khkp\" (UniqueName: \"kubernetes.io/projected/cb994644-eecf-44bf-928d-7196a6724903-kube-api-access-8khkp\") pod \"community-operators-gm7d2\" (UID: \"cb994644-eecf-44bf-928d-7196a6724903\") " pod="openshift-marketplace/community-operators-gm7d2" Mar 14 09:20:04 crc kubenswrapper[4886]: I0314 09:20:04.987163 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gm7d2" Mar 14 09:20:05 crc kubenswrapper[4886]: I0314 09:20:05.227734 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557994-qtxzf"] Mar 14 09:20:05 crc kubenswrapper[4886]: I0314 09:20:05.236421 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557994-qtxzf"] Mar 14 09:20:05 crc kubenswrapper[4886]: I0314 09:20:05.433513 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9f7a6b6-9185-4d28-9187-dbbd9819ba61" path="/var/lib/kubelet/pods/e9f7a6b6-9185-4d28-9187-dbbd9819ba61/volumes" Mar 14 09:20:05 crc kubenswrapper[4886]: I0314 09:20:05.491572 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gm7d2"] Mar 14 09:20:05 crc kubenswrapper[4886]: I0314 09:20:05.792008 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gm7d2" event={"ID":"cb994644-eecf-44bf-928d-7196a6724903","Type":"ContainerDied","Data":"b6636e6f3a8c14db2f293a3e48342184a80845d0d885d151922067b671637185"} Mar 14 09:20:05 crc kubenswrapper[4886]: I0314 09:20:05.791927 4886 generic.go:334] "Generic (PLEG): container finished" podID="cb994644-eecf-44bf-928d-7196a6724903" containerID="b6636e6f3a8c14db2f293a3e48342184a80845d0d885d151922067b671637185" exitCode=0 Mar 14 09:20:05 crc kubenswrapper[4886]: I0314 09:20:05.792623 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gm7d2" event={"ID":"cb994644-eecf-44bf-928d-7196a6724903","Type":"ContainerStarted","Data":"6c05996177fb994333f7e24afed1f9534448c5f8b9098da3a029f16c163c9b00"} Mar 14 09:20:06 crc kubenswrapper[4886]: I0314 09:20:06.809804 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gm7d2" 
event={"ID":"cb994644-eecf-44bf-928d-7196a6724903","Type":"ContainerStarted","Data":"dbfbcddf00f916e92b849d2b2a251a95e873730fb2641f9780282e59d70c4f19"} Mar 14 09:20:08 crc kubenswrapper[4886]: I0314 09:20:08.839209 4886 generic.go:334] "Generic (PLEG): container finished" podID="cb994644-eecf-44bf-928d-7196a6724903" containerID="dbfbcddf00f916e92b849d2b2a251a95e873730fb2641f9780282e59d70c4f19" exitCode=0 Mar 14 09:20:08 crc kubenswrapper[4886]: I0314 09:20:08.839320 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gm7d2" event={"ID":"cb994644-eecf-44bf-928d-7196a6724903","Type":"ContainerDied","Data":"dbfbcddf00f916e92b849d2b2a251a95e873730fb2641f9780282e59d70c4f19"} Mar 14 09:20:09 crc kubenswrapper[4886]: I0314 09:20:09.852568 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gm7d2" event={"ID":"cb994644-eecf-44bf-928d-7196a6724903","Type":"ContainerStarted","Data":"f2ca73f4af060c023202f89f99e08e6d9a3a52c6312e21c107107868e0f7f6fe"} Mar 14 09:20:09 crc kubenswrapper[4886]: I0314 09:20:09.912786 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gm7d2" podStartSLOduration=2.2170535510000002 podStartE2EDuration="5.912767681s" podCreationTimestamp="2026-03-14 09:20:04 +0000 UTC" firstStartedPulling="2026-03-14 09:20:05.79515053 +0000 UTC m=+3141.043602167" lastFinishedPulling="2026-03-14 09:20:09.49086465 +0000 UTC m=+3144.739316297" observedRunningTime="2026-03-14 09:20:09.906399629 +0000 UTC m=+3145.154851266" watchObservedRunningTime="2026-03-14 09:20:09.912767681 +0000 UTC m=+3145.161219318" Mar 14 09:20:14 crc kubenswrapper[4886]: I0314 09:20:14.988444 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gm7d2" Mar 14 09:20:14 crc kubenswrapper[4886]: I0314 09:20:14.989032 4886 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-gm7d2" Mar 14 09:20:15 crc kubenswrapper[4886]: I0314 09:20:15.058676 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gm7d2" Mar 14 09:20:15 crc kubenswrapper[4886]: I0314 09:20:15.433474 4886 scope.go:117] "RemoveContainer" containerID="5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba" Mar 14 09:20:15 crc kubenswrapper[4886]: E0314 09:20:15.433781 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:20:15 crc kubenswrapper[4886]: I0314 09:20:15.982924 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gm7d2" Mar 14 09:20:16 crc kubenswrapper[4886]: I0314 09:20:16.039956 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gm7d2"] Mar 14 09:20:17 crc kubenswrapper[4886]: I0314 09:20:17.933442 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gm7d2" podUID="cb994644-eecf-44bf-928d-7196a6724903" containerName="registry-server" containerID="cri-o://f2ca73f4af060c023202f89f99e08e6d9a3a52c6312e21c107107868e0f7f6fe" gracePeriod=2 Mar 14 09:20:18 crc kubenswrapper[4886]: E0314 09:20:18.287594 4886 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb994644_eecf_44bf_928d_7196a6724903.slice/crio-conmon-f2ca73f4af060c023202f89f99e08e6d9a3a52c6312e21c107107868e0f7f6fe.scope\": RecentStats: unable to find data in memory cache]" Mar 14 09:20:18 crc kubenswrapper[4886]: I0314 09:20:18.675798 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gm7d2" Mar 14 09:20:18 crc kubenswrapper[4886]: I0314 09:20:18.819018 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb994644-eecf-44bf-928d-7196a6724903-utilities\") pod \"cb994644-eecf-44bf-928d-7196a6724903\" (UID: \"cb994644-eecf-44bf-928d-7196a6724903\") " Mar 14 09:20:18 crc kubenswrapper[4886]: I0314 09:20:18.819289 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8khkp\" (UniqueName: \"kubernetes.io/projected/cb994644-eecf-44bf-928d-7196a6724903-kube-api-access-8khkp\") pod \"cb994644-eecf-44bf-928d-7196a6724903\" (UID: \"cb994644-eecf-44bf-928d-7196a6724903\") " Mar 14 09:20:18 crc kubenswrapper[4886]: I0314 09:20:18.819322 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb994644-eecf-44bf-928d-7196a6724903-catalog-content\") pod \"cb994644-eecf-44bf-928d-7196a6724903\" (UID: \"cb994644-eecf-44bf-928d-7196a6724903\") " Mar 14 09:20:18 crc kubenswrapper[4886]: I0314 09:20:18.819787 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb994644-eecf-44bf-928d-7196a6724903-utilities" (OuterVolumeSpecName: "utilities") pod "cb994644-eecf-44bf-928d-7196a6724903" (UID: "cb994644-eecf-44bf-928d-7196a6724903"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:20:18 crc kubenswrapper[4886]: I0314 09:20:18.832489 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb994644-eecf-44bf-928d-7196a6724903-kube-api-access-8khkp" (OuterVolumeSpecName: "kube-api-access-8khkp") pod "cb994644-eecf-44bf-928d-7196a6724903" (UID: "cb994644-eecf-44bf-928d-7196a6724903"). InnerVolumeSpecName "kube-api-access-8khkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:20:18 crc kubenswrapper[4886]: I0314 09:20:18.871065 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb994644-eecf-44bf-928d-7196a6724903-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb994644-eecf-44bf-928d-7196a6724903" (UID: "cb994644-eecf-44bf-928d-7196a6724903"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:20:18 crc kubenswrapper[4886]: I0314 09:20:18.922338 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb994644-eecf-44bf-928d-7196a6724903-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:20:18 crc kubenswrapper[4886]: I0314 09:20:18.922388 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8khkp\" (UniqueName: \"kubernetes.io/projected/cb994644-eecf-44bf-928d-7196a6724903-kube-api-access-8khkp\") on node \"crc\" DevicePath \"\"" Mar 14 09:20:18 crc kubenswrapper[4886]: I0314 09:20:18.922403 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb994644-eecf-44bf-928d-7196a6724903-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:20:18 crc kubenswrapper[4886]: I0314 09:20:18.945912 4886 generic.go:334] "Generic (PLEG): container finished" podID="cb994644-eecf-44bf-928d-7196a6724903" 
containerID="f2ca73f4af060c023202f89f99e08e6d9a3a52c6312e21c107107868e0f7f6fe" exitCode=0 Mar 14 09:20:18 crc kubenswrapper[4886]: I0314 09:20:18.945993 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gm7d2" event={"ID":"cb994644-eecf-44bf-928d-7196a6724903","Type":"ContainerDied","Data":"f2ca73f4af060c023202f89f99e08e6d9a3a52c6312e21c107107868e0f7f6fe"} Mar 14 09:20:18 crc kubenswrapper[4886]: I0314 09:20:18.946054 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gm7d2" Mar 14 09:20:18 crc kubenswrapper[4886]: I0314 09:20:18.946295 4886 scope.go:117] "RemoveContainer" containerID="f2ca73f4af060c023202f89f99e08e6d9a3a52c6312e21c107107868e0f7f6fe" Mar 14 09:20:18 crc kubenswrapper[4886]: I0314 09:20:18.946275 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gm7d2" event={"ID":"cb994644-eecf-44bf-928d-7196a6724903","Type":"ContainerDied","Data":"6c05996177fb994333f7e24afed1f9534448c5f8b9098da3a029f16c163c9b00"} Mar 14 09:20:18 crc kubenswrapper[4886]: I0314 09:20:18.966020 4886 scope.go:117] "RemoveContainer" containerID="dbfbcddf00f916e92b849d2b2a251a95e873730fb2641f9780282e59d70c4f19" Mar 14 09:20:18 crc kubenswrapper[4886]: I0314 09:20:18.992911 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gm7d2"] Mar 14 09:20:18 crc kubenswrapper[4886]: I0314 09:20:18.996662 4886 scope.go:117] "RemoveContainer" containerID="b6636e6f3a8c14db2f293a3e48342184a80845d0d885d151922067b671637185" Mar 14 09:20:19 crc kubenswrapper[4886]: I0314 09:20:19.003227 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gm7d2"] Mar 14 09:20:19 crc kubenswrapper[4886]: I0314 09:20:19.036491 4886 scope.go:117] "RemoveContainer" containerID="f2ca73f4af060c023202f89f99e08e6d9a3a52c6312e21c107107868e0f7f6fe" Mar 14 
09:20:19 crc kubenswrapper[4886]: E0314 09:20:19.036999 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2ca73f4af060c023202f89f99e08e6d9a3a52c6312e21c107107868e0f7f6fe\": container with ID starting with f2ca73f4af060c023202f89f99e08e6d9a3a52c6312e21c107107868e0f7f6fe not found: ID does not exist" containerID="f2ca73f4af060c023202f89f99e08e6d9a3a52c6312e21c107107868e0f7f6fe" Mar 14 09:20:19 crc kubenswrapper[4886]: I0314 09:20:19.037035 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2ca73f4af060c023202f89f99e08e6d9a3a52c6312e21c107107868e0f7f6fe"} err="failed to get container status \"f2ca73f4af060c023202f89f99e08e6d9a3a52c6312e21c107107868e0f7f6fe\": rpc error: code = NotFound desc = could not find container \"f2ca73f4af060c023202f89f99e08e6d9a3a52c6312e21c107107868e0f7f6fe\": container with ID starting with f2ca73f4af060c023202f89f99e08e6d9a3a52c6312e21c107107868e0f7f6fe not found: ID does not exist" Mar 14 09:20:19 crc kubenswrapper[4886]: I0314 09:20:19.037059 4886 scope.go:117] "RemoveContainer" containerID="dbfbcddf00f916e92b849d2b2a251a95e873730fb2641f9780282e59d70c4f19" Mar 14 09:20:19 crc kubenswrapper[4886]: E0314 09:20:19.037451 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbfbcddf00f916e92b849d2b2a251a95e873730fb2641f9780282e59d70c4f19\": container with ID starting with dbfbcddf00f916e92b849d2b2a251a95e873730fb2641f9780282e59d70c4f19 not found: ID does not exist" containerID="dbfbcddf00f916e92b849d2b2a251a95e873730fb2641f9780282e59d70c4f19" Mar 14 09:20:19 crc kubenswrapper[4886]: I0314 09:20:19.037471 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbfbcddf00f916e92b849d2b2a251a95e873730fb2641f9780282e59d70c4f19"} err="failed to get container status 
\"dbfbcddf00f916e92b849d2b2a251a95e873730fb2641f9780282e59d70c4f19\": rpc error: code = NotFound desc = could not find container \"dbfbcddf00f916e92b849d2b2a251a95e873730fb2641f9780282e59d70c4f19\": container with ID starting with dbfbcddf00f916e92b849d2b2a251a95e873730fb2641f9780282e59d70c4f19 not found: ID does not exist" Mar 14 09:20:19 crc kubenswrapper[4886]: I0314 09:20:19.037483 4886 scope.go:117] "RemoveContainer" containerID="b6636e6f3a8c14db2f293a3e48342184a80845d0d885d151922067b671637185" Mar 14 09:20:19 crc kubenswrapper[4886]: E0314 09:20:19.037848 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6636e6f3a8c14db2f293a3e48342184a80845d0d885d151922067b671637185\": container with ID starting with b6636e6f3a8c14db2f293a3e48342184a80845d0d885d151922067b671637185 not found: ID does not exist" containerID="b6636e6f3a8c14db2f293a3e48342184a80845d0d885d151922067b671637185" Mar 14 09:20:19 crc kubenswrapper[4886]: I0314 09:20:19.037867 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6636e6f3a8c14db2f293a3e48342184a80845d0d885d151922067b671637185"} err="failed to get container status \"b6636e6f3a8c14db2f293a3e48342184a80845d0d885d151922067b671637185\": rpc error: code = NotFound desc = could not find container \"b6636e6f3a8c14db2f293a3e48342184a80845d0d885d151922067b671637185\": container with ID starting with b6636e6f3a8c14db2f293a3e48342184a80845d0d885d151922067b671637185 not found: ID does not exist" Mar 14 09:20:19 crc kubenswrapper[4886]: I0314 09:20:19.433782 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb994644-eecf-44bf-928d-7196a6724903" path="/var/lib/kubelet/pods/cb994644-eecf-44bf-928d-7196a6724903/volumes" Mar 14 09:20:27 crc kubenswrapper[4886]: I0314 09:20:27.421522 4886 scope.go:117] "RemoveContainer" containerID="5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba" Mar 14 
09:20:27 crc kubenswrapper[4886]: E0314 09:20:27.422291 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:20:35 crc kubenswrapper[4886]: I0314 09:20:35.837348 4886 scope.go:117] "RemoveContainer" containerID="5d19ac2d4696fcc4a3165800bdfcd53c9ff24e366c7e18ca1ccbd4cde1119358" Mar 14 09:20:38 crc kubenswrapper[4886]: I0314 09:20:38.420943 4886 scope.go:117] "RemoveContainer" containerID="5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba" Mar 14 09:20:38 crc kubenswrapper[4886]: E0314 09:20:38.422272 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:20:51 crc kubenswrapper[4886]: I0314 09:20:51.421614 4886 scope.go:117] "RemoveContainer" containerID="5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba" Mar 14 09:20:51 crc kubenswrapper[4886]: E0314 09:20:51.422810 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" 
podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:21:04 crc kubenswrapper[4886]: I0314 09:21:04.420785 4886 scope.go:117] "RemoveContainer" containerID="5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba" Mar 14 09:21:04 crc kubenswrapper[4886]: E0314 09:21:04.421491 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:21:15 crc kubenswrapper[4886]: I0314 09:21:15.448982 4886 scope.go:117] "RemoveContainer" containerID="5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba" Mar 14 09:21:15 crc kubenswrapper[4886]: E0314 09:21:15.449971 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:21:27 crc kubenswrapper[4886]: I0314 09:21:27.212351 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-54lvd" podUID="a1dc9df2-9dd0-40af-9508-b65d1047b045" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 14 09:21:27 crc kubenswrapper[4886]: I0314 09:21:27.420616 4886 scope.go:117] "RemoveContainer" containerID="5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba" Mar 14 09:21:27 crc kubenswrapper[4886]: E0314 09:21:27.422180 4886 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:21:38 crc kubenswrapper[4886]: I0314 09:21:38.420860 4886 scope.go:117] "RemoveContainer" containerID="5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba" Mar 14 09:21:38 crc kubenswrapper[4886]: E0314 09:21:38.421672 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:21:53 crc kubenswrapper[4886]: I0314 09:21:53.421312 4886 scope.go:117] "RemoveContainer" containerID="5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba" Mar 14 09:21:53 crc kubenswrapper[4886]: E0314 09:21:53.422095 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:22:00 crc kubenswrapper[4886]: I0314 09:22:00.173655 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558002-gk6tg"] Mar 14 09:22:00 crc kubenswrapper[4886]: E0314 
09:22:00.174602 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb994644-eecf-44bf-928d-7196a6724903" containerName="extract-utilities" Mar 14 09:22:00 crc kubenswrapper[4886]: I0314 09:22:00.174618 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb994644-eecf-44bf-928d-7196a6724903" containerName="extract-utilities" Mar 14 09:22:00 crc kubenswrapper[4886]: E0314 09:22:00.174643 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb994644-eecf-44bf-928d-7196a6724903" containerName="extract-content" Mar 14 09:22:00 crc kubenswrapper[4886]: I0314 09:22:00.174649 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb994644-eecf-44bf-928d-7196a6724903" containerName="extract-content" Mar 14 09:22:00 crc kubenswrapper[4886]: E0314 09:22:00.174673 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb994644-eecf-44bf-928d-7196a6724903" containerName="registry-server" Mar 14 09:22:00 crc kubenswrapper[4886]: I0314 09:22:00.174679 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb994644-eecf-44bf-928d-7196a6724903" containerName="registry-server" Mar 14 09:22:00 crc kubenswrapper[4886]: I0314 09:22:00.174877 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb994644-eecf-44bf-928d-7196a6724903" containerName="registry-server" Mar 14 09:22:00 crc kubenswrapper[4886]: I0314 09:22:00.175630 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558002-gk6tg" Mar 14 09:22:00 crc kubenswrapper[4886]: I0314 09:22:00.178377 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:22:00 crc kubenswrapper[4886]: I0314 09:22:00.180828 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:22:00 crc kubenswrapper[4886]: I0314 09:22:00.182924 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 09:22:00 crc kubenswrapper[4886]: I0314 09:22:00.191220 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558002-gk6tg"] Mar 14 09:22:00 crc kubenswrapper[4886]: I0314 09:22:00.346559 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99lmt\" (UniqueName: \"kubernetes.io/projected/29680d59-2279-4f28-aba6-b3e41fd622e1-kube-api-access-99lmt\") pod \"auto-csr-approver-29558002-gk6tg\" (UID: \"29680d59-2279-4f28-aba6-b3e41fd622e1\") " pod="openshift-infra/auto-csr-approver-29558002-gk6tg" Mar 14 09:22:00 crc kubenswrapper[4886]: I0314 09:22:00.449152 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99lmt\" (UniqueName: \"kubernetes.io/projected/29680d59-2279-4f28-aba6-b3e41fd622e1-kube-api-access-99lmt\") pod \"auto-csr-approver-29558002-gk6tg\" (UID: \"29680d59-2279-4f28-aba6-b3e41fd622e1\") " pod="openshift-infra/auto-csr-approver-29558002-gk6tg" Mar 14 09:22:00 crc kubenswrapper[4886]: I0314 09:22:00.476308 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99lmt\" (UniqueName: \"kubernetes.io/projected/29680d59-2279-4f28-aba6-b3e41fd622e1-kube-api-access-99lmt\") pod \"auto-csr-approver-29558002-gk6tg\" (UID: \"29680d59-2279-4f28-aba6-b3e41fd622e1\") " 
pod="openshift-infra/auto-csr-approver-29558002-gk6tg" Mar 14 09:22:00 crc kubenswrapper[4886]: I0314 09:22:00.495285 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558002-gk6tg" Mar 14 09:22:00 crc kubenswrapper[4886]: I0314 09:22:00.962968 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558002-gk6tg"] Mar 14 09:22:00 crc kubenswrapper[4886]: I0314 09:22:00.969858 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 09:22:01 crc kubenswrapper[4886]: I0314 09:22:01.559363 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558002-gk6tg" event={"ID":"29680d59-2279-4f28-aba6-b3e41fd622e1","Type":"ContainerStarted","Data":"010e908db5da58b0c02c9e1f09c85bf2e4e6dc4b9be39fffeef00a7e8aa35772"} Mar 14 09:22:02 crc kubenswrapper[4886]: I0314 09:22:02.568725 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558002-gk6tg" event={"ID":"29680d59-2279-4f28-aba6-b3e41fd622e1","Type":"ContainerStarted","Data":"818e2f6fa2781d837b946858def1009e991e3b3bef64ccec5ade2343bbf6f83e"} Mar 14 09:22:02 crc kubenswrapper[4886]: I0314 09:22:02.585143 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29558002-gk6tg" podStartSLOduration=1.428404333 podStartE2EDuration="2.585103956s" podCreationTimestamp="2026-03-14 09:22:00 +0000 UTC" firstStartedPulling="2026-03-14 09:22:00.969293022 +0000 UTC m=+3256.217744659" lastFinishedPulling="2026-03-14 09:22:02.125992635 +0000 UTC m=+3257.374444282" observedRunningTime="2026-03-14 09:22:02.580002331 +0000 UTC m=+3257.828453958" watchObservedRunningTime="2026-03-14 09:22:02.585103956 +0000 UTC m=+3257.833555603" Mar 14 09:22:03 crc kubenswrapper[4886]: I0314 09:22:03.579346 4886 generic.go:334] "Generic (PLEG): container finished" 
podID="29680d59-2279-4f28-aba6-b3e41fd622e1" containerID="818e2f6fa2781d837b946858def1009e991e3b3bef64ccec5ade2343bbf6f83e" exitCode=0 Mar 14 09:22:03 crc kubenswrapper[4886]: I0314 09:22:03.579394 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558002-gk6tg" event={"ID":"29680d59-2279-4f28-aba6-b3e41fd622e1","Type":"ContainerDied","Data":"818e2f6fa2781d837b946858def1009e991e3b3bef64ccec5ade2343bbf6f83e"} Mar 14 09:22:05 crc kubenswrapper[4886]: I0314 09:22:05.084955 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558002-gk6tg" Mar 14 09:22:05 crc kubenswrapper[4886]: I0314 09:22:05.172295 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99lmt\" (UniqueName: \"kubernetes.io/projected/29680d59-2279-4f28-aba6-b3e41fd622e1-kube-api-access-99lmt\") pod \"29680d59-2279-4f28-aba6-b3e41fd622e1\" (UID: \"29680d59-2279-4f28-aba6-b3e41fd622e1\") " Mar 14 09:22:05 crc kubenswrapper[4886]: I0314 09:22:05.183562 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29680d59-2279-4f28-aba6-b3e41fd622e1-kube-api-access-99lmt" (OuterVolumeSpecName: "kube-api-access-99lmt") pod "29680d59-2279-4f28-aba6-b3e41fd622e1" (UID: "29680d59-2279-4f28-aba6-b3e41fd622e1"). InnerVolumeSpecName "kube-api-access-99lmt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:22:05 crc kubenswrapper[4886]: I0314 09:22:05.275164 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99lmt\" (UniqueName: \"kubernetes.io/projected/29680d59-2279-4f28-aba6-b3e41fd622e1-kube-api-access-99lmt\") on node \"crc\" DevicePath \"\"" Mar 14 09:22:05 crc kubenswrapper[4886]: I0314 09:22:05.428097 4886 scope.go:117] "RemoveContainer" containerID="5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba" Mar 14 09:22:05 crc kubenswrapper[4886]: I0314 09:22:05.602780 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558002-gk6tg" event={"ID":"29680d59-2279-4f28-aba6-b3e41fd622e1","Type":"ContainerDied","Data":"010e908db5da58b0c02c9e1f09c85bf2e4e6dc4b9be39fffeef00a7e8aa35772"} Mar 14 09:22:05 crc kubenswrapper[4886]: I0314 09:22:05.603111 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="010e908db5da58b0c02c9e1f09c85bf2e4e6dc4b9be39fffeef00a7e8aa35772" Mar 14 09:22:05 crc kubenswrapper[4886]: I0314 09:22:05.602830 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558002-gk6tg" Mar 14 09:22:05 crc kubenswrapper[4886]: I0314 09:22:05.681495 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557996-7cp2k"] Mar 14 09:22:05 crc kubenswrapper[4886]: I0314 09:22:05.695379 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557996-7cp2k"] Mar 14 09:22:06 crc kubenswrapper[4886]: I0314 09:22:06.613929 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerStarted","Data":"cd6387aefbce165fc3b494f0eeffc43316107d6f7697eadc4f4e5177687d48f9"} Mar 14 09:22:07 crc kubenswrapper[4886]: I0314 09:22:07.438961 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd34e59d-c7f3-48bd-bbc0-d71723e4f1ae" path="/var/lib/kubelet/pods/fd34e59d-c7f3-48bd-bbc0-d71723e4f1ae/volumes" Mar 14 09:22:35 crc kubenswrapper[4886]: I0314 09:22:35.950184 4886 scope.go:117] "RemoveContainer" containerID="a3f27af38b9cecc66d079900902015352c8657e68c43fd7eceba26903b7b9d6f" Mar 14 09:24:00 crc kubenswrapper[4886]: I0314 09:24:00.158133 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558004-4schh"] Mar 14 09:24:00 crc kubenswrapper[4886]: E0314 09:24:00.159674 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29680d59-2279-4f28-aba6-b3e41fd622e1" containerName="oc" Mar 14 09:24:00 crc kubenswrapper[4886]: I0314 09:24:00.159696 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="29680d59-2279-4f28-aba6-b3e41fd622e1" containerName="oc" Mar 14 09:24:00 crc kubenswrapper[4886]: I0314 09:24:00.160028 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="29680d59-2279-4f28-aba6-b3e41fd622e1" containerName="oc" Mar 14 09:24:00 crc kubenswrapper[4886]: I0314 09:24:00.161309 4886 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558004-4schh" Mar 14 09:24:00 crc kubenswrapper[4886]: I0314 09:24:00.164341 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 09:24:00 crc kubenswrapper[4886]: I0314 09:24:00.164806 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:24:00 crc kubenswrapper[4886]: I0314 09:24:00.166047 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:24:00 crc kubenswrapper[4886]: I0314 09:24:00.173906 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558004-4schh"] Mar 14 09:24:00 crc kubenswrapper[4886]: I0314 09:24:00.291910 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh77l\" (UniqueName: \"kubernetes.io/projected/46b443cc-0bf1-4fa4-890d-60a52d4c14b3-kube-api-access-wh77l\") pod \"auto-csr-approver-29558004-4schh\" (UID: \"46b443cc-0bf1-4fa4-890d-60a52d4c14b3\") " pod="openshift-infra/auto-csr-approver-29558004-4schh" Mar 14 09:24:00 crc kubenswrapper[4886]: I0314 09:24:00.393937 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh77l\" (UniqueName: \"kubernetes.io/projected/46b443cc-0bf1-4fa4-890d-60a52d4c14b3-kube-api-access-wh77l\") pod \"auto-csr-approver-29558004-4schh\" (UID: \"46b443cc-0bf1-4fa4-890d-60a52d4c14b3\") " pod="openshift-infra/auto-csr-approver-29558004-4schh" Mar 14 09:24:00 crc kubenswrapper[4886]: I0314 09:24:00.421233 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh77l\" (UniqueName: \"kubernetes.io/projected/46b443cc-0bf1-4fa4-890d-60a52d4c14b3-kube-api-access-wh77l\") pod \"auto-csr-approver-29558004-4schh\" (UID: 
\"46b443cc-0bf1-4fa4-890d-60a52d4c14b3\") " pod="openshift-infra/auto-csr-approver-29558004-4schh" Mar 14 09:24:00 crc kubenswrapper[4886]: I0314 09:24:00.483699 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558004-4schh" Mar 14 09:24:00 crc kubenswrapper[4886]: I0314 09:24:00.977366 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558004-4schh"] Mar 14 09:24:00 crc kubenswrapper[4886]: W0314 09:24:00.989049 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46b443cc_0bf1_4fa4_890d_60a52d4c14b3.slice/crio-e3fcd52153a00a7b0fe21ef2fd9f5a76c06db8c5a6b78597773f7bb6be2c825a WatchSource:0}: Error finding container e3fcd52153a00a7b0fe21ef2fd9f5a76c06db8c5a6b78597773f7bb6be2c825a: Status 404 returned error can't find the container with id e3fcd52153a00a7b0fe21ef2fd9f5a76c06db8c5a6b78597773f7bb6be2c825a Mar 14 09:24:01 crc kubenswrapper[4886]: I0314 09:24:01.717545 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558004-4schh" event={"ID":"46b443cc-0bf1-4fa4-890d-60a52d4c14b3","Type":"ContainerStarted","Data":"e3fcd52153a00a7b0fe21ef2fd9f5a76c06db8c5a6b78597773f7bb6be2c825a"} Mar 14 09:24:02 crc kubenswrapper[4886]: I0314 09:24:02.725807 4886 generic.go:334] "Generic (PLEG): container finished" podID="46b443cc-0bf1-4fa4-890d-60a52d4c14b3" containerID="88b3a786c5e5457174b81479bd5672a9f30aeb3474291ee632512ec99043a3f1" exitCode=0 Mar 14 09:24:02 crc kubenswrapper[4886]: I0314 09:24:02.725868 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558004-4schh" event={"ID":"46b443cc-0bf1-4fa4-890d-60a52d4c14b3","Type":"ContainerDied","Data":"88b3a786c5e5457174b81479bd5672a9f30aeb3474291ee632512ec99043a3f1"} Mar 14 09:24:04 crc kubenswrapper[4886]: I0314 09:24:04.104717 4886 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558004-4schh" Mar 14 09:24:04 crc kubenswrapper[4886]: I0314 09:24:04.283363 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh77l\" (UniqueName: \"kubernetes.io/projected/46b443cc-0bf1-4fa4-890d-60a52d4c14b3-kube-api-access-wh77l\") pod \"46b443cc-0bf1-4fa4-890d-60a52d4c14b3\" (UID: \"46b443cc-0bf1-4fa4-890d-60a52d4c14b3\") " Mar 14 09:24:04 crc kubenswrapper[4886]: I0314 09:24:04.291256 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46b443cc-0bf1-4fa4-890d-60a52d4c14b3-kube-api-access-wh77l" (OuterVolumeSpecName: "kube-api-access-wh77l") pod "46b443cc-0bf1-4fa4-890d-60a52d4c14b3" (UID: "46b443cc-0bf1-4fa4-890d-60a52d4c14b3"). InnerVolumeSpecName "kube-api-access-wh77l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:24:04 crc kubenswrapper[4886]: I0314 09:24:04.386539 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh77l\" (UniqueName: \"kubernetes.io/projected/46b443cc-0bf1-4fa4-890d-60a52d4c14b3-kube-api-access-wh77l\") on node \"crc\" DevicePath \"\"" Mar 14 09:24:04 crc kubenswrapper[4886]: I0314 09:24:04.754820 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558004-4schh" event={"ID":"46b443cc-0bf1-4fa4-890d-60a52d4c14b3","Type":"ContainerDied","Data":"e3fcd52153a00a7b0fe21ef2fd9f5a76c06db8c5a6b78597773f7bb6be2c825a"} Mar 14 09:24:04 crc kubenswrapper[4886]: I0314 09:24:04.755378 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3fcd52153a00a7b0fe21ef2fd9f5a76c06db8c5a6b78597773f7bb6be2c825a" Mar 14 09:24:04 crc kubenswrapper[4886]: I0314 09:24:04.755443 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558004-4schh" Mar 14 09:24:05 crc kubenswrapper[4886]: I0314 09:24:05.180447 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557998-68vzq"] Mar 14 09:24:05 crc kubenswrapper[4886]: I0314 09:24:05.188704 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557998-68vzq"] Mar 14 09:24:05 crc kubenswrapper[4886]: I0314 09:24:05.432112 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf1dc598-0f17-412b-a931-b2b36433ac86" path="/var/lib/kubelet/pods/bf1dc598-0f17-412b-a931-b2b36433ac86/volumes" Mar 14 09:24:26 crc kubenswrapper[4886]: I0314 09:24:26.066281 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:24:26 crc kubenswrapper[4886]: I0314 09:24:26.066881 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:24:36 crc kubenswrapper[4886]: I0314 09:24:36.096369 4886 scope.go:117] "RemoveContainer" containerID="8249cfb299bb9e98c88d880fef326340ba11702d8a0217605e5c4c0669df38bb" Mar 14 09:24:56 crc kubenswrapper[4886]: I0314 09:24:56.066071 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:24:56 crc kubenswrapper[4886]: 
I0314 09:24:56.066772 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:25:26 crc kubenswrapper[4886]: I0314 09:25:26.065635 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:25:26 crc kubenswrapper[4886]: I0314 09:25:26.066424 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:25:26 crc kubenswrapper[4886]: I0314 09:25:26.066498 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 09:25:26 crc kubenswrapper[4886]: I0314 09:25:26.067512 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cd6387aefbce165fc3b494f0eeffc43316107d6f7697eadc4f4e5177687d48f9"} pod="openshift-machine-config-operator/machine-config-daemon-ddctv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 09:25:26 crc kubenswrapper[4886]: I0314 09:25:26.067578 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" 
containerName="machine-config-daemon" containerID="cri-o://cd6387aefbce165fc3b494f0eeffc43316107d6f7697eadc4f4e5177687d48f9" gracePeriod=600 Mar 14 09:25:27 crc kubenswrapper[4886]: I0314 09:25:27.599972 4886 generic.go:334] "Generic (PLEG): container finished" podID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerID="cd6387aefbce165fc3b494f0eeffc43316107d6f7697eadc4f4e5177687d48f9" exitCode=0 Mar 14 09:25:27 crc kubenswrapper[4886]: I0314 09:25:27.600648 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerDied","Data":"cd6387aefbce165fc3b494f0eeffc43316107d6f7697eadc4f4e5177687d48f9"} Mar 14 09:25:27 crc kubenswrapper[4886]: I0314 09:25:27.600683 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerStarted","Data":"8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee"} Mar 14 09:25:27 crc kubenswrapper[4886]: I0314 09:25:27.600706 4886 scope.go:117] "RemoveContainer" containerID="5e4f4b427446a979a09695fedf2c2b8cb6d79276728ee2dd8e10c41fa4b003ba" Mar 14 09:26:00 crc kubenswrapper[4886]: I0314 09:26:00.148149 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558006-5hvlv"] Mar 14 09:26:00 crc kubenswrapper[4886]: E0314 09:26:00.149080 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46b443cc-0bf1-4fa4-890d-60a52d4c14b3" containerName="oc" Mar 14 09:26:00 crc kubenswrapper[4886]: I0314 09:26:00.149093 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="46b443cc-0bf1-4fa4-890d-60a52d4c14b3" containerName="oc" Mar 14 09:26:00 crc kubenswrapper[4886]: I0314 09:26:00.149348 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="46b443cc-0bf1-4fa4-890d-60a52d4c14b3" containerName="oc" Mar 14 09:26:00 
crc kubenswrapper[4886]: I0314 09:26:00.150163 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558006-5hvlv" Mar 14 09:26:00 crc kubenswrapper[4886]: I0314 09:26:00.152714 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:26:00 crc kubenswrapper[4886]: I0314 09:26:00.154414 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 09:26:00 crc kubenswrapper[4886]: I0314 09:26:00.154504 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:26:00 crc kubenswrapper[4886]: I0314 09:26:00.157904 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558006-5hvlv"] Mar 14 09:26:00 crc kubenswrapper[4886]: I0314 09:26:00.286871 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x9sg\" (UniqueName: \"kubernetes.io/projected/5620551c-b879-4c48-9750-5680d63676ce-kube-api-access-6x9sg\") pod \"auto-csr-approver-29558006-5hvlv\" (UID: \"5620551c-b879-4c48-9750-5680d63676ce\") " pod="openshift-infra/auto-csr-approver-29558006-5hvlv" Mar 14 09:26:00 crc kubenswrapper[4886]: I0314 09:26:00.388706 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x9sg\" (UniqueName: \"kubernetes.io/projected/5620551c-b879-4c48-9750-5680d63676ce-kube-api-access-6x9sg\") pod \"auto-csr-approver-29558006-5hvlv\" (UID: \"5620551c-b879-4c48-9750-5680d63676ce\") " pod="openshift-infra/auto-csr-approver-29558006-5hvlv" Mar 14 09:26:00 crc kubenswrapper[4886]: I0314 09:26:00.411939 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x9sg\" (UniqueName: \"kubernetes.io/projected/5620551c-b879-4c48-9750-5680d63676ce-kube-api-access-6x9sg\") 
pod \"auto-csr-approver-29558006-5hvlv\" (UID: \"5620551c-b879-4c48-9750-5680d63676ce\") " pod="openshift-infra/auto-csr-approver-29558006-5hvlv" Mar 14 09:26:00 crc kubenswrapper[4886]: I0314 09:26:00.465992 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558006-5hvlv" Mar 14 09:26:00 crc kubenswrapper[4886]: I0314 09:26:00.941718 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558006-5hvlv"] Mar 14 09:26:01 crc kubenswrapper[4886]: I0314 09:26:01.412312 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558006-5hvlv" event={"ID":"5620551c-b879-4c48-9750-5680d63676ce","Type":"ContainerStarted","Data":"1642ede75b0b8359960ff24faff90e3e5a5c7608549c51fdade6a8a8806113c6"} Mar 14 09:26:02 crc kubenswrapper[4886]: I0314 09:26:02.422617 4886 generic.go:334] "Generic (PLEG): container finished" podID="5620551c-b879-4c48-9750-5680d63676ce" containerID="f265b11ffe6cab851fd431528e2d42d177708fe1ecfba753a4b7255205c020a0" exitCode=0 Mar 14 09:26:02 crc kubenswrapper[4886]: I0314 09:26:02.422721 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558006-5hvlv" event={"ID":"5620551c-b879-4c48-9750-5680d63676ce","Type":"ContainerDied","Data":"f265b11ffe6cab851fd431528e2d42d177708fe1ecfba753a4b7255205c020a0"} Mar 14 09:26:03 crc kubenswrapper[4886]: I0314 09:26:03.824367 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558006-5hvlv" Mar 14 09:26:03 crc kubenswrapper[4886]: I0314 09:26:03.967624 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x9sg\" (UniqueName: \"kubernetes.io/projected/5620551c-b879-4c48-9750-5680d63676ce-kube-api-access-6x9sg\") pod \"5620551c-b879-4c48-9750-5680d63676ce\" (UID: \"5620551c-b879-4c48-9750-5680d63676ce\") " Mar 14 09:26:03 crc kubenswrapper[4886]: I0314 09:26:03.976143 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5620551c-b879-4c48-9750-5680d63676ce-kube-api-access-6x9sg" (OuterVolumeSpecName: "kube-api-access-6x9sg") pod "5620551c-b879-4c48-9750-5680d63676ce" (UID: "5620551c-b879-4c48-9750-5680d63676ce"). InnerVolumeSpecName "kube-api-access-6x9sg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:26:04 crc kubenswrapper[4886]: I0314 09:26:04.070568 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x9sg\" (UniqueName: \"kubernetes.io/projected/5620551c-b879-4c48-9750-5680d63676ce-kube-api-access-6x9sg\") on node \"crc\" DevicePath \"\"" Mar 14 09:26:04 crc kubenswrapper[4886]: I0314 09:26:04.442364 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558006-5hvlv" event={"ID":"5620551c-b879-4c48-9750-5680d63676ce","Type":"ContainerDied","Data":"1642ede75b0b8359960ff24faff90e3e5a5c7608549c51fdade6a8a8806113c6"} Mar 14 09:26:04 crc kubenswrapper[4886]: I0314 09:26:04.442670 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1642ede75b0b8359960ff24faff90e3e5a5c7608549c51fdade6a8a8806113c6" Mar 14 09:26:04 crc kubenswrapper[4886]: I0314 09:26:04.442415 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558006-5hvlv" Mar 14 09:26:04 crc kubenswrapper[4886]: I0314 09:26:04.899745 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558000-ck7f2"] Mar 14 09:26:04 crc kubenswrapper[4886]: I0314 09:26:04.912051 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558000-ck7f2"] Mar 14 09:26:05 crc kubenswrapper[4886]: I0314 09:26:05.441883 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21cfad98-0ad0-471a-bad2-23a600a2f434" path="/var/lib/kubelet/pods/21cfad98-0ad0-471a-bad2-23a600a2f434/volumes" Mar 14 09:26:13 crc kubenswrapper[4886]: I0314 09:26:13.998180 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p4lvw"] Mar 14 09:26:13 crc kubenswrapper[4886]: E0314 09:26:13.999319 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5620551c-b879-4c48-9750-5680d63676ce" containerName="oc" Mar 14 09:26:13 crc kubenswrapper[4886]: I0314 09:26:13.999340 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="5620551c-b879-4c48-9750-5680d63676ce" containerName="oc" Mar 14 09:26:13 crc kubenswrapper[4886]: I0314 09:26:13.999563 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="5620551c-b879-4c48-9750-5680d63676ce" containerName="oc" Mar 14 09:26:14 crc kubenswrapper[4886]: I0314 09:26:14.001180 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p4lvw" Mar 14 09:26:14 crc kubenswrapper[4886]: I0314 09:26:14.014287 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p4lvw"] Mar 14 09:26:14 crc kubenswrapper[4886]: I0314 09:26:14.110317 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5f6ba59-dcc2-4626-a21e-8296cd841050-utilities\") pod \"certified-operators-p4lvw\" (UID: \"b5f6ba59-dcc2-4626-a21e-8296cd841050\") " pod="openshift-marketplace/certified-operators-p4lvw" Mar 14 09:26:14 crc kubenswrapper[4886]: I0314 09:26:14.110757 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5f6ba59-dcc2-4626-a21e-8296cd841050-catalog-content\") pod \"certified-operators-p4lvw\" (UID: \"b5f6ba59-dcc2-4626-a21e-8296cd841050\") " pod="openshift-marketplace/certified-operators-p4lvw" Mar 14 09:26:14 crc kubenswrapper[4886]: I0314 09:26:14.110805 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7x7b\" (UniqueName: \"kubernetes.io/projected/b5f6ba59-dcc2-4626-a21e-8296cd841050-kube-api-access-w7x7b\") pod \"certified-operators-p4lvw\" (UID: \"b5f6ba59-dcc2-4626-a21e-8296cd841050\") " pod="openshift-marketplace/certified-operators-p4lvw" Mar 14 09:26:14 crc kubenswrapper[4886]: I0314 09:26:14.213164 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5f6ba59-dcc2-4626-a21e-8296cd841050-catalog-content\") pod \"certified-operators-p4lvw\" (UID: \"b5f6ba59-dcc2-4626-a21e-8296cd841050\") " pod="openshift-marketplace/certified-operators-p4lvw" Mar 14 09:26:14 crc kubenswrapper[4886]: I0314 09:26:14.213223 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-w7x7b\" (UniqueName: \"kubernetes.io/projected/b5f6ba59-dcc2-4626-a21e-8296cd841050-kube-api-access-w7x7b\") pod \"certified-operators-p4lvw\" (UID: \"b5f6ba59-dcc2-4626-a21e-8296cd841050\") " pod="openshift-marketplace/certified-operators-p4lvw" Mar 14 09:26:14 crc kubenswrapper[4886]: I0314 09:26:14.213344 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5f6ba59-dcc2-4626-a21e-8296cd841050-utilities\") pod \"certified-operators-p4lvw\" (UID: \"b5f6ba59-dcc2-4626-a21e-8296cd841050\") " pod="openshift-marketplace/certified-operators-p4lvw" Mar 14 09:26:14 crc kubenswrapper[4886]: I0314 09:26:14.213957 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5f6ba59-dcc2-4626-a21e-8296cd841050-utilities\") pod \"certified-operators-p4lvw\" (UID: \"b5f6ba59-dcc2-4626-a21e-8296cd841050\") " pod="openshift-marketplace/certified-operators-p4lvw" Mar 14 09:26:14 crc kubenswrapper[4886]: I0314 09:26:14.213972 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5f6ba59-dcc2-4626-a21e-8296cd841050-catalog-content\") pod \"certified-operators-p4lvw\" (UID: \"b5f6ba59-dcc2-4626-a21e-8296cd841050\") " pod="openshift-marketplace/certified-operators-p4lvw" Mar 14 09:26:14 crc kubenswrapper[4886]: I0314 09:26:14.232195 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7x7b\" (UniqueName: \"kubernetes.io/projected/b5f6ba59-dcc2-4626-a21e-8296cd841050-kube-api-access-w7x7b\") pod \"certified-operators-p4lvw\" (UID: \"b5f6ba59-dcc2-4626-a21e-8296cd841050\") " pod="openshift-marketplace/certified-operators-p4lvw" Mar 14 09:26:14 crc kubenswrapper[4886]: I0314 09:26:14.323940 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p4lvw" Mar 14 09:26:14 crc kubenswrapper[4886]: I0314 09:26:14.815042 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p4lvw"] Mar 14 09:26:15 crc kubenswrapper[4886]: I0314 09:26:15.568075 4886 generic.go:334] "Generic (PLEG): container finished" podID="b5f6ba59-dcc2-4626-a21e-8296cd841050" containerID="eba253053197519f451d87ce929e19d344f1412fcd069ae368e1f3c8f06fbf14" exitCode=0 Mar 14 09:26:15 crc kubenswrapper[4886]: I0314 09:26:15.568158 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4lvw" event={"ID":"b5f6ba59-dcc2-4626-a21e-8296cd841050","Type":"ContainerDied","Data":"eba253053197519f451d87ce929e19d344f1412fcd069ae368e1f3c8f06fbf14"} Mar 14 09:26:15 crc kubenswrapper[4886]: I0314 09:26:15.568449 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4lvw" event={"ID":"b5f6ba59-dcc2-4626-a21e-8296cd841050","Type":"ContainerStarted","Data":"b44d9cffc490c058a8f0464ac24424598efd7605c9000efcc70d161f63ee46c1"} Mar 14 09:26:17 crc kubenswrapper[4886]: I0314 09:26:17.591273 4886 generic.go:334] "Generic (PLEG): container finished" podID="b5f6ba59-dcc2-4626-a21e-8296cd841050" containerID="1a780f30aa9b5ebfe3910cef6db23c9d8d80fd9feb1a55e9844ffadc1d8b56d7" exitCode=0 Mar 14 09:26:17 crc kubenswrapper[4886]: I0314 09:26:17.591336 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4lvw" event={"ID":"b5f6ba59-dcc2-4626-a21e-8296cd841050","Type":"ContainerDied","Data":"1a780f30aa9b5ebfe3910cef6db23c9d8d80fd9feb1a55e9844ffadc1d8b56d7"} Mar 14 09:26:19 crc kubenswrapper[4886]: I0314 09:26:19.614831 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4lvw" 
event={"ID":"b5f6ba59-dcc2-4626-a21e-8296cd841050","Type":"ContainerStarted","Data":"a85f8fa4acb66eed497676bf8a335b99779da64177acb4e0e7c70756ceb6d2bc"} Mar 14 09:26:19 crc kubenswrapper[4886]: I0314 09:26:19.641573 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p4lvw" podStartSLOduration=3.6797984169999998 podStartE2EDuration="6.641553759s" podCreationTimestamp="2026-03-14 09:26:13 +0000 UTC" firstStartedPulling="2026-03-14 09:26:15.570334891 +0000 UTC m=+3510.818786528" lastFinishedPulling="2026-03-14 09:26:18.532090233 +0000 UTC m=+3513.780541870" observedRunningTime="2026-03-14 09:26:19.634727434 +0000 UTC m=+3514.883179091" watchObservedRunningTime="2026-03-14 09:26:19.641553759 +0000 UTC m=+3514.890005386" Mar 14 09:26:24 crc kubenswrapper[4886]: I0314 09:26:24.324312 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p4lvw" Mar 14 09:26:24 crc kubenswrapper[4886]: I0314 09:26:24.324814 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p4lvw" Mar 14 09:26:24 crc kubenswrapper[4886]: I0314 09:26:24.383414 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p4lvw" Mar 14 09:26:24 crc kubenswrapper[4886]: I0314 09:26:24.767314 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p4lvw" Mar 14 09:26:24 crc kubenswrapper[4886]: I0314 09:26:24.851192 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p4lvw"] Mar 14 09:26:26 crc kubenswrapper[4886]: I0314 09:26:26.695896 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p4lvw" podUID="b5f6ba59-dcc2-4626-a21e-8296cd841050" containerName="registry-server" 
containerID="cri-o://a85f8fa4acb66eed497676bf8a335b99779da64177acb4e0e7c70756ceb6d2bc" gracePeriod=2 Mar 14 09:26:27 crc kubenswrapper[4886]: I0314 09:26:27.326850 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4lvw" Mar 14 09:26:27 crc kubenswrapper[4886]: I0314 09:26:27.423602 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5f6ba59-dcc2-4626-a21e-8296cd841050-utilities\") pod \"b5f6ba59-dcc2-4626-a21e-8296cd841050\" (UID: \"b5f6ba59-dcc2-4626-a21e-8296cd841050\") " Mar 14 09:26:27 crc kubenswrapper[4886]: I0314 09:26:27.423689 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5f6ba59-dcc2-4626-a21e-8296cd841050-catalog-content\") pod \"b5f6ba59-dcc2-4626-a21e-8296cd841050\" (UID: \"b5f6ba59-dcc2-4626-a21e-8296cd841050\") " Mar 14 09:26:27 crc kubenswrapper[4886]: I0314 09:26:27.423762 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7x7b\" (UniqueName: \"kubernetes.io/projected/b5f6ba59-dcc2-4626-a21e-8296cd841050-kube-api-access-w7x7b\") pod \"b5f6ba59-dcc2-4626-a21e-8296cd841050\" (UID: \"b5f6ba59-dcc2-4626-a21e-8296cd841050\") " Mar 14 09:26:27 crc kubenswrapper[4886]: I0314 09:26:27.424994 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5f6ba59-dcc2-4626-a21e-8296cd841050-utilities" (OuterVolumeSpecName: "utilities") pod "b5f6ba59-dcc2-4626-a21e-8296cd841050" (UID: "b5f6ba59-dcc2-4626-a21e-8296cd841050"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:26:27 crc kubenswrapper[4886]: I0314 09:26:27.443935 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5f6ba59-dcc2-4626-a21e-8296cd841050-kube-api-access-w7x7b" (OuterVolumeSpecName: "kube-api-access-w7x7b") pod "b5f6ba59-dcc2-4626-a21e-8296cd841050" (UID: "b5f6ba59-dcc2-4626-a21e-8296cd841050"). InnerVolumeSpecName "kube-api-access-w7x7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:26:27 crc kubenswrapper[4886]: I0314 09:26:27.495319 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5f6ba59-dcc2-4626-a21e-8296cd841050-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b5f6ba59-dcc2-4626-a21e-8296cd841050" (UID: "b5f6ba59-dcc2-4626-a21e-8296cd841050"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:26:27 crc kubenswrapper[4886]: I0314 09:26:27.527201 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5f6ba59-dcc2-4626-a21e-8296cd841050-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:26:27 crc kubenswrapper[4886]: I0314 09:26:27.527234 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5f6ba59-dcc2-4626-a21e-8296cd841050-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:26:27 crc kubenswrapper[4886]: I0314 09:26:27.527245 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7x7b\" (UniqueName: \"kubernetes.io/projected/b5f6ba59-dcc2-4626-a21e-8296cd841050-kube-api-access-w7x7b\") on node \"crc\" DevicePath \"\"" Mar 14 09:26:27 crc kubenswrapper[4886]: I0314 09:26:27.706509 4886 generic.go:334] "Generic (PLEG): container finished" podID="b5f6ba59-dcc2-4626-a21e-8296cd841050" 
containerID="a85f8fa4acb66eed497676bf8a335b99779da64177acb4e0e7c70756ceb6d2bc" exitCode=0 Mar 14 09:26:27 crc kubenswrapper[4886]: I0314 09:26:27.706556 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4lvw" event={"ID":"b5f6ba59-dcc2-4626-a21e-8296cd841050","Type":"ContainerDied","Data":"a85f8fa4acb66eed497676bf8a335b99779da64177acb4e0e7c70756ceb6d2bc"} Mar 14 09:26:27 crc kubenswrapper[4886]: I0314 09:26:27.706569 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4lvw" Mar 14 09:26:27 crc kubenswrapper[4886]: I0314 09:26:27.706588 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4lvw" event={"ID":"b5f6ba59-dcc2-4626-a21e-8296cd841050","Type":"ContainerDied","Data":"b44d9cffc490c058a8f0464ac24424598efd7605c9000efcc70d161f63ee46c1"} Mar 14 09:26:27 crc kubenswrapper[4886]: I0314 09:26:27.706608 4886 scope.go:117] "RemoveContainer" containerID="a85f8fa4acb66eed497676bf8a335b99779da64177acb4e0e7c70756ceb6d2bc" Mar 14 09:26:27 crc kubenswrapper[4886]: I0314 09:26:27.732219 4886 scope.go:117] "RemoveContainer" containerID="1a780f30aa9b5ebfe3910cef6db23c9d8d80fd9feb1a55e9844ffadc1d8b56d7" Mar 14 09:26:27 crc kubenswrapper[4886]: I0314 09:26:27.755032 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p4lvw"] Mar 14 09:26:27 crc kubenswrapper[4886]: I0314 09:26:27.771203 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p4lvw"] Mar 14 09:26:27 crc kubenswrapper[4886]: I0314 09:26:27.776320 4886 scope.go:117] "RemoveContainer" containerID="eba253053197519f451d87ce929e19d344f1412fcd069ae368e1f3c8f06fbf14" Mar 14 09:26:27 crc kubenswrapper[4886]: I0314 09:26:27.813268 4886 scope.go:117] "RemoveContainer" containerID="a85f8fa4acb66eed497676bf8a335b99779da64177acb4e0e7c70756ceb6d2bc" Mar 14 
09:26:27 crc kubenswrapper[4886]: E0314 09:26:27.813807 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a85f8fa4acb66eed497676bf8a335b99779da64177acb4e0e7c70756ceb6d2bc\": container with ID starting with a85f8fa4acb66eed497676bf8a335b99779da64177acb4e0e7c70756ceb6d2bc not found: ID does not exist" containerID="a85f8fa4acb66eed497676bf8a335b99779da64177acb4e0e7c70756ceb6d2bc" Mar 14 09:26:27 crc kubenswrapper[4886]: I0314 09:26:27.813869 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a85f8fa4acb66eed497676bf8a335b99779da64177acb4e0e7c70756ceb6d2bc"} err="failed to get container status \"a85f8fa4acb66eed497676bf8a335b99779da64177acb4e0e7c70756ceb6d2bc\": rpc error: code = NotFound desc = could not find container \"a85f8fa4acb66eed497676bf8a335b99779da64177acb4e0e7c70756ceb6d2bc\": container with ID starting with a85f8fa4acb66eed497676bf8a335b99779da64177acb4e0e7c70756ceb6d2bc not found: ID does not exist" Mar 14 09:26:27 crc kubenswrapper[4886]: I0314 09:26:27.813899 4886 scope.go:117] "RemoveContainer" containerID="1a780f30aa9b5ebfe3910cef6db23c9d8d80fd9feb1a55e9844ffadc1d8b56d7" Mar 14 09:26:27 crc kubenswrapper[4886]: E0314 09:26:27.814376 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a780f30aa9b5ebfe3910cef6db23c9d8d80fd9feb1a55e9844ffadc1d8b56d7\": container with ID starting with 1a780f30aa9b5ebfe3910cef6db23c9d8d80fd9feb1a55e9844ffadc1d8b56d7 not found: ID does not exist" containerID="1a780f30aa9b5ebfe3910cef6db23c9d8d80fd9feb1a55e9844ffadc1d8b56d7" Mar 14 09:26:27 crc kubenswrapper[4886]: I0314 09:26:27.814408 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a780f30aa9b5ebfe3910cef6db23c9d8d80fd9feb1a55e9844ffadc1d8b56d7"} err="failed to get container status 
\"1a780f30aa9b5ebfe3910cef6db23c9d8d80fd9feb1a55e9844ffadc1d8b56d7\": rpc error: code = NotFound desc = could not find container \"1a780f30aa9b5ebfe3910cef6db23c9d8d80fd9feb1a55e9844ffadc1d8b56d7\": container with ID starting with 1a780f30aa9b5ebfe3910cef6db23c9d8d80fd9feb1a55e9844ffadc1d8b56d7 not found: ID does not exist" Mar 14 09:26:27 crc kubenswrapper[4886]: I0314 09:26:27.814430 4886 scope.go:117] "RemoveContainer" containerID="eba253053197519f451d87ce929e19d344f1412fcd069ae368e1f3c8f06fbf14" Mar 14 09:26:27 crc kubenswrapper[4886]: E0314 09:26:27.814760 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eba253053197519f451d87ce929e19d344f1412fcd069ae368e1f3c8f06fbf14\": container with ID starting with eba253053197519f451d87ce929e19d344f1412fcd069ae368e1f3c8f06fbf14 not found: ID does not exist" containerID="eba253053197519f451d87ce929e19d344f1412fcd069ae368e1f3c8f06fbf14" Mar 14 09:26:27 crc kubenswrapper[4886]: I0314 09:26:27.814783 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eba253053197519f451d87ce929e19d344f1412fcd069ae368e1f3c8f06fbf14"} err="failed to get container status \"eba253053197519f451d87ce929e19d344f1412fcd069ae368e1f3c8f06fbf14\": rpc error: code = NotFound desc = could not find container \"eba253053197519f451d87ce929e19d344f1412fcd069ae368e1f3c8f06fbf14\": container with ID starting with eba253053197519f451d87ce929e19d344f1412fcd069ae368e1f3c8f06fbf14 not found: ID does not exist" Mar 14 09:26:29 crc kubenswrapper[4886]: I0314 09:26:29.434282 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5f6ba59-dcc2-4626-a21e-8296cd841050" path="/var/lib/kubelet/pods/b5f6ba59-dcc2-4626-a21e-8296cd841050/volumes" Mar 14 09:26:36 crc kubenswrapper[4886]: I0314 09:26:36.197041 4886 scope.go:117] "RemoveContainer" containerID="fe451428c85793e7b9a29a42c1ab49c8425d6d77813a5287b0b77a2fd9e55c5b" Mar 14 
09:27:06 crc kubenswrapper[4886]: I0314 09:27:06.739722 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pvjk8"] Mar 14 09:27:06 crc kubenswrapper[4886]: E0314 09:27:06.741979 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5f6ba59-dcc2-4626-a21e-8296cd841050" containerName="registry-server" Mar 14 09:27:06 crc kubenswrapper[4886]: I0314 09:27:06.741998 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f6ba59-dcc2-4626-a21e-8296cd841050" containerName="registry-server" Mar 14 09:27:06 crc kubenswrapper[4886]: E0314 09:27:06.742011 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5f6ba59-dcc2-4626-a21e-8296cd841050" containerName="extract-content" Mar 14 09:27:06 crc kubenswrapper[4886]: I0314 09:27:06.742017 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f6ba59-dcc2-4626-a21e-8296cd841050" containerName="extract-content" Mar 14 09:27:06 crc kubenswrapper[4886]: E0314 09:27:06.742044 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5f6ba59-dcc2-4626-a21e-8296cd841050" containerName="extract-utilities" Mar 14 09:27:06 crc kubenswrapper[4886]: I0314 09:27:06.742050 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f6ba59-dcc2-4626-a21e-8296cd841050" containerName="extract-utilities" Mar 14 09:27:06 crc kubenswrapper[4886]: I0314 09:27:06.742275 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5f6ba59-dcc2-4626-a21e-8296cd841050" containerName="registry-server" Mar 14 09:27:06 crc kubenswrapper[4886]: I0314 09:27:06.743641 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvjk8" Mar 14 09:27:06 crc kubenswrapper[4886]: I0314 09:27:06.768467 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvjk8"] Mar 14 09:27:06 crc kubenswrapper[4886]: I0314 09:27:06.892666 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63bcaa9a-7724-4771-b64f-b6a7d5f163b4-utilities\") pod \"redhat-marketplace-pvjk8\" (UID: \"63bcaa9a-7724-4771-b64f-b6a7d5f163b4\") " pod="openshift-marketplace/redhat-marketplace-pvjk8" Mar 14 09:27:06 crc kubenswrapper[4886]: I0314 09:27:06.892916 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63bcaa9a-7724-4771-b64f-b6a7d5f163b4-catalog-content\") pod \"redhat-marketplace-pvjk8\" (UID: \"63bcaa9a-7724-4771-b64f-b6a7d5f163b4\") " pod="openshift-marketplace/redhat-marketplace-pvjk8" Mar 14 09:27:06 crc kubenswrapper[4886]: I0314 09:27:06.893073 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc27r\" (UniqueName: \"kubernetes.io/projected/63bcaa9a-7724-4771-b64f-b6a7d5f163b4-kube-api-access-bc27r\") pod \"redhat-marketplace-pvjk8\" (UID: \"63bcaa9a-7724-4771-b64f-b6a7d5f163b4\") " pod="openshift-marketplace/redhat-marketplace-pvjk8" Mar 14 09:27:06 crc kubenswrapper[4886]: I0314 09:27:06.995972 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63bcaa9a-7724-4771-b64f-b6a7d5f163b4-catalog-content\") pod \"redhat-marketplace-pvjk8\" (UID: \"63bcaa9a-7724-4771-b64f-b6a7d5f163b4\") " pod="openshift-marketplace/redhat-marketplace-pvjk8" Mar 14 09:27:06 crc kubenswrapper[4886]: I0314 09:27:06.996085 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bc27r\" (UniqueName: \"kubernetes.io/projected/63bcaa9a-7724-4771-b64f-b6a7d5f163b4-kube-api-access-bc27r\") pod \"redhat-marketplace-pvjk8\" (UID: \"63bcaa9a-7724-4771-b64f-b6a7d5f163b4\") " pod="openshift-marketplace/redhat-marketplace-pvjk8" Mar 14 09:27:06 crc kubenswrapper[4886]: I0314 09:27:06.996334 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63bcaa9a-7724-4771-b64f-b6a7d5f163b4-utilities\") pod \"redhat-marketplace-pvjk8\" (UID: \"63bcaa9a-7724-4771-b64f-b6a7d5f163b4\") " pod="openshift-marketplace/redhat-marketplace-pvjk8" Mar 14 09:27:06 crc kubenswrapper[4886]: I0314 09:27:06.997045 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63bcaa9a-7724-4771-b64f-b6a7d5f163b4-utilities\") pod \"redhat-marketplace-pvjk8\" (UID: \"63bcaa9a-7724-4771-b64f-b6a7d5f163b4\") " pod="openshift-marketplace/redhat-marketplace-pvjk8" Mar 14 09:27:06 crc kubenswrapper[4886]: I0314 09:27:06.997046 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63bcaa9a-7724-4771-b64f-b6a7d5f163b4-catalog-content\") pod \"redhat-marketplace-pvjk8\" (UID: \"63bcaa9a-7724-4771-b64f-b6a7d5f163b4\") " pod="openshift-marketplace/redhat-marketplace-pvjk8" Mar 14 09:27:07 crc kubenswrapper[4886]: I0314 09:27:07.023005 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc27r\" (UniqueName: \"kubernetes.io/projected/63bcaa9a-7724-4771-b64f-b6a7d5f163b4-kube-api-access-bc27r\") pod \"redhat-marketplace-pvjk8\" (UID: \"63bcaa9a-7724-4771-b64f-b6a7d5f163b4\") " pod="openshift-marketplace/redhat-marketplace-pvjk8" Mar 14 09:27:07 crc kubenswrapper[4886]: I0314 09:27:07.066199 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvjk8" Mar 14 09:27:07 crc kubenswrapper[4886]: I0314 09:27:07.522564 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvjk8"] Mar 14 09:27:08 crc kubenswrapper[4886]: I0314 09:27:08.101563 4886 generic.go:334] "Generic (PLEG): container finished" podID="63bcaa9a-7724-4771-b64f-b6a7d5f163b4" containerID="d3e040d876dfadc1e377234b1eabe895c32aeff566c00da18ccc2edf08d335e8" exitCode=0 Mar 14 09:27:08 crc kubenswrapper[4886]: I0314 09:27:08.101646 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvjk8" event={"ID":"63bcaa9a-7724-4771-b64f-b6a7d5f163b4","Type":"ContainerDied","Data":"d3e040d876dfadc1e377234b1eabe895c32aeff566c00da18ccc2edf08d335e8"} Mar 14 09:27:08 crc kubenswrapper[4886]: I0314 09:27:08.101870 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvjk8" event={"ID":"63bcaa9a-7724-4771-b64f-b6a7d5f163b4","Type":"ContainerStarted","Data":"0def358bc92747b9229680cb0ff40a492c6b999aa24c1698d4583e43a5a492a1"} Mar 14 09:27:08 crc kubenswrapper[4886]: I0314 09:27:08.104815 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 09:27:09 crc kubenswrapper[4886]: I0314 09:27:09.114504 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvjk8" event={"ID":"63bcaa9a-7724-4771-b64f-b6a7d5f163b4","Type":"ContainerStarted","Data":"b558089a4bbf20907c2a7a3dd043553a9ed9aba13ce1ad11860aca14ac1adf52"} Mar 14 09:27:10 crc kubenswrapper[4886]: I0314 09:27:10.126289 4886 generic.go:334] "Generic (PLEG): container finished" podID="63bcaa9a-7724-4771-b64f-b6a7d5f163b4" containerID="b558089a4bbf20907c2a7a3dd043553a9ed9aba13ce1ad11860aca14ac1adf52" exitCode=0 Mar 14 09:27:10 crc kubenswrapper[4886]: I0314 09:27:10.126385 4886 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-pvjk8" event={"ID":"63bcaa9a-7724-4771-b64f-b6a7d5f163b4","Type":"ContainerDied","Data":"b558089a4bbf20907c2a7a3dd043553a9ed9aba13ce1ad11860aca14ac1adf52"} Mar 14 09:27:11 crc kubenswrapper[4886]: I0314 09:27:11.187800 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvjk8" event={"ID":"63bcaa9a-7724-4771-b64f-b6a7d5f163b4","Type":"ContainerStarted","Data":"c0e696b0c73da2a15b995f6009ad31e78d9a0c840375ced27c46fba442452881"} Mar 14 09:27:11 crc kubenswrapper[4886]: I0314 09:27:11.264202 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pvjk8" podStartSLOduration=2.655265002 podStartE2EDuration="5.264178853s" podCreationTimestamp="2026-03-14 09:27:06 +0000 UTC" firstStartedPulling="2026-03-14 09:27:08.104624421 +0000 UTC m=+3563.353076058" lastFinishedPulling="2026-03-14 09:27:10.713538272 +0000 UTC m=+3565.961989909" observedRunningTime="2026-03-14 09:27:11.239637933 +0000 UTC m=+3566.488089570" watchObservedRunningTime="2026-03-14 09:27:11.264178853 +0000 UTC m=+3566.512630490" Mar 14 09:27:17 crc kubenswrapper[4886]: I0314 09:27:17.066941 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pvjk8" Mar 14 09:27:17 crc kubenswrapper[4886]: I0314 09:27:17.067536 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pvjk8" Mar 14 09:27:17 crc kubenswrapper[4886]: I0314 09:27:17.116234 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pvjk8" Mar 14 09:27:17 crc kubenswrapper[4886]: I0314 09:27:17.301924 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pvjk8" Mar 14 09:27:18 crc kubenswrapper[4886]: I0314 09:27:18.526634 4886 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvjk8"] Mar 14 09:27:19 crc kubenswrapper[4886]: I0314 09:27:19.418785 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pvjk8" podUID="63bcaa9a-7724-4771-b64f-b6a7d5f163b4" containerName="registry-server" containerID="cri-o://c0e696b0c73da2a15b995f6009ad31e78d9a0c840375ced27c46fba442452881" gracePeriod=2 Mar 14 09:27:19 crc kubenswrapper[4886]: E0314 09:27:19.734359 4886 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63bcaa9a_7724_4771_b64f_b6a7d5f163b4.slice/crio-conmon-c0e696b0c73da2a15b995f6009ad31e78d9a0c840375ced27c46fba442452881.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63bcaa9a_7724_4771_b64f_b6a7d5f163b4.slice/crio-c0e696b0c73da2a15b995f6009ad31e78d9a0c840375ced27c46fba442452881.scope\": RecentStats: unable to find data in memory cache]" Mar 14 09:27:19 crc kubenswrapper[4886]: I0314 09:27:19.976411 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvjk8" Mar 14 09:27:20 crc kubenswrapper[4886]: I0314 09:27:20.103047 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63bcaa9a-7724-4771-b64f-b6a7d5f163b4-catalog-content\") pod \"63bcaa9a-7724-4771-b64f-b6a7d5f163b4\" (UID: \"63bcaa9a-7724-4771-b64f-b6a7d5f163b4\") " Mar 14 09:27:20 crc kubenswrapper[4886]: I0314 09:27:20.103094 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63bcaa9a-7724-4771-b64f-b6a7d5f163b4-utilities\") pod \"63bcaa9a-7724-4771-b64f-b6a7d5f163b4\" (UID: \"63bcaa9a-7724-4771-b64f-b6a7d5f163b4\") " Mar 14 09:27:20 crc kubenswrapper[4886]: I0314 09:27:20.103230 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bc27r\" (UniqueName: \"kubernetes.io/projected/63bcaa9a-7724-4771-b64f-b6a7d5f163b4-kube-api-access-bc27r\") pod \"63bcaa9a-7724-4771-b64f-b6a7d5f163b4\" (UID: \"63bcaa9a-7724-4771-b64f-b6a7d5f163b4\") " Mar 14 09:27:20 crc kubenswrapper[4886]: I0314 09:27:20.104075 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63bcaa9a-7724-4771-b64f-b6a7d5f163b4-utilities" (OuterVolumeSpecName: "utilities") pod "63bcaa9a-7724-4771-b64f-b6a7d5f163b4" (UID: "63bcaa9a-7724-4771-b64f-b6a7d5f163b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:27:20 crc kubenswrapper[4886]: I0314 09:27:20.109456 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63bcaa9a-7724-4771-b64f-b6a7d5f163b4-kube-api-access-bc27r" (OuterVolumeSpecName: "kube-api-access-bc27r") pod "63bcaa9a-7724-4771-b64f-b6a7d5f163b4" (UID: "63bcaa9a-7724-4771-b64f-b6a7d5f163b4"). InnerVolumeSpecName "kube-api-access-bc27r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:27:20 crc kubenswrapper[4886]: I0314 09:27:20.136330 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63bcaa9a-7724-4771-b64f-b6a7d5f163b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63bcaa9a-7724-4771-b64f-b6a7d5f163b4" (UID: "63bcaa9a-7724-4771-b64f-b6a7d5f163b4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:27:20 crc kubenswrapper[4886]: I0314 09:27:20.205584 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bc27r\" (UniqueName: \"kubernetes.io/projected/63bcaa9a-7724-4771-b64f-b6a7d5f163b4-kube-api-access-bc27r\") on node \"crc\" DevicePath \"\"" Mar 14 09:27:20 crc kubenswrapper[4886]: I0314 09:27:20.205631 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63bcaa9a-7724-4771-b64f-b6a7d5f163b4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:27:20 crc kubenswrapper[4886]: I0314 09:27:20.205652 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63bcaa9a-7724-4771-b64f-b6a7d5f163b4-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:27:20 crc kubenswrapper[4886]: I0314 09:27:20.430544 4886 generic.go:334] "Generic (PLEG): container finished" podID="63bcaa9a-7724-4771-b64f-b6a7d5f163b4" containerID="c0e696b0c73da2a15b995f6009ad31e78d9a0c840375ced27c46fba442452881" exitCode=0 Mar 14 09:27:20 crc kubenswrapper[4886]: I0314 09:27:20.430587 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pvjk8" event={"ID":"63bcaa9a-7724-4771-b64f-b6a7d5f163b4","Type":"ContainerDied","Data":"c0e696b0c73da2a15b995f6009ad31e78d9a0c840375ced27c46fba442452881"} Mar 14 09:27:20 crc kubenswrapper[4886]: I0314 09:27:20.430613 4886 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-pvjk8" event={"ID":"63bcaa9a-7724-4771-b64f-b6a7d5f163b4","Type":"ContainerDied","Data":"0def358bc92747b9229680cb0ff40a492c6b999aa24c1698d4583e43a5a492a1"} Mar 14 09:27:20 crc kubenswrapper[4886]: I0314 09:27:20.430659 4886 scope.go:117] "RemoveContainer" containerID="c0e696b0c73da2a15b995f6009ad31e78d9a0c840375ced27c46fba442452881" Mar 14 09:27:20 crc kubenswrapper[4886]: I0314 09:27:20.430682 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pvjk8" Mar 14 09:27:20 crc kubenswrapper[4886]: I0314 09:27:20.448199 4886 scope.go:117] "RemoveContainer" containerID="b558089a4bbf20907c2a7a3dd043553a9ed9aba13ce1ad11860aca14ac1adf52" Mar 14 09:27:20 crc kubenswrapper[4886]: I0314 09:27:20.473670 4886 scope.go:117] "RemoveContainer" containerID="d3e040d876dfadc1e377234b1eabe895c32aeff566c00da18ccc2edf08d335e8" Mar 14 09:27:20 crc kubenswrapper[4886]: I0314 09:27:20.473728 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvjk8"] Mar 14 09:27:20 crc kubenswrapper[4886]: I0314 09:27:20.488232 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pvjk8"] Mar 14 09:27:20 crc kubenswrapper[4886]: I0314 09:27:20.522769 4886 scope.go:117] "RemoveContainer" containerID="c0e696b0c73da2a15b995f6009ad31e78d9a0c840375ced27c46fba442452881" Mar 14 09:27:20 crc kubenswrapper[4886]: E0314 09:27:20.523345 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0e696b0c73da2a15b995f6009ad31e78d9a0c840375ced27c46fba442452881\": container with ID starting with c0e696b0c73da2a15b995f6009ad31e78d9a0c840375ced27c46fba442452881 not found: ID does not exist" containerID="c0e696b0c73da2a15b995f6009ad31e78d9a0c840375ced27c46fba442452881" Mar 14 09:27:20 crc kubenswrapper[4886]: I0314 09:27:20.523384 4886 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0e696b0c73da2a15b995f6009ad31e78d9a0c840375ced27c46fba442452881"} err="failed to get container status \"c0e696b0c73da2a15b995f6009ad31e78d9a0c840375ced27c46fba442452881\": rpc error: code = NotFound desc = could not find container \"c0e696b0c73da2a15b995f6009ad31e78d9a0c840375ced27c46fba442452881\": container with ID starting with c0e696b0c73da2a15b995f6009ad31e78d9a0c840375ced27c46fba442452881 not found: ID does not exist" Mar 14 09:27:20 crc kubenswrapper[4886]: I0314 09:27:20.523409 4886 scope.go:117] "RemoveContainer" containerID="b558089a4bbf20907c2a7a3dd043553a9ed9aba13ce1ad11860aca14ac1adf52" Mar 14 09:27:20 crc kubenswrapper[4886]: E0314 09:27:20.524011 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b558089a4bbf20907c2a7a3dd043553a9ed9aba13ce1ad11860aca14ac1adf52\": container with ID starting with b558089a4bbf20907c2a7a3dd043553a9ed9aba13ce1ad11860aca14ac1adf52 not found: ID does not exist" containerID="b558089a4bbf20907c2a7a3dd043553a9ed9aba13ce1ad11860aca14ac1adf52" Mar 14 09:27:20 crc kubenswrapper[4886]: I0314 09:27:20.524045 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b558089a4bbf20907c2a7a3dd043553a9ed9aba13ce1ad11860aca14ac1adf52"} err="failed to get container status \"b558089a4bbf20907c2a7a3dd043553a9ed9aba13ce1ad11860aca14ac1adf52\": rpc error: code = NotFound desc = could not find container \"b558089a4bbf20907c2a7a3dd043553a9ed9aba13ce1ad11860aca14ac1adf52\": container with ID starting with b558089a4bbf20907c2a7a3dd043553a9ed9aba13ce1ad11860aca14ac1adf52 not found: ID does not exist" Mar 14 09:27:20 crc kubenswrapper[4886]: I0314 09:27:20.524062 4886 scope.go:117] "RemoveContainer" containerID="d3e040d876dfadc1e377234b1eabe895c32aeff566c00da18ccc2edf08d335e8" Mar 14 09:27:20 crc kubenswrapper[4886]: E0314 
09:27:20.524369 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3e040d876dfadc1e377234b1eabe895c32aeff566c00da18ccc2edf08d335e8\": container with ID starting with d3e040d876dfadc1e377234b1eabe895c32aeff566c00da18ccc2edf08d335e8 not found: ID does not exist" containerID="d3e040d876dfadc1e377234b1eabe895c32aeff566c00da18ccc2edf08d335e8" Mar 14 09:27:20 crc kubenswrapper[4886]: I0314 09:27:20.524410 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3e040d876dfadc1e377234b1eabe895c32aeff566c00da18ccc2edf08d335e8"} err="failed to get container status \"d3e040d876dfadc1e377234b1eabe895c32aeff566c00da18ccc2edf08d335e8\": rpc error: code = NotFound desc = could not find container \"d3e040d876dfadc1e377234b1eabe895c32aeff566c00da18ccc2edf08d335e8\": container with ID starting with d3e040d876dfadc1e377234b1eabe895c32aeff566c00da18ccc2edf08d335e8 not found: ID does not exist" Mar 14 09:27:21 crc kubenswrapper[4886]: I0314 09:27:21.466093 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63bcaa9a-7724-4771-b64f-b6a7d5f163b4" path="/var/lib/kubelet/pods/63bcaa9a-7724-4771-b64f-b6a7d5f163b4/volumes" Mar 14 09:27:26 crc kubenswrapper[4886]: I0314 09:27:26.066380 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:27:26 crc kubenswrapper[4886]: I0314 09:27:26.066935 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 14 09:27:56 crc kubenswrapper[4886]: I0314 09:27:56.066404 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:27:56 crc kubenswrapper[4886]: I0314 09:27:56.066941 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:28:00 crc kubenswrapper[4886]: I0314 09:28:00.174156 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558008-7527v"] Mar 14 09:28:00 crc kubenswrapper[4886]: E0314 09:28:00.177147 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63bcaa9a-7724-4771-b64f-b6a7d5f163b4" containerName="registry-server" Mar 14 09:28:00 crc kubenswrapper[4886]: I0314 09:28:00.177183 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="63bcaa9a-7724-4771-b64f-b6a7d5f163b4" containerName="registry-server" Mar 14 09:28:00 crc kubenswrapper[4886]: E0314 09:28:00.177213 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63bcaa9a-7724-4771-b64f-b6a7d5f163b4" containerName="extract-content" Mar 14 09:28:00 crc kubenswrapper[4886]: I0314 09:28:00.177221 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="63bcaa9a-7724-4771-b64f-b6a7d5f163b4" containerName="extract-content" Mar 14 09:28:00 crc kubenswrapper[4886]: E0314 09:28:00.177266 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63bcaa9a-7724-4771-b64f-b6a7d5f163b4" containerName="extract-utilities" Mar 14 09:28:00 crc kubenswrapper[4886]: I0314 09:28:00.177275 4886 
state_mem.go:107] "Deleted CPUSet assignment" podUID="63bcaa9a-7724-4771-b64f-b6a7d5f163b4" containerName="extract-utilities" Mar 14 09:28:00 crc kubenswrapper[4886]: I0314 09:28:00.177636 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="63bcaa9a-7724-4771-b64f-b6a7d5f163b4" containerName="registry-server" Mar 14 09:28:00 crc kubenswrapper[4886]: I0314 09:28:00.179137 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558008-7527v" Mar 14 09:28:00 crc kubenswrapper[4886]: I0314 09:28:00.183325 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:28:00 crc kubenswrapper[4886]: I0314 09:28:00.183863 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 09:28:00 crc kubenswrapper[4886]: I0314 09:28:00.184511 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:28:00 crc kubenswrapper[4886]: I0314 09:28:00.194621 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558008-7527v"] Mar 14 09:28:00 crc kubenswrapper[4886]: I0314 09:28:00.204206 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-775pt\" (UniqueName: \"kubernetes.io/projected/4b385e7a-cf4d-46de-bbbe-6e75d9d2c486-kube-api-access-775pt\") pod \"auto-csr-approver-29558008-7527v\" (UID: \"4b385e7a-cf4d-46de-bbbe-6e75d9d2c486\") " pod="openshift-infra/auto-csr-approver-29558008-7527v" Mar 14 09:28:00 crc kubenswrapper[4886]: I0314 09:28:00.306948 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-775pt\" (UniqueName: \"kubernetes.io/projected/4b385e7a-cf4d-46de-bbbe-6e75d9d2c486-kube-api-access-775pt\") pod \"auto-csr-approver-29558008-7527v\" (UID: 
\"4b385e7a-cf4d-46de-bbbe-6e75d9d2c486\") " pod="openshift-infra/auto-csr-approver-29558008-7527v" Mar 14 09:28:00 crc kubenswrapper[4886]: I0314 09:28:00.334922 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-775pt\" (UniqueName: \"kubernetes.io/projected/4b385e7a-cf4d-46de-bbbe-6e75d9d2c486-kube-api-access-775pt\") pod \"auto-csr-approver-29558008-7527v\" (UID: \"4b385e7a-cf4d-46de-bbbe-6e75d9d2c486\") " pod="openshift-infra/auto-csr-approver-29558008-7527v" Mar 14 09:28:00 crc kubenswrapper[4886]: I0314 09:28:00.518702 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558008-7527v" Mar 14 09:28:00 crc kubenswrapper[4886]: I0314 09:28:00.991968 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558008-7527v"] Mar 14 09:28:01 crc kubenswrapper[4886]: I0314 09:28:01.587967 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558008-7527v" event={"ID":"4b385e7a-cf4d-46de-bbbe-6e75d9d2c486","Type":"ContainerStarted","Data":"dac50227b716a0d0910d9d4fbbaec47ca0a799339b0f0875fbea41192c55e86d"} Mar 14 09:28:02 crc kubenswrapper[4886]: I0314 09:28:02.600497 4886 generic.go:334] "Generic (PLEG): container finished" podID="4b385e7a-cf4d-46de-bbbe-6e75d9d2c486" containerID="021cecdb51dc5ca43366b33e1d285bbe2dad3f6135197d3662197dbc315daa4b" exitCode=0 Mar 14 09:28:02 crc kubenswrapper[4886]: I0314 09:28:02.600553 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558008-7527v" event={"ID":"4b385e7a-cf4d-46de-bbbe-6e75d9d2c486","Type":"ContainerDied","Data":"021cecdb51dc5ca43366b33e1d285bbe2dad3f6135197d3662197dbc315daa4b"} Mar 14 09:28:03 crc kubenswrapper[4886]: I0314 09:28:03.986865 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558008-7527v" Mar 14 09:28:04 crc kubenswrapper[4886]: I0314 09:28:04.078803 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-775pt\" (UniqueName: \"kubernetes.io/projected/4b385e7a-cf4d-46de-bbbe-6e75d9d2c486-kube-api-access-775pt\") pod \"4b385e7a-cf4d-46de-bbbe-6e75d9d2c486\" (UID: \"4b385e7a-cf4d-46de-bbbe-6e75d9d2c486\") " Mar 14 09:28:04 crc kubenswrapper[4886]: I0314 09:28:04.086604 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b385e7a-cf4d-46de-bbbe-6e75d9d2c486-kube-api-access-775pt" (OuterVolumeSpecName: "kube-api-access-775pt") pod "4b385e7a-cf4d-46de-bbbe-6e75d9d2c486" (UID: "4b385e7a-cf4d-46de-bbbe-6e75d9d2c486"). InnerVolumeSpecName "kube-api-access-775pt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:28:04 crc kubenswrapper[4886]: I0314 09:28:04.186127 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-775pt\" (UniqueName: \"kubernetes.io/projected/4b385e7a-cf4d-46de-bbbe-6e75d9d2c486-kube-api-access-775pt\") on node \"crc\" DevicePath \"\"" Mar 14 09:28:04 crc kubenswrapper[4886]: I0314 09:28:04.620920 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558008-7527v" event={"ID":"4b385e7a-cf4d-46de-bbbe-6e75d9d2c486","Type":"ContainerDied","Data":"dac50227b716a0d0910d9d4fbbaec47ca0a799339b0f0875fbea41192c55e86d"} Mar 14 09:28:04 crc kubenswrapper[4886]: I0314 09:28:04.620969 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dac50227b716a0d0910d9d4fbbaec47ca0a799339b0f0875fbea41192c55e86d" Mar 14 09:28:04 crc kubenswrapper[4886]: I0314 09:28:04.620979 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558008-7527v" Mar 14 09:28:05 crc kubenswrapper[4886]: I0314 09:28:05.058452 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558002-gk6tg"] Mar 14 09:28:05 crc kubenswrapper[4886]: I0314 09:28:05.071523 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558002-gk6tg"] Mar 14 09:28:05 crc kubenswrapper[4886]: I0314 09:28:05.433294 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29680d59-2279-4f28-aba6-b3e41fd622e1" path="/var/lib/kubelet/pods/29680d59-2279-4f28-aba6-b3e41fd622e1/volumes" Mar 14 09:28:26 crc kubenswrapper[4886]: I0314 09:28:26.066542 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:28:26 crc kubenswrapper[4886]: I0314 09:28:26.067341 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:28:26 crc kubenswrapper[4886]: I0314 09:28:26.067407 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 09:28:26 crc kubenswrapper[4886]: I0314 09:28:26.068472 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee"} pod="openshift-machine-config-operator/machine-config-daemon-ddctv" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 09:28:26 crc kubenswrapper[4886]: I0314 09:28:26.068608 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" containerID="cri-o://8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee" gracePeriod=600 Mar 14 09:28:26 crc kubenswrapper[4886]: E0314 09:28:26.196240 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:28:26 crc kubenswrapper[4886]: I0314 09:28:26.843137 4886 generic.go:334] "Generic (PLEG): container finished" podID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerID="8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee" exitCode=0 Mar 14 09:28:26 crc kubenswrapper[4886]: I0314 09:28:26.843268 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerDied","Data":"8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee"} Mar 14 09:28:26 crc kubenswrapper[4886]: I0314 09:28:26.843498 4886 scope.go:117] "RemoveContainer" containerID="cd6387aefbce165fc3b494f0eeffc43316107d6f7697eadc4f4e5177687d48f9" Mar 14 09:28:26 crc kubenswrapper[4886]: I0314 09:28:26.844608 4886 scope.go:117] "RemoveContainer" containerID="8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee" Mar 14 09:28:26 crc kubenswrapper[4886]: E0314 09:28:26.845107 4886 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:28:36 crc kubenswrapper[4886]: I0314 09:28:36.320101 4886 scope.go:117] "RemoveContainer" containerID="818e2f6fa2781d837b946858def1009e991e3b3bef64ccec5ade2343bbf6f83e" Mar 14 09:28:38 crc kubenswrapper[4886]: I0314 09:28:38.420639 4886 scope.go:117] "RemoveContainer" containerID="8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee" Mar 14 09:28:38 crc kubenswrapper[4886]: E0314 09:28:38.421249 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:28:53 crc kubenswrapper[4886]: I0314 09:28:53.421716 4886 scope.go:117] "RemoveContainer" containerID="8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee" Mar 14 09:28:53 crc kubenswrapper[4886]: E0314 09:28:53.422554 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:29:06 crc kubenswrapper[4886]: I0314 
09:29:06.421751 4886 scope.go:117] "RemoveContainer" containerID="8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee" Mar 14 09:29:06 crc kubenswrapper[4886]: E0314 09:29:06.423178 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:29:20 crc kubenswrapper[4886]: I0314 09:29:20.421652 4886 scope.go:117] "RemoveContainer" containerID="8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee" Mar 14 09:29:20 crc kubenswrapper[4886]: E0314 09:29:20.422530 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:29:34 crc kubenswrapper[4886]: I0314 09:29:34.421736 4886 scope.go:117] "RemoveContainer" containerID="8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee" Mar 14 09:29:34 crc kubenswrapper[4886]: E0314 09:29:34.424422 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:29:46 crc 
kubenswrapper[4886]: I0314 09:29:46.421646 4886 scope.go:117] "RemoveContainer" containerID="8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee" Mar 14 09:29:46 crc kubenswrapper[4886]: E0314 09:29:46.425099 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:29:59 crc kubenswrapper[4886]: I0314 09:29:59.420948 4886 scope.go:117] "RemoveContainer" containerID="8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee" Mar 14 09:29:59 crc kubenswrapper[4886]: E0314 09:29:59.421567 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:30:00 crc kubenswrapper[4886]: I0314 09:30:00.148664 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558010-8v8b9"] Mar 14 09:30:00 crc kubenswrapper[4886]: E0314 09:30:00.149496 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b385e7a-cf4d-46de-bbbe-6e75d9d2c486" containerName="oc" Mar 14 09:30:00 crc kubenswrapper[4886]: I0314 09:30:00.149577 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b385e7a-cf4d-46de-bbbe-6e75d9d2c486" containerName="oc" Mar 14 09:30:00 crc kubenswrapper[4886]: I0314 09:30:00.149884 4886 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="4b385e7a-cf4d-46de-bbbe-6e75d9d2c486" containerName="oc" Mar 14 09:30:00 crc kubenswrapper[4886]: I0314 09:30:00.150813 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-8v8b9" Mar 14 09:30:00 crc kubenswrapper[4886]: I0314 09:30:00.153192 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 09:30:00 crc kubenswrapper[4886]: I0314 09:30:00.153200 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 09:30:00 crc kubenswrapper[4886]: I0314 09:30:00.157863 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558010-jvk9f"] Mar 14 09:30:00 crc kubenswrapper[4886]: I0314 09:30:00.159427 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558010-jvk9f" Mar 14 09:30:00 crc kubenswrapper[4886]: I0314 09:30:00.161503 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:30:00 crc kubenswrapper[4886]: I0314 09:30:00.163053 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:30:00 crc kubenswrapper[4886]: I0314 09:30:00.163230 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 09:30:00 crc kubenswrapper[4886]: I0314 09:30:00.166462 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558010-jvk9f"] Mar 14 09:30:00 crc kubenswrapper[4886]: I0314 09:30:00.175239 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558010-8v8b9"] Mar 14 09:30:00 crc 
kubenswrapper[4886]: I0314 09:30:00.265251 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eeeacffc-8382-4b9a-a039-9e046c648179-secret-volume\") pod \"collect-profiles-29558010-8v8b9\" (UID: \"eeeacffc-8382-4b9a-a039-9e046c648179\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-8v8b9" Mar 14 09:30:00 crc kubenswrapper[4886]: I0314 09:30:00.265396 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eeeacffc-8382-4b9a-a039-9e046c648179-config-volume\") pod \"collect-profiles-29558010-8v8b9\" (UID: \"eeeacffc-8382-4b9a-a039-9e046c648179\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-8v8b9" Mar 14 09:30:00 crc kubenswrapper[4886]: I0314 09:30:00.265438 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdz4s\" (UniqueName: \"kubernetes.io/projected/eeeacffc-8382-4b9a-a039-9e046c648179-kube-api-access-tdz4s\") pod \"collect-profiles-29558010-8v8b9\" (UID: \"eeeacffc-8382-4b9a-a039-9e046c648179\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-8v8b9" Mar 14 09:30:00 crc kubenswrapper[4886]: I0314 09:30:00.265547 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s89wz\" (UniqueName: \"kubernetes.io/projected/8005d191-77ad-4c55-9a44-adbe02e3b176-kube-api-access-s89wz\") pod \"auto-csr-approver-29558010-jvk9f\" (UID: \"8005d191-77ad-4c55-9a44-adbe02e3b176\") " pod="openshift-infra/auto-csr-approver-29558010-jvk9f" Mar 14 09:30:00 crc kubenswrapper[4886]: I0314 09:30:00.367416 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s89wz\" (UniqueName: 
\"kubernetes.io/projected/8005d191-77ad-4c55-9a44-adbe02e3b176-kube-api-access-s89wz\") pod \"auto-csr-approver-29558010-jvk9f\" (UID: \"8005d191-77ad-4c55-9a44-adbe02e3b176\") " pod="openshift-infra/auto-csr-approver-29558010-jvk9f" Mar 14 09:30:00 crc kubenswrapper[4886]: I0314 09:30:00.367481 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eeeacffc-8382-4b9a-a039-9e046c648179-secret-volume\") pod \"collect-profiles-29558010-8v8b9\" (UID: \"eeeacffc-8382-4b9a-a039-9e046c648179\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-8v8b9" Mar 14 09:30:00 crc kubenswrapper[4886]: I0314 09:30:00.367576 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eeeacffc-8382-4b9a-a039-9e046c648179-config-volume\") pod \"collect-profiles-29558010-8v8b9\" (UID: \"eeeacffc-8382-4b9a-a039-9e046c648179\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-8v8b9" Mar 14 09:30:00 crc kubenswrapper[4886]: I0314 09:30:00.367615 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdz4s\" (UniqueName: \"kubernetes.io/projected/eeeacffc-8382-4b9a-a039-9e046c648179-kube-api-access-tdz4s\") pod \"collect-profiles-29558010-8v8b9\" (UID: \"eeeacffc-8382-4b9a-a039-9e046c648179\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-8v8b9" Mar 14 09:30:00 crc kubenswrapper[4886]: I0314 09:30:00.368876 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eeeacffc-8382-4b9a-a039-9e046c648179-config-volume\") pod \"collect-profiles-29558010-8v8b9\" (UID: \"eeeacffc-8382-4b9a-a039-9e046c648179\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-8v8b9" Mar 14 09:30:00 crc kubenswrapper[4886]: I0314 09:30:00.374007 4886 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eeeacffc-8382-4b9a-a039-9e046c648179-secret-volume\") pod \"collect-profiles-29558010-8v8b9\" (UID: \"eeeacffc-8382-4b9a-a039-9e046c648179\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-8v8b9" Mar 14 09:30:00 crc kubenswrapper[4886]: I0314 09:30:00.383847 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdz4s\" (UniqueName: \"kubernetes.io/projected/eeeacffc-8382-4b9a-a039-9e046c648179-kube-api-access-tdz4s\") pod \"collect-profiles-29558010-8v8b9\" (UID: \"eeeacffc-8382-4b9a-a039-9e046c648179\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-8v8b9" Mar 14 09:30:00 crc kubenswrapper[4886]: I0314 09:30:00.384147 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s89wz\" (UniqueName: \"kubernetes.io/projected/8005d191-77ad-4c55-9a44-adbe02e3b176-kube-api-access-s89wz\") pod \"auto-csr-approver-29558010-jvk9f\" (UID: \"8005d191-77ad-4c55-9a44-adbe02e3b176\") " pod="openshift-infra/auto-csr-approver-29558010-jvk9f" Mar 14 09:30:00 crc kubenswrapper[4886]: I0314 09:30:00.475190 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-8v8b9" Mar 14 09:30:00 crc kubenswrapper[4886]: I0314 09:30:00.492561 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558010-jvk9f" Mar 14 09:30:00 crc kubenswrapper[4886]: I0314 09:30:00.951048 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558010-8v8b9"] Mar 14 09:30:01 crc kubenswrapper[4886]: I0314 09:30:01.002567 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558010-jvk9f"] Mar 14 09:30:01 crc kubenswrapper[4886]: I0314 09:30:01.816932 4886 generic.go:334] "Generic (PLEG): container finished" podID="eeeacffc-8382-4b9a-a039-9e046c648179" containerID="b1cc94108878d35dce56c00801ef7d9f044f5f82c0ba590d867aa4dc780aaec5" exitCode=0 Mar 14 09:30:01 crc kubenswrapper[4886]: I0314 09:30:01.816973 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-8v8b9" event={"ID":"eeeacffc-8382-4b9a-a039-9e046c648179","Type":"ContainerDied","Data":"b1cc94108878d35dce56c00801ef7d9f044f5f82c0ba590d867aa4dc780aaec5"} Mar 14 09:30:01 crc kubenswrapper[4886]: I0314 09:30:01.817307 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-8v8b9" event={"ID":"eeeacffc-8382-4b9a-a039-9e046c648179","Type":"ContainerStarted","Data":"2efd3fea5e2f9267239e297a6721e09f531072c6c59ae70d7e9002a8741fcfe6"} Mar 14 09:30:01 crc kubenswrapper[4886]: I0314 09:30:01.818722 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558010-jvk9f" event={"ID":"8005d191-77ad-4c55-9a44-adbe02e3b176","Type":"ContainerStarted","Data":"612156f3eda2fc27ab87f67a3c8495c09b086dc831dff6c29d59d3b04eb5e61f"} Mar 14 09:30:02 crc kubenswrapper[4886]: I0314 09:30:02.830093 4886 generic.go:334] "Generic (PLEG): container finished" podID="8005d191-77ad-4c55-9a44-adbe02e3b176" containerID="6e6d82e2ee3e29d006029269d60cb9f62957d66749f913cfb5daa079a558e6ac" exitCode=0 Mar 14 09:30:02 crc 
kubenswrapper[4886]: I0314 09:30:02.830160 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558010-jvk9f" event={"ID":"8005d191-77ad-4c55-9a44-adbe02e3b176","Type":"ContainerDied","Data":"6e6d82e2ee3e29d006029269d60cb9f62957d66749f913cfb5daa079a558e6ac"} Mar 14 09:30:03 crc kubenswrapper[4886]: I0314 09:30:03.192047 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-8v8b9" Mar 14 09:30:03 crc kubenswrapper[4886]: I0314 09:30:03.332357 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eeeacffc-8382-4b9a-a039-9e046c648179-secret-volume\") pod \"eeeacffc-8382-4b9a-a039-9e046c648179\" (UID: \"eeeacffc-8382-4b9a-a039-9e046c648179\") " Mar 14 09:30:03 crc kubenswrapper[4886]: I0314 09:30:03.332475 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdz4s\" (UniqueName: \"kubernetes.io/projected/eeeacffc-8382-4b9a-a039-9e046c648179-kube-api-access-tdz4s\") pod \"eeeacffc-8382-4b9a-a039-9e046c648179\" (UID: \"eeeacffc-8382-4b9a-a039-9e046c648179\") " Mar 14 09:30:03 crc kubenswrapper[4886]: I0314 09:30:03.332535 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eeeacffc-8382-4b9a-a039-9e046c648179-config-volume\") pod \"eeeacffc-8382-4b9a-a039-9e046c648179\" (UID: \"eeeacffc-8382-4b9a-a039-9e046c648179\") " Mar 14 09:30:03 crc kubenswrapper[4886]: I0314 09:30:03.333336 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eeeacffc-8382-4b9a-a039-9e046c648179-config-volume" (OuterVolumeSpecName: "config-volume") pod "eeeacffc-8382-4b9a-a039-9e046c648179" (UID: "eeeacffc-8382-4b9a-a039-9e046c648179"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:30:03 crc kubenswrapper[4886]: I0314 09:30:03.338063 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeeacffc-8382-4b9a-a039-9e046c648179-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "eeeacffc-8382-4b9a-a039-9e046c648179" (UID: "eeeacffc-8382-4b9a-a039-9e046c648179"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:30:03 crc kubenswrapper[4886]: I0314 09:30:03.351515 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeeacffc-8382-4b9a-a039-9e046c648179-kube-api-access-tdz4s" (OuterVolumeSpecName: "kube-api-access-tdz4s") pod "eeeacffc-8382-4b9a-a039-9e046c648179" (UID: "eeeacffc-8382-4b9a-a039-9e046c648179"). InnerVolumeSpecName "kube-api-access-tdz4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:30:03 crc kubenswrapper[4886]: I0314 09:30:03.434837 4886 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eeeacffc-8382-4b9a-a039-9e046c648179-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:03 crc kubenswrapper[4886]: I0314 09:30:03.434870 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdz4s\" (UniqueName: \"kubernetes.io/projected/eeeacffc-8382-4b9a-a039-9e046c648179-kube-api-access-tdz4s\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:03 crc kubenswrapper[4886]: I0314 09:30:03.434881 4886 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eeeacffc-8382-4b9a-a039-9e046c648179-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:03 crc kubenswrapper[4886]: I0314 09:30:03.839287 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-8v8b9" 
event={"ID":"eeeacffc-8382-4b9a-a039-9e046c648179","Type":"ContainerDied","Data":"2efd3fea5e2f9267239e297a6721e09f531072c6c59ae70d7e9002a8741fcfe6"} Mar 14 09:30:03 crc kubenswrapper[4886]: I0314 09:30:03.839339 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2efd3fea5e2f9267239e297a6721e09f531072c6c59ae70d7e9002a8741fcfe6" Mar 14 09:30:03 crc kubenswrapper[4886]: I0314 09:30:03.839375 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558010-8v8b9" Mar 14 09:30:04 crc kubenswrapper[4886]: I0314 09:30:04.219505 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558010-jvk9f" Mar 14 09:30:04 crc kubenswrapper[4886]: I0314 09:30:04.291673 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557965-8h4j2"] Mar 14 09:30:04 crc kubenswrapper[4886]: I0314 09:30:04.299945 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557965-8h4j2"] Mar 14 09:30:04 crc kubenswrapper[4886]: I0314 09:30:04.355000 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s89wz\" (UniqueName: \"kubernetes.io/projected/8005d191-77ad-4c55-9a44-adbe02e3b176-kube-api-access-s89wz\") pod \"8005d191-77ad-4c55-9a44-adbe02e3b176\" (UID: \"8005d191-77ad-4c55-9a44-adbe02e3b176\") " Mar 14 09:30:04 crc kubenswrapper[4886]: I0314 09:30:04.367395 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8005d191-77ad-4c55-9a44-adbe02e3b176-kube-api-access-s89wz" (OuterVolumeSpecName: "kube-api-access-s89wz") pod "8005d191-77ad-4c55-9a44-adbe02e3b176" (UID: "8005d191-77ad-4c55-9a44-adbe02e3b176"). InnerVolumeSpecName "kube-api-access-s89wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:30:04 crc kubenswrapper[4886]: I0314 09:30:04.457747 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s89wz\" (UniqueName: \"kubernetes.io/projected/8005d191-77ad-4c55-9a44-adbe02e3b176-kube-api-access-s89wz\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:04 crc kubenswrapper[4886]: I0314 09:30:04.848771 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558010-jvk9f" event={"ID":"8005d191-77ad-4c55-9a44-adbe02e3b176","Type":"ContainerDied","Data":"612156f3eda2fc27ab87f67a3c8495c09b086dc831dff6c29d59d3b04eb5e61f"} Mar 14 09:30:04 crc kubenswrapper[4886]: I0314 09:30:04.848810 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="612156f3eda2fc27ab87f67a3c8495c09b086dc831dff6c29d59d3b04eb5e61f" Mar 14 09:30:04 crc kubenswrapper[4886]: I0314 09:30:04.848849 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558010-jvk9f" Mar 14 09:30:05 crc kubenswrapper[4886]: I0314 09:30:05.294505 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558004-4schh"] Mar 14 09:30:05 crc kubenswrapper[4886]: I0314 09:30:05.306833 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558004-4schh"] Mar 14 09:30:05 crc kubenswrapper[4886]: I0314 09:30:05.450510 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46b443cc-0bf1-4fa4-890d-60a52d4c14b3" path="/var/lib/kubelet/pods/46b443cc-0bf1-4fa4-890d-60a52d4c14b3/volumes" Mar 14 09:30:05 crc kubenswrapper[4886]: I0314 09:30:05.452890 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eecaac41-bd76-496a-bf1c-61c2ee287386" path="/var/lib/kubelet/pods/eecaac41-bd76-496a-bf1c-61c2ee287386/volumes" Mar 14 09:30:10 crc kubenswrapper[4886]: I0314 09:30:10.421052 4886 
scope.go:117] "RemoveContainer" containerID="8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee" Mar 14 09:30:10 crc kubenswrapper[4886]: E0314 09:30:10.421826 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:30:13 crc kubenswrapper[4886]: I0314 09:30:13.861072 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ffnqs"] Mar 14 09:30:13 crc kubenswrapper[4886]: E0314 09:30:13.861817 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeeacffc-8382-4b9a-a039-9e046c648179" containerName="collect-profiles" Mar 14 09:30:13 crc kubenswrapper[4886]: I0314 09:30:13.861834 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeeacffc-8382-4b9a-a039-9e046c648179" containerName="collect-profiles" Mar 14 09:30:13 crc kubenswrapper[4886]: E0314 09:30:13.861877 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8005d191-77ad-4c55-9a44-adbe02e3b176" containerName="oc" Mar 14 09:30:13 crc kubenswrapper[4886]: I0314 09:30:13.861885 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="8005d191-77ad-4c55-9a44-adbe02e3b176" containerName="oc" Mar 14 09:30:13 crc kubenswrapper[4886]: I0314 09:30:13.862159 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="8005d191-77ad-4c55-9a44-adbe02e3b176" containerName="oc" Mar 14 09:30:13 crc kubenswrapper[4886]: I0314 09:30:13.862176 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeeacffc-8382-4b9a-a039-9e046c648179" containerName="collect-profiles" Mar 14 09:30:13 crc kubenswrapper[4886]: I0314 09:30:13.863896 
4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ffnqs" Mar 14 09:30:13 crc kubenswrapper[4886]: I0314 09:30:13.885065 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ffnqs"] Mar 14 09:30:13 crc kubenswrapper[4886]: I0314 09:30:13.967886 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fbdc5fb-cf4e-4c92-b997-bd3570c96669-utilities\") pod \"redhat-operators-ffnqs\" (UID: \"8fbdc5fb-cf4e-4c92-b997-bd3570c96669\") " pod="openshift-marketplace/redhat-operators-ffnqs" Mar 14 09:30:13 crc kubenswrapper[4886]: I0314 09:30:13.967936 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdt7z\" (UniqueName: \"kubernetes.io/projected/8fbdc5fb-cf4e-4c92-b997-bd3570c96669-kube-api-access-fdt7z\") pod \"redhat-operators-ffnqs\" (UID: \"8fbdc5fb-cf4e-4c92-b997-bd3570c96669\") " pod="openshift-marketplace/redhat-operators-ffnqs" Mar 14 09:30:13 crc kubenswrapper[4886]: I0314 09:30:13.968104 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fbdc5fb-cf4e-4c92-b997-bd3570c96669-catalog-content\") pod \"redhat-operators-ffnqs\" (UID: \"8fbdc5fb-cf4e-4c92-b997-bd3570c96669\") " pod="openshift-marketplace/redhat-operators-ffnqs" Mar 14 09:30:14 crc kubenswrapper[4886]: I0314 09:30:14.069524 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fbdc5fb-cf4e-4c92-b997-bd3570c96669-catalog-content\") pod \"redhat-operators-ffnqs\" (UID: \"8fbdc5fb-cf4e-4c92-b997-bd3570c96669\") " pod="openshift-marketplace/redhat-operators-ffnqs" Mar 14 09:30:14 crc kubenswrapper[4886]: I0314 09:30:14.069696 4886 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fbdc5fb-cf4e-4c92-b997-bd3570c96669-utilities\") pod \"redhat-operators-ffnqs\" (UID: \"8fbdc5fb-cf4e-4c92-b997-bd3570c96669\") " pod="openshift-marketplace/redhat-operators-ffnqs" Mar 14 09:30:14 crc kubenswrapper[4886]: I0314 09:30:14.069716 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdt7z\" (UniqueName: \"kubernetes.io/projected/8fbdc5fb-cf4e-4c92-b997-bd3570c96669-kube-api-access-fdt7z\") pod \"redhat-operators-ffnqs\" (UID: \"8fbdc5fb-cf4e-4c92-b997-bd3570c96669\") " pod="openshift-marketplace/redhat-operators-ffnqs" Mar 14 09:30:14 crc kubenswrapper[4886]: I0314 09:30:14.070275 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fbdc5fb-cf4e-4c92-b997-bd3570c96669-utilities\") pod \"redhat-operators-ffnqs\" (UID: \"8fbdc5fb-cf4e-4c92-b997-bd3570c96669\") " pod="openshift-marketplace/redhat-operators-ffnqs" Mar 14 09:30:14 crc kubenswrapper[4886]: I0314 09:30:14.070682 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fbdc5fb-cf4e-4c92-b997-bd3570c96669-catalog-content\") pod \"redhat-operators-ffnqs\" (UID: \"8fbdc5fb-cf4e-4c92-b997-bd3570c96669\") " pod="openshift-marketplace/redhat-operators-ffnqs" Mar 14 09:30:14 crc kubenswrapper[4886]: I0314 09:30:14.092036 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdt7z\" (UniqueName: \"kubernetes.io/projected/8fbdc5fb-cf4e-4c92-b997-bd3570c96669-kube-api-access-fdt7z\") pod \"redhat-operators-ffnqs\" (UID: \"8fbdc5fb-cf4e-4c92-b997-bd3570c96669\") " pod="openshift-marketplace/redhat-operators-ffnqs" Mar 14 09:30:14 crc kubenswrapper[4886]: I0314 09:30:14.200009 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ffnqs" Mar 14 09:30:14 crc kubenswrapper[4886]: I0314 09:30:14.706857 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ffnqs"] Mar 14 09:30:14 crc kubenswrapper[4886]: I0314 09:30:14.942338 4886 generic.go:334] "Generic (PLEG): container finished" podID="8fbdc5fb-cf4e-4c92-b997-bd3570c96669" containerID="94d8ef058d12af56affe922f11058cd7bf7bb31ecf4d90a007d57e3409a2d90f" exitCode=0 Mar 14 09:30:14 crc kubenswrapper[4886]: I0314 09:30:14.942387 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ffnqs" event={"ID":"8fbdc5fb-cf4e-4c92-b997-bd3570c96669","Type":"ContainerDied","Data":"94d8ef058d12af56affe922f11058cd7bf7bb31ecf4d90a007d57e3409a2d90f"} Mar 14 09:30:14 crc kubenswrapper[4886]: I0314 09:30:14.942418 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ffnqs" event={"ID":"8fbdc5fb-cf4e-4c92-b997-bd3570c96669","Type":"ContainerStarted","Data":"2c199ef5a9cecb231d80edcca9c718bf7a9aa32bd06f32043d1f4d7edb5bbded"} Mar 14 09:30:15 crc kubenswrapper[4886]: I0314 09:30:15.955811 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ffnqs" event={"ID":"8fbdc5fb-cf4e-4c92-b997-bd3570c96669","Type":"ContainerStarted","Data":"7331d4180f464dd573ee0fd13061e111ec25466b51d678a5332a1f2f11412492"} Mar 14 09:30:21 crc kubenswrapper[4886]: I0314 09:30:21.036973 4886 generic.go:334] "Generic (PLEG): container finished" podID="8fbdc5fb-cf4e-4c92-b997-bd3570c96669" containerID="7331d4180f464dd573ee0fd13061e111ec25466b51d678a5332a1f2f11412492" exitCode=0 Mar 14 09:30:21 crc kubenswrapper[4886]: I0314 09:30:21.037066 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ffnqs" 
event={"ID":"8fbdc5fb-cf4e-4c92-b997-bd3570c96669","Type":"ContainerDied","Data":"7331d4180f464dd573ee0fd13061e111ec25466b51d678a5332a1f2f11412492"} Mar 14 09:30:22 crc kubenswrapper[4886]: I0314 09:30:22.049166 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ffnqs" event={"ID":"8fbdc5fb-cf4e-4c92-b997-bd3570c96669","Type":"ContainerStarted","Data":"5c8145ba093ad51346b23fdd8bf616707f8c5e933bae18b8c9f84914fb4716d3"} Mar 14 09:30:22 crc kubenswrapper[4886]: I0314 09:30:22.082593 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ffnqs" podStartSLOduration=2.525323451 podStartE2EDuration="9.082572317s" podCreationTimestamp="2026-03-14 09:30:13 +0000 UTC" firstStartedPulling="2026-03-14 09:30:14.943916432 +0000 UTC m=+3750.192368069" lastFinishedPulling="2026-03-14 09:30:21.501165298 +0000 UTC m=+3756.749616935" observedRunningTime="2026-03-14 09:30:22.067552938 +0000 UTC m=+3757.316004575" watchObservedRunningTime="2026-03-14 09:30:22.082572317 +0000 UTC m=+3757.331023954" Mar 14 09:30:23 crc kubenswrapper[4886]: I0314 09:30:23.421443 4886 scope.go:117] "RemoveContainer" containerID="8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee" Mar 14 09:30:23 crc kubenswrapper[4886]: E0314 09:30:23.422528 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:30:24 crc kubenswrapper[4886]: I0314 09:30:24.200606 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ffnqs" Mar 14 09:30:24 crc kubenswrapper[4886]: 
I0314 09:30:24.200804 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ffnqs" Mar 14 09:30:25 crc kubenswrapper[4886]: I0314 09:30:25.244896 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ffnqs" podUID="8fbdc5fb-cf4e-4c92-b997-bd3570c96669" containerName="registry-server" probeResult="failure" output=< Mar 14 09:30:25 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s Mar 14 09:30:25 crc kubenswrapper[4886]: > Mar 14 09:30:34 crc kubenswrapper[4886]: I0314 09:30:34.265897 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ffnqs" Mar 14 09:30:34 crc kubenswrapper[4886]: I0314 09:30:34.317323 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ffnqs" Mar 14 09:30:34 crc kubenswrapper[4886]: I0314 09:30:34.505173 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ffnqs"] Mar 14 09:30:36 crc kubenswrapper[4886]: I0314 09:30:36.236942 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ffnqs" podUID="8fbdc5fb-cf4e-4c92-b997-bd3570c96669" containerName="registry-server" containerID="cri-o://5c8145ba093ad51346b23fdd8bf616707f8c5e933bae18b8c9f84914fb4716d3" gracePeriod=2 Mar 14 09:30:36 crc kubenswrapper[4886]: I0314 09:30:36.424204 4886 scope.go:117] "RemoveContainer" containerID="88b3a786c5e5457174b81479bd5672a9f30aeb3474291ee632512ec99043a3f1" Mar 14 09:30:36 crc kubenswrapper[4886]: I0314 09:30:36.470351 4886 scope.go:117] "RemoveContainer" containerID="a41ddcda531dba69cb7ba0aba03a52c9f225efc67069e6e4218361bf4e474a2a" Mar 14 09:30:36 crc kubenswrapper[4886]: I0314 09:30:36.815763 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ffnqs" Mar 14 09:30:36 crc kubenswrapper[4886]: I0314 09:30:36.867577 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fbdc5fb-cf4e-4c92-b997-bd3570c96669-utilities\") pod \"8fbdc5fb-cf4e-4c92-b997-bd3570c96669\" (UID: \"8fbdc5fb-cf4e-4c92-b997-bd3570c96669\") " Mar 14 09:30:36 crc kubenswrapper[4886]: I0314 09:30:36.867792 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdt7z\" (UniqueName: \"kubernetes.io/projected/8fbdc5fb-cf4e-4c92-b997-bd3570c96669-kube-api-access-fdt7z\") pod \"8fbdc5fb-cf4e-4c92-b997-bd3570c96669\" (UID: \"8fbdc5fb-cf4e-4c92-b997-bd3570c96669\") " Mar 14 09:30:36 crc kubenswrapper[4886]: I0314 09:30:36.868036 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fbdc5fb-cf4e-4c92-b997-bd3570c96669-catalog-content\") pod \"8fbdc5fb-cf4e-4c92-b997-bd3570c96669\" (UID: \"8fbdc5fb-cf4e-4c92-b997-bd3570c96669\") " Mar 14 09:30:36 crc kubenswrapper[4886]: I0314 09:30:36.868564 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fbdc5fb-cf4e-4c92-b997-bd3570c96669-utilities" (OuterVolumeSpecName: "utilities") pod "8fbdc5fb-cf4e-4c92-b997-bd3570c96669" (UID: "8fbdc5fb-cf4e-4c92-b997-bd3570c96669"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:30:36 crc kubenswrapper[4886]: I0314 09:30:36.873013 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fbdc5fb-cf4e-4c92-b997-bd3570c96669-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:36 crc kubenswrapper[4886]: I0314 09:30:36.875047 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fbdc5fb-cf4e-4c92-b997-bd3570c96669-kube-api-access-fdt7z" (OuterVolumeSpecName: "kube-api-access-fdt7z") pod "8fbdc5fb-cf4e-4c92-b997-bd3570c96669" (UID: "8fbdc5fb-cf4e-4c92-b997-bd3570c96669"). InnerVolumeSpecName "kube-api-access-fdt7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:30:36 crc kubenswrapper[4886]: I0314 09:30:36.974931 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdt7z\" (UniqueName: \"kubernetes.io/projected/8fbdc5fb-cf4e-4c92-b997-bd3570c96669-kube-api-access-fdt7z\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:37 crc kubenswrapper[4886]: I0314 09:30:37.016142 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fbdc5fb-cf4e-4c92-b997-bd3570c96669-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8fbdc5fb-cf4e-4c92-b997-bd3570c96669" (UID: "8fbdc5fb-cf4e-4c92-b997-bd3570c96669"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:30:37 crc kubenswrapper[4886]: I0314 09:30:37.077379 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fbdc5fb-cf4e-4c92-b997-bd3570c96669-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:37 crc kubenswrapper[4886]: I0314 09:30:37.248560 4886 generic.go:334] "Generic (PLEG): container finished" podID="8fbdc5fb-cf4e-4c92-b997-bd3570c96669" containerID="5c8145ba093ad51346b23fdd8bf616707f8c5e933bae18b8c9f84914fb4716d3" exitCode=0 Mar 14 09:30:37 crc kubenswrapper[4886]: I0314 09:30:37.248609 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ffnqs" event={"ID":"8fbdc5fb-cf4e-4c92-b997-bd3570c96669","Type":"ContainerDied","Data":"5c8145ba093ad51346b23fdd8bf616707f8c5e933bae18b8c9f84914fb4716d3"} Mar 14 09:30:37 crc kubenswrapper[4886]: I0314 09:30:37.248637 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ffnqs" event={"ID":"8fbdc5fb-cf4e-4c92-b997-bd3570c96669","Type":"ContainerDied","Data":"2c199ef5a9cecb231d80edcca9c718bf7a9aa32bd06f32043d1f4d7edb5bbded"} Mar 14 09:30:37 crc kubenswrapper[4886]: I0314 09:30:37.248661 4886 scope.go:117] "RemoveContainer" containerID="5c8145ba093ad51346b23fdd8bf616707f8c5e933bae18b8c9f84914fb4716d3" Mar 14 09:30:37 crc kubenswrapper[4886]: I0314 09:30:37.249863 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ffnqs" Mar 14 09:30:37 crc kubenswrapper[4886]: I0314 09:30:37.272140 4886 scope.go:117] "RemoveContainer" containerID="7331d4180f464dd573ee0fd13061e111ec25466b51d678a5332a1f2f11412492" Mar 14 09:30:37 crc kubenswrapper[4886]: I0314 09:30:37.291072 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ffnqs"] Mar 14 09:30:37 crc kubenswrapper[4886]: I0314 09:30:37.301883 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ffnqs"] Mar 14 09:30:37 crc kubenswrapper[4886]: I0314 09:30:37.313903 4886 scope.go:117] "RemoveContainer" containerID="94d8ef058d12af56affe922f11058cd7bf7bb31ecf4d90a007d57e3409a2d90f" Mar 14 09:30:37 crc kubenswrapper[4886]: I0314 09:30:37.333882 4886 scope.go:117] "RemoveContainer" containerID="5c8145ba093ad51346b23fdd8bf616707f8c5e933bae18b8c9f84914fb4716d3" Mar 14 09:30:37 crc kubenswrapper[4886]: E0314 09:30:37.334366 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c8145ba093ad51346b23fdd8bf616707f8c5e933bae18b8c9f84914fb4716d3\": container with ID starting with 5c8145ba093ad51346b23fdd8bf616707f8c5e933bae18b8c9f84914fb4716d3 not found: ID does not exist" containerID="5c8145ba093ad51346b23fdd8bf616707f8c5e933bae18b8c9f84914fb4716d3" Mar 14 09:30:37 crc kubenswrapper[4886]: I0314 09:30:37.334402 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c8145ba093ad51346b23fdd8bf616707f8c5e933bae18b8c9f84914fb4716d3"} err="failed to get container status \"5c8145ba093ad51346b23fdd8bf616707f8c5e933bae18b8c9f84914fb4716d3\": rpc error: code = NotFound desc = could not find container \"5c8145ba093ad51346b23fdd8bf616707f8c5e933bae18b8c9f84914fb4716d3\": container with ID starting with 5c8145ba093ad51346b23fdd8bf616707f8c5e933bae18b8c9f84914fb4716d3 not found: ID does 
not exist" Mar 14 09:30:37 crc kubenswrapper[4886]: I0314 09:30:37.334429 4886 scope.go:117] "RemoveContainer" containerID="7331d4180f464dd573ee0fd13061e111ec25466b51d678a5332a1f2f11412492" Mar 14 09:30:37 crc kubenswrapper[4886]: E0314 09:30:37.334795 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7331d4180f464dd573ee0fd13061e111ec25466b51d678a5332a1f2f11412492\": container with ID starting with 7331d4180f464dd573ee0fd13061e111ec25466b51d678a5332a1f2f11412492 not found: ID does not exist" containerID="7331d4180f464dd573ee0fd13061e111ec25466b51d678a5332a1f2f11412492" Mar 14 09:30:37 crc kubenswrapper[4886]: I0314 09:30:37.334892 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7331d4180f464dd573ee0fd13061e111ec25466b51d678a5332a1f2f11412492"} err="failed to get container status \"7331d4180f464dd573ee0fd13061e111ec25466b51d678a5332a1f2f11412492\": rpc error: code = NotFound desc = could not find container \"7331d4180f464dd573ee0fd13061e111ec25466b51d678a5332a1f2f11412492\": container with ID starting with 7331d4180f464dd573ee0fd13061e111ec25466b51d678a5332a1f2f11412492 not found: ID does not exist" Mar 14 09:30:37 crc kubenswrapper[4886]: I0314 09:30:37.334968 4886 scope.go:117] "RemoveContainer" containerID="94d8ef058d12af56affe922f11058cd7bf7bb31ecf4d90a007d57e3409a2d90f" Mar 14 09:30:37 crc kubenswrapper[4886]: E0314 09:30:37.335436 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94d8ef058d12af56affe922f11058cd7bf7bb31ecf4d90a007d57e3409a2d90f\": container with ID starting with 94d8ef058d12af56affe922f11058cd7bf7bb31ecf4d90a007d57e3409a2d90f not found: ID does not exist" containerID="94d8ef058d12af56affe922f11058cd7bf7bb31ecf4d90a007d57e3409a2d90f" Mar 14 09:30:37 crc kubenswrapper[4886]: I0314 09:30:37.335536 4886 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94d8ef058d12af56affe922f11058cd7bf7bb31ecf4d90a007d57e3409a2d90f"} err="failed to get container status \"94d8ef058d12af56affe922f11058cd7bf7bb31ecf4d90a007d57e3409a2d90f\": rpc error: code = NotFound desc = could not find container \"94d8ef058d12af56affe922f11058cd7bf7bb31ecf4d90a007d57e3409a2d90f\": container with ID starting with 94d8ef058d12af56affe922f11058cd7bf7bb31ecf4d90a007d57e3409a2d90f not found: ID does not exist" Mar 14 09:30:37 crc kubenswrapper[4886]: I0314 09:30:37.444919 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fbdc5fb-cf4e-4c92-b997-bd3570c96669" path="/var/lib/kubelet/pods/8fbdc5fb-cf4e-4c92-b997-bd3570c96669/volumes" Mar 14 09:30:38 crc kubenswrapper[4886]: I0314 09:30:38.421894 4886 scope.go:117] "RemoveContainer" containerID="8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee" Mar 14 09:30:38 crc kubenswrapper[4886]: E0314 09:30:38.422370 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:30:45 crc kubenswrapper[4886]: I0314 09:30:45.597450 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v5xxf"] Mar 14 09:30:45 crc kubenswrapper[4886]: E0314 09:30:45.598423 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbdc5fb-cf4e-4c92-b997-bd3570c96669" containerName="extract-content" Mar 14 09:30:45 crc kubenswrapper[4886]: I0314 09:30:45.598436 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbdc5fb-cf4e-4c92-b997-bd3570c96669" containerName="extract-content" Mar 14 
09:30:45 crc kubenswrapper[4886]: E0314 09:30:45.598485 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbdc5fb-cf4e-4c92-b997-bd3570c96669" containerName="extract-utilities" Mar 14 09:30:45 crc kubenswrapper[4886]: I0314 09:30:45.598493 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbdc5fb-cf4e-4c92-b997-bd3570c96669" containerName="extract-utilities" Mar 14 09:30:45 crc kubenswrapper[4886]: E0314 09:30:45.598502 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbdc5fb-cf4e-4c92-b997-bd3570c96669" containerName="registry-server" Mar 14 09:30:45 crc kubenswrapper[4886]: I0314 09:30:45.598510 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbdc5fb-cf4e-4c92-b997-bd3570c96669" containerName="registry-server" Mar 14 09:30:45 crc kubenswrapper[4886]: I0314 09:30:45.598719 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fbdc5fb-cf4e-4c92-b997-bd3570c96669" containerName="registry-server" Mar 14 09:30:45 crc kubenswrapper[4886]: I0314 09:30:45.600312 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v5xxf" Mar 14 09:30:45 crc kubenswrapper[4886]: I0314 09:30:45.611951 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v5xxf"] Mar 14 09:30:45 crc kubenswrapper[4886]: I0314 09:30:45.663442 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fe68208-38ff-4a65-8759-95b7e79ee5df-utilities\") pod \"community-operators-v5xxf\" (UID: \"9fe68208-38ff-4a65-8759-95b7e79ee5df\") " pod="openshift-marketplace/community-operators-v5xxf" Mar 14 09:30:45 crc kubenswrapper[4886]: I0314 09:30:45.663555 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7ptx\" (UniqueName: \"kubernetes.io/projected/9fe68208-38ff-4a65-8759-95b7e79ee5df-kube-api-access-j7ptx\") pod \"community-operators-v5xxf\" (UID: \"9fe68208-38ff-4a65-8759-95b7e79ee5df\") " pod="openshift-marketplace/community-operators-v5xxf" Mar 14 09:30:45 crc kubenswrapper[4886]: I0314 09:30:45.663612 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fe68208-38ff-4a65-8759-95b7e79ee5df-catalog-content\") pod \"community-operators-v5xxf\" (UID: \"9fe68208-38ff-4a65-8759-95b7e79ee5df\") " pod="openshift-marketplace/community-operators-v5xxf" Mar 14 09:30:45 crc kubenswrapper[4886]: I0314 09:30:45.765798 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fe68208-38ff-4a65-8759-95b7e79ee5df-utilities\") pod \"community-operators-v5xxf\" (UID: \"9fe68208-38ff-4a65-8759-95b7e79ee5df\") " pod="openshift-marketplace/community-operators-v5xxf" Mar 14 09:30:45 crc kubenswrapper[4886]: I0314 09:30:45.765923 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-j7ptx\" (UniqueName: \"kubernetes.io/projected/9fe68208-38ff-4a65-8759-95b7e79ee5df-kube-api-access-j7ptx\") pod \"community-operators-v5xxf\" (UID: \"9fe68208-38ff-4a65-8759-95b7e79ee5df\") " pod="openshift-marketplace/community-operators-v5xxf" Mar 14 09:30:45 crc kubenswrapper[4886]: I0314 09:30:45.765994 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fe68208-38ff-4a65-8759-95b7e79ee5df-catalog-content\") pod \"community-operators-v5xxf\" (UID: \"9fe68208-38ff-4a65-8759-95b7e79ee5df\") " pod="openshift-marketplace/community-operators-v5xxf" Mar 14 09:30:45 crc kubenswrapper[4886]: I0314 09:30:45.766570 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fe68208-38ff-4a65-8759-95b7e79ee5df-catalog-content\") pod \"community-operators-v5xxf\" (UID: \"9fe68208-38ff-4a65-8759-95b7e79ee5df\") " pod="openshift-marketplace/community-operators-v5xxf" Mar 14 09:30:45 crc kubenswrapper[4886]: I0314 09:30:45.766872 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fe68208-38ff-4a65-8759-95b7e79ee5df-utilities\") pod \"community-operators-v5xxf\" (UID: \"9fe68208-38ff-4a65-8759-95b7e79ee5df\") " pod="openshift-marketplace/community-operators-v5xxf" Mar 14 09:30:45 crc kubenswrapper[4886]: I0314 09:30:45.785150 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7ptx\" (UniqueName: \"kubernetes.io/projected/9fe68208-38ff-4a65-8759-95b7e79ee5df-kube-api-access-j7ptx\") pod \"community-operators-v5xxf\" (UID: \"9fe68208-38ff-4a65-8759-95b7e79ee5df\") " pod="openshift-marketplace/community-operators-v5xxf" Mar 14 09:30:45 crc kubenswrapper[4886]: I0314 09:30:45.936169 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v5xxf" Mar 14 09:30:46 crc kubenswrapper[4886]: I0314 09:30:46.509491 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v5xxf"] Mar 14 09:30:46 crc kubenswrapper[4886]: W0314 09:30:46.514078 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fe68208_38ff_4a65_8759_95b7e79ee5df.slice/crio-c08dc1f4006dfc808dae9c2829b98bc16363bca3b6e155545e603ea0f12d9351 WatchSource:0}: Error finding container c08dc1f4006dfc808dae9c2829b98bc16363bca3b6e155545e603ea0f12d9351: Status 404 returned error can't find the container with id c08dc1f4006dfc808dae9c2829b98bc16363bca3b6e155545e603ea0f12d9351 Mar 14 09:30:47 crc kubenswrapper[4886]: I0314 09:30:47.333103 4886 generic.go:334] "Generic (PLEG): container finished" podID="9fe68208-38ff-4a65-8759-95b7e79ee5df" containerID="4cc510ee5ffa20d102d3be32137a6f5d85e515702792aff188e91c4d11978b90" exitCode=0 Mar 14 09:30:47 crc kubenswrapper[4886]: I0314 09:30:47.333393 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v5xxf" event={"ID":"9fe68208-38ff-4a65-8759-95b7e79ee5df","Type":"ContainerDied","Data":"4cc510ee5ffa20d102d3be32137a6f5d85e515702792aff188e91c4d11978b90"} Mar 14 09:30:47 crc kubenswrapper[4886]: I0314 09:30:47.333421 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v5xxf" event={"ID":"9fe68208-38ff-4a65-8759-95b7e79ee5df","Type":"ContainerStarted","Data":"c08dc1f4006dfc808dae9c2829b98bc16363bca3b6e155545e603ea0f12d9351"} Mar 14 09:30:48 crc kubenswrapper[4886]: I0314 09:30:48.343635 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v5xxf" 
event={"ID":"9fe68208-38ff-4a65-8759-95b7e79ee5df","Type":"ContainerStarted","Data":"3939f51634891023699e063823c1471ad5301ddf1d2f5e1f1fd59861c4cbda24"} Mar 14 09:30:50 crc kubenswrapper[4886]: I0314 09:30:50.381469 4886 generic.go:334] "Generic (PLEG): container finished" podID="9fe68208-38ff-4a65-8759-95b7e79ee5df" containerID="3939f51634891023699e063823c1471ad5301ddf1d2f5e1f1fd59861c4cbda24" exitCode=0 Mar 14 09:30:50 crc kubenswrapper[4886]: I0314 09:30:50.381571 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v5xxf" event={"ID":"9fe68208-38ff-4a65-8759-95b7e79ee5df","Type":"ContainerDied","Data":"3939f51634891023699e063823c1471ad5301ddf1d2f5e1f1fd59861c4cbda24"} Mar 14 09:30:50 crc kubenswrapper[4886]: I0314 09:30:50.421201 4886 scope.go:117] "RemoveContainer" containerID="8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee" Mar 14 09:30:50 crc kubenswrapper[4886]: E0314 09:30:50.421715 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:30:51 crc kubenswrapper[4886]: I0314 09:30:51.393392 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v5xxf" event={"ID":"9fe68208-38ff-4a65-8759-95b7e79ee5df","Type":"ContainerStarted","Data":"12d5c070a7f0561f74fd5853cd9ed4bc5d74f6c8aae992e4079a6f5a11247057"} Mar 14 09:30:55 crc kubenswrapper[4886]: I0314 09:30:55.936883 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v5xxf" Mar 14 09:30:55 crc kubenswrapper[4886]: I0314 09:30:55.937410 4886 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-v5xxf" Mar 14 09:30:55 crc kubenswrapper[4886]: I0314 09:30:55.988717 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v5xxf" Mar 14 09:30:56 crc kubenswrapper[4886]: I0314 09:30:56.008155 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v5xxf" podStartSLOduration=7.549050888 podStartE2EDuration="11.008137752s" podCreationTimestamp="2026-03-14 09:30:45 +0000 UTC" firstStartedPulling="2026-03-14 09:30:47.334791876 +0000 UTC m=+3782.583243513" lastFinishedPulling="2026-03-14 09:30:50.79387875 +0000 UTC m=+3786.042330377" observedRunningTime="2026-03-14 09:30:51.412980294 +0000 UTC m=+3786.661431931" watchObservedRunningTime="2026-03-14 09:30:56.008137752 +0000 UTC m=+3791.256589389" Mar 14 09:30:56 crc kubenswrapper[4886]: I0314 09:30:56.499184 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v5xxf" Mar 14 09:30:56 crc kubenswrapper[4886]: I0314 09:30:56.558386 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v5xxf"] Mar 14 09:30:58 crc kubenswrapper[4886]: I0314 09:30:58.451839 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-v5xxf" podUID="9fe68208-38ff-4a65-8759-95b7e79ee5df" containerName="registry-server" containerID="cri-o://12d5c070a7f0561f74fd5853cd9ed4bc5d74f6c8aae992e4079a6f5a11247057" gracePeriod=2 Mar 14 09:30:58 crc kubenswrapper[4886]: I0314 09:30:58.976694 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v5xxf" Mar 14 09:30:59 crc kubenswrapper[4886]: I0314 09:30:59.034140 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7ptx\" (UniqueName: \"kubernetes.io/projected/9fe68208-38ff-4a65-8759-95b7e79ee5df-kube-api-access-j7ptx\") pod \"9fe68208-38ff-4a65-8759-95b7e79ee5df\" (UID: \"9fe68208-38ff-4a65-8759-95b7e79ee5df\") " Mar 14 09:30:59 crc kubenswrapper[4886]: I0314 09:30:59.034233 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fe68208-38ff-4a65-8759-95b7e79ee5df-utilities\") pod \"9fe68208-38ff-4a65-8759-95b7e79ee5df\" (UID: \"9fe68208-38ff-4a65-8759-95b7e79ee5df\") " Mar 14 09:30:59 crc kubenswrapper[4886]: I0314 09:30:59.034458 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fe68208-38ff-4a65-8759-95b7e79ee5df-catalog-content\") pod \"9fe68208-38ff-4a65-8759-95b7e79ee5df\" (UID: \"9fe68208-38ff-4a65-8759-95b7e79ee5df\") " Mar 14 09:30:59 crc kubenswrapper[4886]: I0314 09:30:59.035279 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fe68208-38ff-4a65-8759-95b7e79ee5df-utilities" (OuterVolumeSpecName: "utilities") pod "9fe68208-38ff-4a65-8759-95b7e79ee5df" (UID: "9fe68208-38ff-4a65-8759-95b7e79ee5df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:30:59 crc kubenswrapper[4886]: I0314 09:30:59.041009 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fe68208-38ff-4a65-8759-95b7e79ee5df-kube-api-access-j7ptx" (OuterVolumeSpecName: "kube-api-access-j7ptx") pod "9fe68208-38ff-4a65-8759-95b7e79ee5df" (UID: "9fe68208-38ff-4a65-8759-95b7e79ee5df"). InnerVolumeSpecName "kube-api-access-j7ptx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:30:59 crc kubenswrapper[4886]: I0314 09:30:59.136919 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7ptx\" (UniqueName: \"kubernetes.io/projected/9fe68208-38ff-4a65-8759-95b7e79ee5df-kube-api-access-j7ptx\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:59 crc kubenswrapper[4886]: I0314 09:30:59.136959 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fe68208-38ff-4a65-8759-95b7e79ee5df-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:59 crc kubenswrapper[4886]: I0314 09:30:59.464592 4886 generic.go:334] "Generic (PLEG): container finished" podID="9fe68208-38ff-4a65-8759-95b7e79ee5df" containerID="12d5c070a7f0561f74fd5853cd9ed4bc5d74f6c8aae992e4079a6f5a11247057" exitCode=0 Mar 14 09:30:59 crc kubenswrapper[4886]: I0314 09:30:59.464639 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v5xxf" event={"ID":"9fe68208-38ff-4a65-8759-95b7e79ee5df","Type":"ContainerDied","Data":"12d5c070a7f0561f74fd5853cd9ed4bc5d74f6c8aae992e4079a6f5a11247057"} Mar 14 09:30:59 crc kubenswrapper[4886]: I0314 09:30:59.464673 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v5xxf" event={"ID":"9fe68208-38ff-4a65-8759-95b7e79ee5df","Type":"ContainerDied","Data":"c08dc1f4006dfc808dae9c2829b98bc16363bca3b6e155545e603ea0f12d9351"} Mar 14 09:30:59 crc kubenswrapper[4886]: I0314 09:30:59.464693 4886 scope.go:117] "RemoveContainer" containerID="12d5c070a7f0561f74fd5853cd9ed4bc5d74f6c8aae992e4079a6f5a11247057" Mar 14 09:30:59 crc kubenswrapper[4886]: I0314 09:30:59.464733 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v5xxf" Mar 14 09:30:59 crc kubenswrapper[4886]: I0314 09:30:59.485380 4886 scope.go:117] "RemoveContainer" containerID="3939f51634891023699e063823c1471ad5301ddf1d2f5e1f1fd59861c4cbda24" Mar 14 09:30:59 crc kubenswrapper[4886]: I0314 09:30:59.519865 4886 scope.go:117] "RemoveContainer" containerID="4cc510ee5ffa20d102d3be32137a6f5d85e515702792aff188e91c4d11978b90" Mar 14 09:30:59 crc kubenswrapper[4886]: I0314 09:30:59.547859 4886 scope.go:117] "RemoveContainer" containerID="12d5c070a7f0561f74fd5853cd9ed4bc5d74f6c8aae992e4079a6f5a11247057" Mar 14 09:30:59 crc kubenswrapper[4886]: E0314 09:30:59.548930 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12d5c070a7f0561f74fd5853cd9ed4bc5d74f6c8aae992e4079a6f5a11247057\": container with ID starting with 12d5c070a7f0561f74fd5853cd9ed4bc5d74f6c8aae992e4079a6f5a11247057 not found: ID does not exist" containerID="12d5c070a7f0561f74fd5853cd9ed4bc5d74f6c8aae992e4079a6f5a11247057" Mar 14 09:30:59 crc kubenswrapper[4886]: I0314 09:30:59.548989 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12d5c070a7f0561f74fd5853cd9ed4bc5d74f6c8aae992e4079a6f5a11247057"} err="failed to get container status \"12d5c070a7f0561f74fd5853cd9ed4bc5d74f6c8aae992e4079a6f5a11247057\": rpc error: code = NotFound desc = could not find container \"12d5c070a7f0561f74fd5853cd9ed4bc5d74f6c8aae992e4079a6f5a11247057\": container with ID starting with 12d5c070a7f0561f74fd5853cd9ed4bc5d74f6c8aae992e4079a6f5a11247057 not found: ID does not exist" Mar 14 09:30:59 crc kubenswrapper[4886]: I0314 09:30:59.549031 4886 scope.go:117] "RemoveContainer" containerID="3939f51634891023699e063823c1471ad5301ddf1d2f5e1f1fd59861c4cbda24" Mar 14 09:30:59 crc kubenswrapper[4886]: E0314 09:30:59.549441 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"3939f51634891023699e063823c1471ad5301ddf1d2f5e1f1fd59861c4cbda24\": container with ID starting with 3939f51634891023699e063823c1471ad5301ddf1d2f5e1f1fd59861c4cbda24 not found: ID does not exist" containerID="3939f51634891023699e063823c1471ad5301ddf1d2f5e1f1fd59861c4cbda24" Mar 14 09:30:59 crc kubenswrapper[4886]: I0314 09:30:59.549473 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3939f51634891023699e063823c1471ad5301ddf1d2f5e1f1fd59861c4cbda24"} err="failed to get container status \"3939f51634891023699e063823c1471ad5301ddf1d2f5e1f1fd59861c4cbda24\": rpc error: code = NotFound desc = could not find container \"3939f51634891023699e063823c1471ad5301ddf1d2f5e1f1fd59861c4cbda24\": container with ID starting with 3939f51634891023699e063823c1471ad5301ddf1d2f5e1f1fd59861c4cbda24 not found: ID does not exist" Mar 14 09:30:59 crc kubenswrapper[4886]: I0314 09:30:59.549494 4886 scope.go:117] "RemoveContainer" containerID="4cc510ee5ffa20d102d3be32137a6f5d85e515702792aff188e91c4d11978b90" Mar 14 09:30:59 crc kubenswrapper[4886]: E0314 09:30:59.549733 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cc510ee5ffa20d102d3be32137a6f5d85e515702792aff188e91c4d11978b90\": container with ID starting with 4cc510ee5ffa20d102d3be32137a6f5d85e515702792aff188e91c4d11978b90 not found: ID does not exist" containerID="4cc510ee5ffa20d102d3be32137a6f5d85e515702792aff188e91c4d11978b90" Mar 14 09:30:59 crc kubenswrapper[4886]: I0314 09:30:59.549781 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cc510ee5ffa20d102d3be32137a6f5d85e515702792aff188e91c4d11978b90"} err="failed to get container status \"4cc510ee5ffa20d102d3be32137a6f5d85e515702792aff188e91c4d11978b90\": rpc error: code = NotFound desc = could not find container 
\"4cc510ee5ffa20d102d3be32137a6f5d85e515702792aff188e91c4d11978b90\": container with ID starting with 4cc510ee5ffa20d102d3be32137a6f5d85e515702792aff188e91c4d11978b90 not found: ID does not exist" Mar 14 09:30:59 crc kubenswrapper[4886]: I0314 09:30:59.634951 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fe68208-38ff-4a65-8759-95b7e79ee5df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9fe68208-38ff-4a65-8759-95b7e79ee5df" (UID: "9fe68208-38ff-4a65-8759-95b7e79ee5df"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:30:59 crc kubenswrapper[4886]: I0314 09:30:59.648338 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fe68208-38ff-4a65-8759-95b7e79ee5df-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:30:59 crc kubenswrapper[4886]: I0314 09:30:59.798066 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v5xxf"] Mar 14 09:30:59 crc kubenswrapper[4886]: I0314 09:30:59.807356 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-v5xxf"] Mar 14 09:31:01 crc kubenswrapper[4886]: I0314 09:31:01.432709 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fe68208-38ff-4a65-8759-95b7e79ee5df" path="/var/lib/kubelet/pods/9fe68208-38ff-4a65-8759-95b7e79ee5df/volumes" Mar 14 09:31:02 crc kubenswrapper[4886]: I0314 09:31:02.420725 4886 scope.go:117] "RemoveContainer" containerID="8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee" Mar 14 09:31:02 crc kubenswrapper[4886]: E0314 09:31:02.420976 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:31:14 crc kubenswrapper[4886]: I0314 09:31:14.436791 4886 scope.go:117] "RemoveContainer" containerID="8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee" Mar 14 09:31:14 crc kubenswrapper[4886]: E0314 09:31:14.438631 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:31:29 crc kubenswrapper[4886]: I0314 09:31:29.421392 4886 scope.go:117] "RemoveContainer" containerID="8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee" Mar 14 09:31:29 crc kubenswrapper[4886]: E0314 09:31:29.422175 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:31:42 crc kubenswrapper[4886]: I0314 09:31:42.420535 4886 scope.go:117] "RemoveContainer" containerID="8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee" Mar 14 09:31:42 crc kubenswrapper[4886]: E0314 09:31:42.421145 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:31:55 crc kubenswrapper[4886]: I0314 09:31:55.427267 4886 scope.go:117] "RemoveContainer" containerID="8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee" Mar 14 09:31:55 crc kubenswrapper[4886]: E0314 09:31:55.428095 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:32:00 crc kubenswrapper[4886]: I0314 09:32:00.145987 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558012-wmtwb"] Mar 14 09:32:00 crc kubenswrapper[4886]: E0314 09:32:00.147866 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fe68208-38ff-4a65-8759-95b7e79ee5df" containerName="registry-server" Mar 14 09:32:00 crc kubenswrapper[4886]: I0314 09:32:00.147947 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fe68208-38ff-4a65-8759-95b7e79ee5df" containerName="registry-server" Mar 14 09:32:00 crc kubenswrapper[4886]: E0314 09:32:00.148049 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fe68208-38ff-4a65-8759-95b7e79ee5df" containerName="extract-utilities" Mar 14 09:32:00 crc kubenswrapper[4886]: I0314 09:32:00.148106 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fe68208-38ff-4a65-8759-95b7e79ee5df" containerName="extract-utilities" Mar 14 09:32:00 crc kubenswrapper[4886]: E0314 09:32:00.148196 4886 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="9fe68208-38ff-4a65-8759-95b7e79ee5df" containerName="extract-content" Mar 14 09:32:00 crc kubenswrapper[4886]: I0314 09:32:00.148252 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fe68208-38ff-4a65-8759-95b7e79ee5df" containerName="extract-content" Mar 14 09:32:00 crc kubenswrapper[4886]: I0314 09:32:00.148519 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fe68208-38ff-4a65-8759-95b7e79ee5df" containerName="registry-server" Mar 14 09:32:00 crc kubenswrapper[4886]: I0314 09:32:00.149305 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558012-wmtwb" Mar 14 09:32:00 crc kubenswrapper[4886]: I0314 09:32:00.151469 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:32:00 crc kubenswrapper[4886]: I0314 09:32:00.152113 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:32:00 crc kubenswrapper[4886]: I0314 09:32:00.152210 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 09:32:00 crc kubenswrapper[4886]: I0314 09:32:00.163361 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558012-wmtwb"] Mar 14 09:32:00 crc kubenswrapper[4886]: I0314 09:32:00.265079 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbk2v\" (UniqueName: \"kubernetes.io/projected/a4dcb604-bc45-46f3-8ee2-756f85d73578-kube-api-access-jbk2v\") pod \"auto-csr-approver-29558012-wmtwb\" (UID: \"a4dcb604-bc45-46f3-8ee2-756f85d73578\") " pod="openshift-infra/auto-csr-approver-29558012-wmtwb" Mar 14 09:32:00 crc kubenswrapper[4886]: I0314 09:32:00.367463 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-jbk2v\" (UniqueName: \"kubernetes.io/projected/a4dcb604-bc45-46f3-8ee2-756f85d73578-kube-api-access-jbk2v\") pod \"auto-csr-approver-29558012-wmtwb\" (UID: \"a4dcb604-bc45-46f3-8ee2-756f85d73578\") " pod="openshift-infra/auto-csr-approver-29558012-wmtwb" Mar 14 09:32:00 crc kubenswrapper[4886]: I0314 09:32:00.385395 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbk2v\" (UniqueName: \"kubernetes.io/projected/a4dcb604-bc45-46f3-8ee2-756f85d73578-kube-api-access-jbk2v\") pod \"auto-csr-approver-29558012-wmtwb\" (UID: \"a4dcb604-bc45-46f3-8ee2-756f85d73578\") " pod="openshift-infra/auto-csr-approver-29558012-wmtwb" Mar 14 09:32:00 crc kubenswrapper[4886]: I0314 09:32:00.469711 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558012-wmtwb" Mar 14 09:32:00 crc kubenswrapper[4886]: I0314 09:32:00.968774 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558012-wmtwb"] Mar 14 09:32:01 crc kubenswrapper[4886]: I0314 09:32:01.042813 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558012-wmtwb" event={"ID":"a4dcb604-bc45-46f3-8ee2-756f85d73578","Type":"ContainerStarted","Data":"20a0350d491b24d521eb229b40b3207583b07479e03b1459e24b9ebfe53498a7"} Mar 14 09:32:03 crc kubenswrapper[4886]: I0314 09:32:03.060092 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558012-wmtwb" event={"ID":"a4dcb604-bc45-46f3-8ee2-756f85d73578","Type":"ContainerStarted","Data":"f710aa97aa39d3693b0a54f9550a6705ac43b4e4b5494d27c8627f81ffb95267"} Mar 14 09:32:03 crc kubenswrapper[4886]: I0314 09:32:03.075073 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29558012-wmtwb" podStartSLOduration=1.7203077100000002 podStartE2EDuration="3.075052301s" podCreationTimestamp="2026-03-14 
09:32:00 +0000 UTC" firstStartedPulling="2026-03-14 09:32:00.977298873 +0000 UTC m=+3856.225750510" lastFinishedPulling="2026-03-14 09:32:02.332043454 +0000 UTC m=+3857.580495101" observedRunningTime="2026-03-14 09:32:03.071330815 +0000 UTC m=+3858.319782462" watchObservedRunningTime="2026-03-14 09:32:03.075052301 +0000 UTC m=+3858.323503938" Mar 14 09:32:04 crc kubenswrapper[4886]: I0314 09:32:04.069502 4886 generic.go:334] "Generic (PLEG): container finished" podID="a4dcb604-bc45-46f3-8ee2-756f85d73578" containerID="f710aa97aa39d3693b0a54f9550a6705ac43b4e4b5494d27c8627f81ffb95267" exitCode=0 Mar 14 09:32:04 crc kubenswrapper[4886]: I0314 09:32:04.069548 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558012-wmtwb" event={"ID":"a4dcb604-bc45-46f3-8ee2-756f85d73578","Type":"ContainerDied","Data":"f710aa97aa39d3693b0a54f9550a6705ac43b4e4b5494d27c8627f81ffb95267"} Mar 14 09:32:05 crc kubenswrapper[4886]: I0314 09:32:05.469815 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558012-wmtwb" Mar 14 09:32:05 crc kubenswrapper[4886]: I0314 09:32:05.638800 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbk2v\" (UniqueName: \"kubernetes.io/projected/a4dcb604-bc45-46f3-8ee2-756f85d73578-kube-api-access-jbk2v\") pod \"a4dcb604-bc45-46f3-8ee2-756f85d73578\" (UID: \"a4dcb604-bc45-46f3-8ee2-756f85d73578\") " Mar 14 09:32:05 crc kubenswrapper[4886]: I0314 09:32:05.644972 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4dcb604-bc45-46f3-8ee2-756f85d73578-kube-api-access-jbk2v" (OuterVolumeSpecName: "kube-api-access-jbk2v") pod "a4dcb604-bc45-46f3-8ee2-756f85d73578" (UID: "a4dcb604-bc45-46f3-8ee2-756f85d73578"). InnerVolumeSpecName "kube-api-access-jbk2v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:32:05 crc kubenswrapper[4886]: I0314 09:32:05.740998 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbk2v\" (UniqueName: \"kubernetes.io/projected/a4dcb604-bc45-46f3-8ee2-756f85d73578-kube-api-access-jbk2v\") on node \"crc\" DevicePath \"\"" Mar 14 09:32:06 crc kubenswrapper[4886]: I0314 09:32:06.121609 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558012-wmtwb" event={"ID":"a4dcb604-bc45-46f3-8ee2-756f85d73578","Type":"ContainerDied","Data":"20a0350d491b24d521eb229b40b3207583b07479e03b1459e24b9ebfe53498a7"} Mar 14 09:32:06 crc kubenswrapper[4886]: I0314 09:32:06.121948 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20a0350d491b24d521eb229b40b3207583b07479e03b1459e24b9ebfe53498a7" Mar 14 09:32:06 crc kubenswrapper[4886]: I0314 09:32:06.121685 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558012-wmtwb" Mar 14 09:32:06 crc kubenswrapper[4886]: I0314 09:32:06.154384 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558006-5hvlv"] Mar 14 09:32:06 crc kubenswrapper[4886]: I0314 09:32:06.162658 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558006-5hvlv"] Mar 14 09:32:07 crc kubenswrapper[4886]: I0314 09:32:07.430407 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5620551c-b879-4c48-9750-5680d63676ce" path="/var/lib/kubelet/pods/5620551c-b879-4c48-9750-5680d63676ce/volumes" Mar 14 09:32:08 crc kubenswrapper[4886]: I0314 09:32:08.420847 4886 scope.go:117] "RemoveContainer" containerID="8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee" Mar 14 09:32:08 crc kubenswrapper[4886]: E0314 09:32:08.421092 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:32:22 crc kubenswrapper[4886]: I0314 09:32:22.421041 4886 scope.go:117] "RemoveContainer" containerID="8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee" Mar 14 09:32:22 crc kubenswrapper[4886]: E0314 09:32:22.421731 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:32:33 crc kubenswrapper[4886]: I0314 09:32:33.420900 4886 scope.go:117] "RemoveContainer" containerID="8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee" Mar 14 09:32:33 crc kubenswrapper[4886]: E0314 09:32:33.421696 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:32:36 crc kubenswrapper[4886]: I0314 09:32:36.662890 4886 scope.go:117] "RemoveContainer" containerID="f265b11ffe6cab851fd431528e2d42d177708fe1ecfba753a4b7255205c020a0" Mar 14 09:32:48 crc kubenswrapper[4886]: I0314 09:32:48.421272 4886 scope.go:117] "RemoveContainer" 
containerID="8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee" Mar 14 09:32:48 crc kubenswrapper[4886]: E0314 09:32:48.422110 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:33:01 crc kubenswrapper[4886]: I0314 09:33:01.421057 4886 scope.go:117] "RemoveContainer" containerID="8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee" Mar 14 09:33:01 crc kubenswrapper[4886]: E0314 09:33:01.422944 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:33:12 crc kubenswrapper[4886]: I0314 09:33:12.421094 4886 scope.go:117] "RemoveContainer" containerID="8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee" Mar 14 09:33:12 crc kubenswrapper[4886]: E0314 09:33:12.421822 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:33:27 crc kubenswrapper[4886]: I0314 09:33:27.431099 4886 scope.go:117] 
"RemoveContainer" containerID="8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee" Mar 14 09:33:27 crc kubenswrapper[4886]: I0314 09:33:27.924164 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerStarted","Data":"e3dc0eaae2ec184f8b13c1cba1382bc451adfeb6aa8462bb92a2a5518a28a453"} Mar 14 09:34:00 crc kubenswrapper[4886]: I0314 09:34:00.157431 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558014-l2v7s"] Mar 14 09:34:00 crc kubenswrapper[4886]: E0314 09:34:00.158759 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4dcb604-bc45-46f3-8ee2-756f85d73578" containerName="oc" Mar 14 09:34:00 crc kubenswrapper[4886]: I0314 09:34:00.158784 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4dcb604-bc45-46f3-8ee2-756f85d73578" containerName="oc" Mar 14 09:34:00 crc kubenswrapper[4886]: I0314 09:34:00.159154 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4dcb604-bc45-46f3-8ee2-756f85d73578" containerName="oc" Mar 14 09:34:00 crc kubenswrapper[4886]: I0314 09:34:00.160345 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558014-l2v7s" Mar 14 09:34:00 crc kubenswrapper[4886]: I0314 09:34:00.163474 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:34:00 crc kubenswrapper[4886]: I0314 09:34:00.163667 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:34:00 crc kubenswrapper[4886]: I0314 09:34:00.163903 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 09:34:00 crc kubenswrapper[4886]: I0314 09:34:00.168315 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558014-l2v7s"] Mar 14 09:34:00 crc kubenswrapper[4886]: I0314 09:34:00.272227 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxch5\" (UniqueName: \"kubernetes.io/projected/1e1418c9-f78d-4282-be56-cbc538dbb115-kube-api-access-cxch5\") pod \"auto-csr-approver-29558014-l2v7s\" (UID: \"1e1418c9-f78d-4282-be56-cbc538dbb115\") " pod="openshift-infra/auto-csr-approver-29558014-l2v7s" Mar 14 09:34:00 crc kubenswrapper[4886]: I0314 09:34:00.373921 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxch5\" (UniqueName: \"kubernetes.io/projected/1e1418c9-f78d-4282-be56-cbc538dbb115-kube-api-access-cxch5\") pod \"auto-csr-approver-29558014-l2v7s\" (UID: \"1e1418c9-f78d-4282-be56-cbc538dbb115\") " pod="openshift-infra/auto-csr-approver-29558014-l2v7s" Mar 14 09:34:00 crc kubenswrapper[4886]: I0314 09:34:00.402247 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxch5\" (UniqueName: \"kubernetes.io/projected/1e1418c9-f78d-4282-be56-cbc538dbb115-kube-api-access-cxch5\") pod \"auto-csr-approver-29558014-l2v7s\" (UID: \"1e1418c9-f78d-4282-be56-cbc538dbb115\") " 
pod="openshift-infra/auto-csr-approver-29558014-l2v7s" Mar 14 09:34:00 crc kubenswrapper[4886]: I0314 09:34:00.489900 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558014-l2v7s" Mar 14 09:34:01 crc kubenswrapper[4886]: I0314 09:34:01.003995 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558014-l2v7s"] Mar 14 09:34:01 crc kubenswrapper[4886]: I0314 09:34:01.006251 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 09:34:01 crc kubenswrapper[4886]: I0314 09:34:01.234170 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558014-l2v7s" event={"ID":"1e1418c9-f78d-4282-be56-cbc538dbb115","Type":"ContainerStarted","Data":"7aecb830aee2d0cb8a0ea4ee3c32f0084c61f51ade49f766eb183b5185f2516b"} Mar 14 09:34:03 crc kubenswrapper[4886]: I0314 09:34:03.270881 4886 generic.go:334] "Generic (PLEG): container finished" podID="1e1418c9-f78d-4282-be56-cbc538dbb115" containerID="d317db713a12017b52600eff37d971f6da1e2753be214230cb8580d1826634a8" exitCode=0 Mar 14 09:34:03 crc kubenswrapper[4886]: I0314 09:34:03.271218 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558014-l2v7s" event={"ID":"1e1418c9-f78d-4282-be56-cbc538dbb115","Type":"ContainerDied","Data":"d317db713a12017b52600eff37d971f6da1e2753be214230cb8580d1826634a8"} Mar 14 09:34:04 crc kubenswrapper[4886]: I0314 09:34:04.710463 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558014-l2v7s" Mar 14 09:34:04 crc kubenswrapper[4886]: I0314 09:34:04.768040 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxch5\" (UniqueName: \"kubernetes.io/projected/1e1418c9-f78d-4282-be56-cbc538dbb115-kube-api-access-cxch5\") pod \"1e1418c9-f78d-4282-be56-cbc538dbb115\" (UID: \"1e1418c9-f78d-4282-be56-cbc538dbb115\") " Mar 14 09:34:04 crc kubenswrapper[4886]: I0314 09:34:04.779801 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e1418c9-f78d-4282-be56-cbc538dbb115-kube-api-access-cxch5" (OuterVolumeSpecName: "kube-api-access-cxch5") pod "1e1418c9-f78d-4282-be56-cbc538dbb115" (UID: "1e1418c9-f78d-4282-be56-cbc538dbb115"). InnerVolumeSpecName "kube-api-access-cxch5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:34:04 crc kubenswrapper[4886]: I0314 09:34:04.870870 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxch5\" (UniqueName: \"kubernetes.io/projected/1e1418c9-f78d-4282-be56-cbc538dbb115-kube-api-access-cxch5\") on node \"crc\" DevicePath \"\"" Mar 14 09:34:05 crc kubenswrapper[4886]: I0314 09:34:05.295084 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558014-l2v7s" event={"ID":"1e1418c9-f78d-4282-be56-cbc538dbb115","Type":"ContainerDied","Data":"7aecb830aee2d0cb8a0ea4ee3c32f0084c61f51ade49f766eb183b5185f2516b"} Mar 14 09:34:05 crc kubenswrapper[4886]: I0314 09:34:05.295462 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7aecb830aee2d0cb8a0ea4ee3c32f0084c61f51ade49f766eb183b5185f2516b" Mar 14 09:34:05 crc kubenswrapper[4886]: I0314 09:34:05.295200 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558014-l2v7s" Mar 14 09:34:05 crc kubenswrapper[4886]: I0314 09:34:05.776619 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558008-7527v"] Mar 14 09:34:05 crc kubenswrapper[4886]: I0314 09:34:05.784287 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558008-7527v"] Mar 14 09:34:07 crc kubenswrapper[4886]: I0314 09:34:07.437887 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b385e7a-cf4d-46de-bbbe-6e75d9d2c486" path="/var/lib/kubelet/pods/4b385e7a-cf4d-46de-bbbe-6e75d9d2c486/volumes" Mar 14 09:34:36 crc kubenswrapper[4886]: I0314 09:34:36.772718 4886 scope.go:117] "RemoveContainer" containerID="021cecdb51dc5ca43366b33e1d285bbe2dad3f6135197d3662197dbc315daa4b" Mar 14 09:35:56 crc kubenswrapper[4886]: I0314 09:35:56.066878 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:35:56 crc kubenswrapper[4886]: I0314 09:35:56.067661 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:36:00 crc kubenswrapper[4886]: I0314 09:36:00.145018 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558016-p5c7v"] Mar 14 09:36:00 crc kubenswrapper[4886]: E0314 09:36:00.146100 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e1418c9-f78d-4282-be56-cbc538dbb115" containerName="oc" Mar 14 09:36:00 crc 
kubenswrapper[4886]: I0314 09:36:00.146117 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e1418c9-f78d-4282-be56-cbc538dbb115" containerName="oc" Mar 14 09:36:00 crc kubenswrapper[4886]: I0314 09:36:00.146365 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e1418c9-f78d-4282-be56-cbc538dbb115" containerName="oc" Mar 14 09:36:00 crc kubenswrapper[4886]: I0314 09:36:00.147169 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558016-p5c7v" Mar 14 09:36:00 crc kubenswrapper[4886]: I0314 09:36:00.149956 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:36:00 crc kubenswrapper[4886]: I0314 09:36:00.150019 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:36:00 crc kubenswrapper[4886]: I0314 09:36:00.150038 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 09:36:00 crc kubenswrapper[4886]: I0314 09:36:00.156819 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558016-p5c7v"] Mar 14 09:36:00 crc kubenswrapper[4886]: I0314 09:36:00.275747 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkksp\" (UniqueName: \"kubernetes.io/projected/0717bf13-48a0-403f-b4b5-9dbe68619319-kube-api-access-qkksp\") pod \"auto-csr-approver-29558016-p5c7v\" (UID: \"0717bf13-48a0-403f-b4b5-9dbe68619319\") " pod="openshift-infra/auto-csr-approver-29558016-p5c7v" Mar 14 09:36:00 crc kubenswrapper[4886]: I0314 09:36:00.378212 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkksp\" (UniqueName: \"kubernetes.io/projected/0717bf13-48a0-403f-b4b5-9dbe68619319-kube-api-access-qkksp\") pod \"auto-csr-approver-29558016-p5c7v\" 
(UID: \"0717bf13-48a0-403f-b4b5-9dbe68619319\") " pod="openshift-infra/auto-csr-approver-29558016-p5c7v" Mar 14 09:36:00 crc kubenswrapper[4886]: I0314 09:36:00.777724 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkksp\" (UniqueName: \"kubernetes.io/projected/0717bf13-48a0-403f-b4b5-9dbe68619319-kube-api-access-qkksp\") pod \"auto-csr-approver-29558016-p5c7v\" (UID: \"0717bf13-48a0-403f-b4b5-9dbe68619319\") " pod="openshift-infra/auto-csr-approver-29558016-p5c7v" Mar 14 09:36:01 crc kubenswrapper[4886]: I0314 09:36:01.068992 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558016-p5c7v" Mar 14 09:36:01 crc kubenswrapper[4886]: I0314 09:36:01.534944 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558016-p5c7v"] Mar 14 09:36:01 crc kubenswrapper[4886]: W0314 09:36:01.536796 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0717bf13_48a0_403f_b4b5_9dbe68619319.slice/crio-d6cd0484e72fbf8e61d0426fff7fa3ce75a9ffbe4471f25af5d45eca90875fb4 WatchSource:0}: Error finding container d6cd0484e72fbf8e61d0426fff7fa3ce75a9ffbe4471f25af5d45eca90875fb4: Status 404 returned error can't find the container with id d6cd0484e72fbf8e61d0426fff7fa3ce75a9ffbe4471f25af5d45eca90875fb4 Mar 14 09:36:02 crc kubenswrapper[4886]: I0314 09:36:02.519703 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558016-p5c7v" event={"ID":"0717bf13-48a0-403f-b4b5-9dbe68619319","Type":"ContainerStarted","Data":"d6cd0484e72fbf8e61d0426fff7fa3ce75a9ffbe4471f25af5d45eca90875fb4"} Mar 14 09:36:03 crc kubenswrapper[4886]: I0314 09:36:03.532742 4886 generic.go:334] "Generic (PLEG): container finished" podID="0717bf13-48a0-403f-b4b5-9dbe68619319" containerID="39056d6a10b78586b8207c24d576fbd2c758ff6c3b22b2bae058db21dfdf0f4d" 
exitCode=0 Mar 14 09:36:03 crc kubenswrapper[4886]: I0314 09:36:03.532860 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558016-p5c7v" event={"ID":"0717bf13-48a0-403f-b4b5-9dbe68619319","Type":"ContainerDied","Data":"39056d6a10b78586b8207c24d576fbd2c758ff6c3b22b2bae058db21dfdf0f4d"} Mar 14 09:36:04 crc kubenswrapper[4886]: I0314 09:36:04.978565 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558016-p5c7v" Mar 14 09:36:05 crc kubenswrapper[4886]: I0314 09:36:05.079934 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkksp\" (UniqueName: \"kubernetes.io/projected/0717bf13-48a0-403f-b4b5-9dbe68619319-kube-api-access-qkksp\") pod \"0717bf13-48a0-403f-b4b5-9dbe68619319\" (UID: \"0717bf13-48a0-403f-b4b5-9dbe68619319\") " Mar 14 09:36:05 crc kubenswrapper[4886]: I0314 09:36:05.091442 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0717bf13-48a0-403f-b4b5-9dbe68619319-kube-api-access-qkksp" (OuterVolumeSpecName: "kube-api-access-qkksp") pod "0717bf13-48a0-403f-b4b5-9dbe68619319" (UID: "0717bf13-48a0-403f-b4b5-9dbe68619319"). InnerVolumeSpecName "kube-api-access-qkksp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:36:05 crc kubenswrapper[4886]: I0314 09:36:05.182567 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkksp\" (UniqueName: \"kubernetes.io/projected/0717bf13-48a0-403f-b4b5-9dbe68619319-kube-api-access-qkksp\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:05 crc kubenswrapper[4886]: I0314 09:36:05.550260 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558016-p5c7v" event={"ID":"0717bf13-48a0-403f-b4b5-9dbe68619319","Type":"ContainerDied","Data":"d6cd0484e72fbf8e61d0426fff7fa3ce75a9ffbe4471f25af5d45eca90875fb4"} Mar 14 09:36:05 crc kubenswrapper[4886]: I0314 09:36:05.550302 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6cd0484e72fbf8e61d0426fff7fa3ce75a9ffbe4471f25af5d45eca90875fb4" Mar 14 09:36:05 crc kubenswrapper[4886]: I0314 09:36:05.550314 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558016-p5c7v" Mar 14 09:36:06 crc kubenswrapper[4886]: I0314 09:36:06.078519 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558010-jvk9f"] Mar 14 09:36:06 crc kubenswrapper[4886]: I0314 09:36:06.093771 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558010-jvk9f"] Mar 14 09:36:07 crc kubenswrapper[4886]: I0314 09:36:07.443895 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8005d191-77ad-4c55-9a44-adbe02e3b176" path="/var/lib/kubelet/pods/8005d191-77ad-4c55-9a44-adbe02e3b176/volumes" Mar 14 09:36:19 crc kubenswrapper[4886]: I0314 09:36:19.588111 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wzb8k"] Mar 14 09:36:19 crc kubenswrapper[4886]: E0314 09:36:19.589276 4886 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0717bf13-48a0-403f-b4b5-9dbe68619319" containerName="oc" Mar 14 09:36:19 crc kubenswrapper[4886]: I0314 09:36:19.589294 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="0717bf13-48a0-403f-b4b5-9dbe68619319" containerName="oc" Mar 14 09:36:19 crc kubenswrapper[4886]: I0314 09:36:19.589530 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="0717bf13-48a0-403f-b4b5-9dbe68619319" containerName="oc" Mar 14 09:36:19 crc kubenswrapper[4886]: I0314 09:36:19.591602 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wzb8k" Mar 14 09:36:19 crc kubenswrapper[4886]: I0314 09:36:19.607181 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wzb8k"] Mar 14 09:36:19 crc kubenswrapper[4886]: I0314 09:36:19.729085 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbf780f1-0609-4fa6-ba63-1fc068ea113d-utilities\") pod \"certified-operators-wzb8k\" (UID: \"dbf780f1-0609-4fa6-ba63-1fc068ea113d\") " pod="openshift-marketplace/certified-operators-wzb8k" Mar 14 09:36:19 crc kubenswrapper[4886]: I0314 09:36:19.729179 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbf780f1-0609-4fa6-ba63-1fc068ea113d-catalog-content\") pod \"certified-operators-wzb8k\" (UID: \"dbf780f1-0609-4fa6-ba63-1fc068ea113d\") " pod="openshift-marketplace/certified-operators-wzb8k" Mar 14 09:36:19 crc kubenswrapper[4886]: I0314 09:36:19.729223 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n75c2\" (UniqueName: \"kubernetes.io/projected/dbf780f1-0609-4fa6-ba63-1fc068ea113d-kube-api-access-n75c2\") pod \"certified-operators-wzb8k\" (UID: \"dbf780f1-0609-4fa6-ba63-1fc068ea113d\") " 
pod="openshift-marketplace/certified-operators-wzb8k" Mar 14 09:36:19 crc kubenswrapper[4886]: I0314 09:36:19.831747 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n75c2\" (UniqueName: \"kubernetes.io/projected/dbf780f1-0609-4fa6-ba63-1fc068ea113d-kube-api-access-n75c2\") pod \"certified-operators-wzb8k\" (UID: \"dbf780f1-0609-4fa6-ba63-1fc068ea113d\") " pod="openshift-marketplace/certified-operators-wzb8k" Mar 14 09:36:19 crc kubenswrapper[4886]: I0314 09:36:19.832011 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbf780f1-0609-4fa6-ba63-1fc068ea113d-utilities\") pod \"certified-operators-wzb8k\" (UID: \"dbf780f1-0609-4fa6-ba63-1fc068ea113d\") " pod="openshift-marketplace/certified-operators-wzb8k" Mar 14 09:36:19 crc kubenswrapper[4886]: I0314 09:36:19.832621 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbf780f1-0609-4fa6-ba63-1fc068ea113d-utilities\") pod \"certified-operators-wzb8k\" (UID: \"dbf780f1-0609-4fa6-ba63-1fc068ea113d\") " pod="openshift-marketplace/certified-operators-wzb8k" Mar 14 09:36:19 crc kubenswrapper[4886]: I0314 09:36:19.832778 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbf780f1-0609-4fa6-ba63-1fc068ea113d-catalog-content\") pod \"certified-operators-wzb8k\" (UID: \"dbf780f1-0609-4fa6-ba63-1fc068ea113d\") " pod="openshift-marketplace/certified-operators-wzb8k" Mar 14 09:36:19 crc kubenswrapper[4886]: I0314 09:36:19.833134 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbf780f1-0609-4fa6-ba63-1fc068ea113d-catalog-content\") pod \"certified-operators-wzb8k\" (UID: \"dbf780f1-0609-4fa6-ba63-1fc068ea113d\") " 
pod="openshift-marketplace/certified-operators-wzb8k" Mar 14 09:36:19 crc kubenswrapper[4886]: I0314 09:36:19.871475 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n75c2\" (UniqueName: \"kubernetes.io/projected/dbf780f1-0609-4fa6-ba63-1fc068ea113d-kube-api-access-n75c2\") pod \"certified-operators-wzb8k\" (UID: \"dbf780f1-0609-4fa6-ba63-1fc068ea113d\") " pod="openshift-marketplace/certified-operators-wzb8k" Mar 14 09:36:19 crc kubenswrapper[4886]: I0314 09:36:19.917661 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wzb8k" Mar 14 09:36:20 crc kubenswrapper[4886]: I0314 09:36:20.483515 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wzb8k"] Mar 14 09:36:20 crc kubenswrapper[4886]: I0314 09:36:20.712805 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wzb8k" event={"ID":"dbf780f1-0609-4fa6-ba63-1fc068ea113d","Type":"ContainerStarted","Data":"121f3cf1b020699bac0ce0746f10b63a6a482c35d9c0fed75ec4cc1eb4098619"} Mar 14 09:36:20 crc kubenswrapper[4886]: I0314 09:36:20.713134 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wzb8k" event={"ID":"dbf780f1-0609-4fa6-ba63-1fc068ea113d","Type":"ContainerStarted","Data":"a349587976515bb65f40065a28a2f8a53dc8cc64be515280779ba8b5b4ba6648"} Mar 14 09:36:21 crc kubenswrapper[4886]: I0314 09:36:21.722372 4886 generic.go:334] "Generic (PLEG): container finished" podID="dbf780f1-0609-4fa6-ba63-1fc068ea113d" containerID="121f3cf1b020699bac0ce0746f10b63a6a482c35d9c0fed75ec4cc1eb4098619" exitCode=0 Mar 14 09:36:21 crc kubenswrapper[4886]: I0314 09:36:21.722556 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wzb8k" 
event={"ID":"dbf780f1-0609-4fa6-ba63-1fc068ea113d","Type":"ContainerDied","Data":"121f3cf1b020699bac0ce0746f10b63a6a482c35d9c0fed75ec4cc1eb4098619"} Mar 14 09:36:21 crc kubenswrapper[4886]: I0314 09:36:21.722701 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wzb8k" event={"ID":"dbf780f1-0609-4fa6-ba63-1fc068ea113d","Type":"ContainerStarted","Data":"f0192dccdf5f8686f81892b9515bd77f48f30a8581e397899e171976ed8ab082"} Mar 14 09:36:22 crc kubenswrapper[4886]: I0314 09:36:22.737327 4886 generic.go:334] "Generic (PLEG): container finished" podID="dbf780f1-0609-4fa6-ba63-1fc068ea113d" containerID="f0192dccdf5f8686f81892b9515bd77f48f30a8581e397899e171976ed8ab082" exitCode=0 Mar 14 09:36:22 crc kubenswrapper[4886]: I0314 09:36:22.737383 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wzb8k" event={"ID":"dbf780f1-0609-4fa6-ba63-1fc068ea113d","Type":"ContainerDied","Data":"f0192dccdf5f8686f81892b9515bd77f48f30a8581e397899e171976ed8ab082"} Mar 14 09:36:24 crc kubenswrapper[4886]: I0314 09:36:24.761461 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wzb8k" event={"ID":"dbf780f1-0609-4fa6-ba63-1fc068ea113d","Type":"ContainerStarted","Data":"d6b8e09136ed40fc1cccd90cba5bc9b5ed8959d18fe5bdb5d00a31f28dead24a"} Mar 14 09:36:24 crc kubenswrapper[4886]: I0314 09:36:24.789289 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wzb8k" podStartSLOduration=3.37309938 podStartE2EDuration="5.789264146s" podCreationTimestamp="2026-03-14 09:36:19 +0000 UTC" firstStartedPulling="2026-03-14 09:36:20.71435604 +0000 UTC m=+4115.962807677" lastFinishedPulling="2026-03-14 09:36:23.130520806 +0000 UTC m=+4118.378972443" observedRunningTime="2026-03-14 09:36:24.777701927 +0000 UTC m=+4120.026153564" watchObservedRunningTime="2026-03-14 09:36:24.789264146 +0000 UTC 
m=+4120.037715793" Mar 14 09:36:26 crc kubenswrapper[4886]: I0314 09:36:26.065780 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:36:26 crc kubenswrapper[4886]: I0314 09:36:26.066055 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:36:29 crc kubenswrapper[4886]: I0314 09:36:29.918374 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wzb8k" Mar 14 09:36:29 crc kubenswrapper[4886]: I0314 09:36:29.919023 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wzb8k" Mar 14 09:36:29 crc kubenswrapper[4886]: I0314 09:36:29.999530 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wzb8k" Mar 14 09:36:30 crc kubenswrapper[4886]: I0314 09:36:30.914509 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wzb8k" Mar 14 09:36:30 crc kubenswrapper[4886]: I0314 09:36:30.967973 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wzb8k"] Mar 14 09:36:32 crc kubenswrapper[4886]: I0314 09:36:32.845007 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wzb8k" podUID="dbf780f1-0609-4fa6-ba63-1fc068ea113d" containerName="registry-server" 
containerID="cri-o://d6b8e09136ed40fc1cccd90cba5bc9b5ed8959d18fe5bdb5d00a31f28dead24a" gracePeriod=2 Mar 14 09:36:33 crc kubenswrapper[4886]: I0314 09:36:33.382474 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wzb8k" Mar 14 09:36:33 crc kubenswrapper[4886]: I0314 09:36:33.528641 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n75c2\" (UniqueName: \"kubernetes.io/projected/dbf780f1-0609-4fa6-ba63-1fc068ea113d-kube-api-access-n75c2\") pod \"dbf780f1-0609-4fa6-ba63-1fc068ea113d\" (UID: \"dbf780f1-0609-4fa6-ba63-1fc068ea113d\") " Mar 14 09:36:33 crc kubenswrapper[4886]: I0314 09:36:33.529066 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbf780f1-0609-4fa6-ba63-1fc068ea113d-utilities\") pod \"dbf780f1-0609-4fa6-ba63-1fc068ea113d\" (UID: \"dbf780f1-0609-4fa6-ba63-1fc068ea113d\") " Mar 14 09:36:33 crc kubenswrapper[4886]: I0314 09:36:33.529163 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbf780f1-0609-4fa6-ba63-1fc068ea113d-catalog-content\") pod \"dbf780f1-0609-4fa6-ba63-1fc068ea113d\" (UID: \"dbf780f1-0609-4fa6-ba63-1fc068ea113d\") " Mar 14 09:36:33 crc kubenswrapper[4886]: I0314 09:36:33.530048 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbf780f1-0609-4fa6-ba63-1fc068ea113d-utilities" (OuterVolumeSpecName: "utilities") pod "dbf780f1-0609-4fa6-ba63-1fc068ea113d" (UID: "dbf780f1-0609-4fa6-ba63-1fc068ea113d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:36:33 crc kubenswrapper[4886]: I0314 09:36:33.534257 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbf780f1-0609-4fa6-ba63-1fc068ea113d-kube-api-access-n75c2" (OuterVolumeSpecName: "kube-api-access-n75c2") pod "dbf780f1-0609-4fa6-ba63-1fc068ea113d" (UID: "dbf780f1-0609-4fa6-ba63-1fc068ea113d"). InnerVolumeSpecName "kube-api-access-n75c2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:36:33 crc kubenswrapper[4886]: I0314 09:36:33.545138 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbf780f1-0609-4fa6-ba63-1fc068ea113d-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:33 crc kubenswrapper[4886]: I0314 09:36:33.545172 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n75c2\" (UniqueName: \"kubernetes.io/projected/dbf780f1-0609-4fa6-ba63-1fc068ea113d-kube-api-access-n75c2\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:33 crc kubenswrapper[4886]: I0314 09:36:33.582671 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbf780f1-0609-4fa6-ba63-1fc068ea113d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dbf780f1-0609-4fa6-ba63-1fc068ea113d" (UID: "dbf780f1-0609-4fa6-ba63-1fc068ea113d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:36:33 crc kubenswrapper[4886]: I0314 09:36:33.646520 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbf780f1-0609-4fa6-ba63-1fc068ea113d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:36:33 crc kubenswrapper[4886]: I0314 09:36:33.855523 4886 generic.go:334] "Generic (PLEG): container finished" podID="dbf780f1-0609-4fa6-ba63-1fc068ea113d" containerID="d6b8e09136ed40fc1cccd90cba5bc9b5ed8959d18fe5bdb5d00a31f28dead24a" exitCode=0 Mar 14 09:36:33 crc kubenswrapper[4886]: I0314 09:36:33.855564 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wzb8k" event={"ID":"dbf780f1-0609-4fa6-ba63-1fc068ea113d","Type":"ContainerDied","Data":"d6b8e09136ed40fc1cccd90cba5bc9b5ed8959d18fe5bdb5d00a31f28dead24a"} Mar 14 09:36:33 crc kubenswrapper[4886]: I0314 09:36:33.855596 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wzb8k" event={"ID":"dbf780f1-0609-4fa6-ba63-1fc068ea113d","Type":"ContainerDied","Data":"a349587976515bb65f40065a28a2f8a53dc8cc64be515280779ba8b5b4ba6648"} Mar 14 09:36:33 crc kubenswrapper[4886]: I0314 09:36:33.855613 4886 scope.go:117] "RemoveContainer" containerID="d6b8e09136ed40fc1cccd90cba5bc9b5ed8959d18fe5bdb5d00a31f28dead24a" Mar 14 09:36:33 crc kubenswrapper[4886]: I0314 09:36:33.855622 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wzb8k" Mar 14 09:36:33 crc kubenswrapper[4886]: I0314 09:36:33.890544 4886 scope.go:117] "RemoveContainer" containerID="f0192dccdf5f8686f81892b9515bd77f48f30a8581e397899e171976ed8ab082" Mar 14 09:36:33 crc kubenswrapper[4886]: I0314 09:36:33.899627 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wzb8k"] Mar 14 09:36:33 crc kubenswrapper[4886]: I0314 09:36:33.917572 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wzb8k"] Mar 14 09:36:33 crc kubenswrapper[4886]: I0314 09:36:33.940725 4886 scope.go:117] "RemoveContainer" containerID="121f3cf1b020699bac0ce0746f10b63a6a482c35d9c0fed75ec4cc1eb4098619" Mar 14 09:36:33 crc kubenswrapper[4886]: I0314 09:36:33.994915 4886 scope.go:117] "RemoveContainer" containerID="d6b8e09136ed40fc1cccd90cba5bc9b5ed8959d18fe5bdb5d00a31f28dead24a" Mar 14 09:36:33 crc kubenswrapper[4886]: E0314 09:36:33.995455 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6b8e09136ed40fc1cccd90cba5bc9b5ed8959d18fe5bdb5d00a31f28dead24a\": container with ID starting with d6b8e09136ed40fc1cccd90cba5bc9b5ed8959d18fe5bdb5d00a31f28dead24a not found: ID does not exist" containerID="d6b8e09136ed40fc1cccd90cba5bc9b5ed8959d18fe5bdb5d00a31f28dead24a" Mar 14 09:36:33 crc kubenswrapper[4886]: I0314 09:36:33.995486 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6b8e09136ed40fc1cccd90cba5bc9b5ed8959d18fe5bdb5d00a31f28dead24a"} err="failed to get container status \"d6b8e09136ed40fc1cccd90cba5bc9b5ed8959d18fe5bdb5d00a31f28dead24a\": rpc error: code = NotFound desc = could not find container \"d6b8e09136ed40fc1cccd90cba5bc9b5ed8959d18fe5bdb5d00a31f28dead24a\": container with ID starting with d6b8e09136ed40fc1cccd90cba5bc9b5ed8959d18fe5bdb5d00a31f28dead24a not 
found: ID does not exist" Mar 14 09:36:33 crc kubenswrapper[4886]: I0314 09:36:33.995507 4886 scope.go:117] "RemoveContainer" containerID="f0192dccdf5f8686f81892b9515bd77f48f30a8581e397899e171976ed8ab082" Mar 14 09:36:33 crc kubenswrapper[4886]: E0314 09:36:33.997551 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0192dccdf5f8686f81892b9515bd77f48f30a8581e397899e171976ed8ab082\": container with ID starting with f0192dccdf5f8686f81892b9515bd77f48f30a8581e397899e171976ed8ab082 not found: ID does not exist" containerID="f0192dccdf5f8686f81892b9515bd77f48f30a8581e397899e171976ed8ab082" Mar 14 09:36:33 crc kubenswrapper[4886]: I0314 09:36:33.997579 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0192dccdf5f8686f81892b9515bd77f48f30a8581e397899e171976ed8ab082"} err="failed to get container status \"f0192dccdf5f8686f81892b9515bd77f48f30a8581e397899e171976ed8ab082\": rpc error: code = NotFound desc = could not find container \"f0192dccdf5f8686f81892b9515bd77f48f30a8581e397899e171976ed8ab082\": container with ID starting with f0192dccdf5f8686f81892b9515bd77f48f30a8581e397899e171976ed8ab082 not found: ID does not exist" Mar 14 09:36:33 crc kubenswrapper[4886]: I0314 09:36:33.997593 4886 scope.go:117] "RemoveContainer" containerID="121f3cf1b020699bac0ce0746f10b63a6a482c35d9c0fed75ec4cc1eb4098619" Mar 14 09:36:33 crc kubenswrapper[4886]: E0314 09:36:33.997959 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"121f3cf1b020699bac0ce0746f10b63a6a482c35d9c0fed75ec4cc1eb4098619\": container with ID starting with 121f3cf1b020699bac0ce0746f10b63a6a482c35d9c0fed75ec4cc1eb4098619 not found: ID does not exist" containerID="121f3cf1b020699bac0ce0746f10b63a6a482c35d9c0fed75ec4cc1eb4098619" Mar 14 09:36:33 crc kubenswrapper[4886]: I0314 09:36:33.997987 4886 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"121f3cf1b020699bac0ce0746f10b63a6a482c35d9c0fed75ec4cc1eb4098619"} err="failed to get container status \"121f3cf1b020699bac0ce0746f10b63a6a482c35d9c0fed75ec4cc1eb4098619\": rpc error: code = NotFound desc = could not find container \"121f3cf1b020699bac0ce0746f10b63a6a482c35d9c0fed75ec4cc1eb4098619\": container with ID starting with 121f3cf1b020699bac0ce0746f10b63a6a482c35d9c0fed75ec4cc1eb4098619 not found: ID does not exist" Mar 14 09:36:35 crc kubenswrapper[4886]: I0314 09:36:35.439398 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbf780f1-0609-4fa6-ba63-1fc068ea113d" path="/var/lib/kubelet/pods/dbf780f1-0609-4fa6-ba63-1fc068ea113d/volumes" Mar 14 09:36:36 crc kubenswrapper[4886]: I0314 09:36:36.875285 4886 scope.go:117] "RemoveContainer" containerID="6e6d82e2ee3e29d006029269d60cb9f62957d66749f913cfb5daa079a558e6ac" Mar 14 09:36:56 crc kubenswrapper[4886]: I0314 09:36:56.066994 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:36:56 crc kubenswrapper[4886]: I0314 09:36:56.067509 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:36:56 crc kubenswrapper[4886]: I0314 09:36:56.067554 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 09:36:56 crc kubenswrapper[4886]: I0314 09:36:56.068464 4886 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e3dc0eaae2ec184f8b13c1cba1382bc451adfeb6aa8462bb92a2a5518a28a453"} pod="openshift-machine-config-operator/machine-config-daemon-ddctv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 09:36:56 crc kubenswrapper[4886]: I0314 09:36:56.068542 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" containerID="cri-o://e3dc0eaae2ec184f8b13c1cba1382bc451adfeb6aa8462bb92a2a5518a28a453" gracePeriod=600 Mar 14 09:36:57 crc kubenswrapper[4886]: I0314 09:36:57.085097 4886 generic.go:334] "Generic (PLEG): container finished" podID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerID="e3dc0eaae2ec184f8b13c1cba1382bc451adfeb6aa8462bb92a2a5518a28a453" exitCode=0 Mar 14 09:36:57 crc kubenswrapper[4886]: I0314 09:36:57.085168 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerDied","Data":"e3dc0eaae2ec184f8b13c1cba1382bc451adfeb6aa8462bb92a2a5518a28a453"} Mar 14 09:36:57 crc kubenswrapper[4886]: I0314 09:36:57.085666 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerStarted","Data":"5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52"} Mar 14 09:36:57 crc kubenswrapper[4886]: I0314 09:36:57.085690 4886 scope.go:117] "RemoveContainer" containerID="8a62266423d89c6f0106f0eb8cb7c347d45c34258be491476b1ef8492fa599ee" Mar 14 09:37:40 crc kubenswrapper[4886]: I0314 09:37:40.311535 4886 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-dmbqb"] Mar 14 09:37:40 crc kubenswrapper[4886]: E0314 09:37:40.312907 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf780f1-0609-4fa6-ba63-1fc068ea113d" containerName="registry-server" Mar 14 09:37:40 crc kubenswrapper[4886]: I0314 09:37:40.312924 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf780f1-0609-4fa6-ba63-1fc068ea113d" containerName="registry-server" Mar 14 09:37:40 crc kubenswrapper[4886]: E0314 09:37:40.312944 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf780f1-0609-4fa6-ba63-1fc068ea113d" containerName="extract-content" Mar 14 09:37:40 crc kubenswrapper[4886]: I0314 09:37:40.312954 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf780f1-0609-4fa6-ba63-1fc068ea113d" containerName="extract-content" Mar 14 09:37:40 crc kubenswrapper[4886]: E0314 09:37:40.312973 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf780f1-0609-4fa6-ba63-1fc068ea113d" containerName="extract-utilities" Mar 14 09:37:40 crc kubenswrapper[4886]: I0314 09:37:40.312981 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf780f1-0609-4fa6-ba63-1fc068ea113d" containerName="extract-utilities" Mar 14 09:37:40 crc kubenswrapper[4886]: I0314 09:37:40.313314 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbf780f1-0609-4fa6-ba63-1fc068ea113d" containerName="registry-server" Mar 14 09:37:40 crc kubenswrapper[4886]: I0314 09:37:40.315237 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dmbqb" Mar 14 09:37:40 crc kubenswrapper[4886]: I0314 09:37:40.330005 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dmbqb"] Mar 14 09:37:40 crc kubenswrapper[4886]: I0314 09:37:40.385511 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf22d552-c0e3-4a20-8da7-5247add98e45-utilities\") pod \"redhat-marketplace-dmbqb\" (UID: \"bf22d552-c0e3-4a20-8da7-5247add98e45\") " pod="openshift-marketplace/redhat-marketplace-dmbqb" Mar 14 09:37:40 crc kubenswrapper[4886]: I0314 09:37:40.385626 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf22d552-c0e3-4a20-8da7-5247add98e45-catalog-content\") pod \"redhat-marketplace-dmbqb\" (UID: \"bf22d552-c0e3-4a20-8da7-5247add98e45\") " pod="openshift-marketplace/redhat-marketplace-dmbqb" Mar 14 09:37:40 crc kubenswrapper[4886]: I0314 09:37:40.385718 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzqsq\" (UniqueName: \"kubernetes.io/projected/bf22d552-c0e3-4a20-8da7-5247add98e45-kube-api-access-vzqsq\") pod \"redhat-marketplace-dmbqb\" (UID: \"bf22d552-c0e3-4a20-8da7-5247add98e45\") " pod="openshift-marketplace/redhat-marketplace-dmbqb" Mar 14 09:37:40 crc kubenswrapper[4886]: I0314 09:37:40.489286 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzqsq\" (UniqueName: \"kubernetes.io/projected/bf22d552-c0e3-4a20-8da7-5247add98e45-kube-api-access-vzqsq\") pod \"redhat-marketplace-dmbqb\" (UID: \"bf22d552-c0e3-4a20-8da7-5247add98e45\") " pod="openshift-marketplace/redhat-marketplace-dmbqb" Mar 14 09:37:40 crc kubenswrapper[4886]: I0314 09:37:40.489808 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf22d552-c0e3-4a20-8da7-5247add98e45-utilities\") pod \"redhat-marketplace-dmbqb\" (UID: \"bf22d552-c0e3-4a20-8da7-5247add98e45\") " pod="openshift-marketplace/redhat-marketplace-dmbqb" Mar 14 09:37:40 crc kubenswrapper[4886]: I0314 09:37:40.489888 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf22d552-c0e3-4a20-8da7-5247add98e45-catalog-content\") pod \"redhat-marketplace-dmbqb\" (UID: \"bf22d552-c0e3-4a20-8da7-5247add98e45\") " pod="openshift-marketplace/redhat-marketplace-dmbqb" Mar 14 09:37:40 crc kubenswrapper[4886]: I0314 09:37:40.490469 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf22d552-c0e3-4a20-8da7-5247add98e45-catalog-content\") pod \"redhat-marketplace-dmbqb\" (UID: \"bf22d552-c0e3-4a20-8da7-5247add98e45\") " pod="openshift-marketplace/redhat-marketplace-dmbqb" Mar 14 09:37:40 crc kubenswrapper[4886]: I0314 09:37:40.491650 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf22d552-c0e3-4a20-8da7-5247add98e45-utilities\") pod \"redhat-marketplace-dmbqb\" (UID: \"bf22d552-c0e3-4a20-8da7-5247add98e45\") " pod="openshift-marketplace/redhat-marketplace-dmbqb" Mar 14 09:37:40 crc kubenswrapper[4886]: I0314 09:37:40.516986 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzqsq\" (UniqueName: \"kubernetes.io/projected/bf22d552-c0e3-4a20-8da7-5247add98e45-kube-api-access-vzqsq\") pod \"redhat-marketplace-dmbqb\" (UID: \"bf22d552-c0e3-4a20-8da7-5247add98e45\") " pod="openshift-marketplace/redhat-marketplace-dmbqb" Mar 14 09:37:40 crc kubenswrapper[4886]: I0314 09:37:40.654025 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dmbqb" Mar 14 09:37:41 crc kubenswrapper[4886]: I0314 09:37:41.164944 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dmbqb"] Mar 14 09:37:41 crc kubenswrapper[4886]: I0314 09:37:41.546701 4886 generic.go:334] "Generic (PLEG): container finished" podID="bf22d552-c0e3-4a20-8da7-5247add98e45" containerID="a3a655b87030e02434c53fd3c6280add9bb878b9bac5adda7b4d7a32ba3b28f6" exitCode=0 Mar 14 09:37:41 crc kubenswrapper[4886]: I0314 09:37:41.546821 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmbqb" event={"ID":"bf22d552-c0e3-4a20-8da7-5247add98e45","Type":"ContainerDied","Data":"a3a655b87030e02434c53fd3c6280add9bb878b9bac5adda7b4d7a32ba3b28f6"} Mar 14 09:37:41 crc kubenswrapper[4886]: I0314 09:37:41.547028 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmbqb" event={"ID":"bf22d552-c0e3-4a20-8da7-5247add98e45","Type":"ContainerStarted","Data":"720fd4ea4cc83e8a6fcffb58af7e9dbb24ad2dd981f4a68bfd5ee29d4225fc2b"} Mar 14 09:37:42 crc kubenswrapper[4886]: I0314 09:37:42.558373 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmbqb" event={"ID":"bf22d552-c0e3-4a20-8da7-5247add98e45","Type":"ContainerStarted","Data":"8cb98c402587271568303256f4376bc6e2e60584ab82b4e42c3a54f5f3c35fea"} Mar 14 09:37:43 crc kubenswrapper[4886]: I0314 09:37:43.576169 4886 generic.go:334] "Generic (PLEG): container finished" podID="bf22d552-c0e3-4a20-8da7-5247add98e45" containerID="8cb98c402587271568303256f4376bc6e2e60584ab82b4e42c3a54f5f3c35fea" exitCode=0 Mar 14 09:37:43 crc kubenswrapper[4886]: I0314 09:37:43.576253 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmbqb" 
event={"ID":"bf22d552-c0e3-4a20-8da7-5247add98e45","Type":"ContainerDied","Data":"8cb98c402587271568303256f4376bc6e2e60584ab82b4e42c3a54f5f3c35fea"} Mar 14 09:37:44 crc kubenswrapper[4886]: I0314 09:37:44.587763 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmbqb" event={"ID":"bf22d552-c0e3-4a20-8da7-5247add98e45","Type":"ContainerStarted","Data":"83151493899ef8249ca65e1ce2bb54a905632c3b76c9661451e5b39ae33ce8c2"} Mar 14 09:37:44 crc kubenswrapper[4886]: I0314 09:37:44.609266 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dmbqb" podStartSLOduration=2.164089988 podStartE2EDuration="4.60924479s" podCreationTimestamp="2026-03-14 09:37:40 +0000 UTC" firstStartedPulling="2026-03-14 09:37:41.548396947 +0000 UTC m=+4196.796848584" lastFinishedPulling="2026-03-14 09:37:43.993551739 +0000 UTC m=+4199.242003386" observedRunningTime="2026-03-14 09:37:44.604999579 +0000 UTC m=+4199.853451216" watchObservedRunningTime="2026-03-14 09:37:44.60924479 +0000 UTC m=+4199.857696427" Mar 14 09:37:50 crc kubenswrapper[4886]: I0314 09:37:50.655157 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dmbqb" Mar 14 09:37:50 crc kubenswrapper[4886]: I0314 09:37:50.655723 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dmbqb" Mar 14 09:37:50 crc kubenswrapper[4886]: I0314 09:37:50.700635 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dmbqb" Mar 14 09:37:51 crc kubenswrapper[4886]: I0314 09:37:51.698471 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dmbqb" Mar 14 09:37:51 crc kubenswrapper[4886]: I0314 09:37:51.757955 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-dmbqb"] Mar 14 09:37:53 crc kubenswrapper[4886]: I0314 09:37:53.665294 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dmbqb" podUID="bf22d552-c0e3-4a20-8da7-5247add98e45" containerName="registry-server" containerID="cri-o://83151493899ef8249ca65e1ce2bb54a905632c3b76c9661451e5b39ae33ce8c2" gracePeriod=2 Mar 14 09:37:54 crc kubenswrapper[4886]: I0314 09:37:54.260914 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dmbqb" Mar 14 09:37:54 crc kubenswrapper[4886]: I0314 09:37:54.345292 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzqsq\" (UniqueName: \"kubernetes.io/projected/bf22d552-c0e3-4a20-8da7-5247add98e45-kube-api-access-vzqsq\") pod \"bf22d552-c0e3-4a20-8da7-5247add98e45\" (UID: \"bf22d552-c0e3-4a20-8da7-5247add98e45\") " Mar 14 09:37:54 crc kubenswrapper[4886]: I0314 09:37:54.345775 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf22d552-c0e3-4a20-8da7-5247add98e45-catalog-content\") pod \"bf22d552-c0e3-4a20-8da7-5247add98e45\" (UID: \"bf22d552-c0e3-4a20-8da7-5247add98e45\") " Mar 14 09:37:54 crc kubenswrapper[4886]: I0314 09:37:54.345943 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf22d552-c0e3-4a20-8da7-5247add98e45-utilities\") pod \"bf22d552-c0e3-4a20-8da7-5247add98e45\" (UID: \"bf22d552-c0e3-4a20-8da7-5247add98e45\") " Mar 14 09:37:54 crc kubenswrapper[4886]: I0314 09:37:54.348497 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf22d552-c0e3-4a20-8da7-5247add98e45-utilities" (OuterVolumeSpecName: "utilities") pod "bf22d552-c0e3-4a20-8da7-5247add98e45" (UID: 
"bf22d552-c0e3-4a20-8da7-5247add98e45"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:37:54 crc kubenswrapper[4886]: I0314 09:37:54.354447 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf22d552-c0e3-4a20-8da7-5247add98e45-kube-api-access-vzqsq" (OuterVolumeSpecName: "kube-api-access-vzqsq") pod "bf22d552-c0e3-4a20-8da7-5247add98e45" (UID: "bf22d552-c0e3-4a20-8da7-5247add98e45"). InnerVolumeSpecName "kube-api-access-vzqsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:37:54 crc kubenswrapper[4886]: I0314 09:37:54.374082 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf22d552-c0e3-4a20-8da7-5247add98e45-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf22d552-c0e3-4a20-8da7-5247add98e45" (UID: "bf22d552-c0e3-4a20-8da7-5247add98e45"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:37:54 crc kubenswrapper[4886]: I0314 09:37:54.449585 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzqsq\" (UniqueName: \"kubernetes.io/projected/bf22d552-c0e3-4a20-8da7-5247add98e45-kube-api-access-vzqsq\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:54 crc kubenswrapper[4886]: I0314 09:37:54.449628 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf22d552-c0e3-4a20-8da7-5247add98e45-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:54 crc kubenswrapper[4886]: I0314 09:37:54.449641 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf22d552-c0e3-4a20-8da7-5247add98e45-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:37:54 crc kubenswrapper[4886]: I0314 09:37:54.677140 4886 generic.go:334] "Generic (PLEG): container finished" 
podID="bf22d552-c0e3-4a20-8da7-5247add98e45" containerID="83151493899ef8249ca65e1ce2bb54a905632c3b76c9661451e5b39ae33ce8c2" exitCode=0 Mar 14 09:37:54 crc kubenswrapper[4886]: I0314 09:37:54.677195 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmbqb" event={"ID":"bf22d552-c0e3-4a20-8da7-5247add98e45","Type":"ContainerDied","Data":"83151493899ef8249ca65e1ce2bb54a905632c3b76c9661451e5b39ae33ce8c2"} Mar 14 09:37:54 crc kubenswrapper[4886]: I0314 09:37:54.677198 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dmbqb" Mar 14 09:37:54 crc kubenswrapper[4886]: I0314 09:37:54.677229 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmbqb" event={"ID":"bf22d552-c0e3-4a20-8da7-5247add98e45","Type":"ContainerDied","Data":"720fd4ea4cc83e8a6fcffb58af7e9dbb24ad2dd981f4a68bfd5ee29d4225fc2b"} Mar 14 09:37:54 crc kubenswrapper[4886]: I0314 09:37:54.677257 4886 scope.go:117] "RemoveContainer" containerID="83151493899ef8249ca65e1ce2bb54a905632c3b76c9661451e5b39ae33ce8c2" Mar 14 09:37:54 crc kubenswrapper[4886]: I0314 09:37:54.705565 4886 scope.go:117] "RemoveContainer" containerID="8cb98c402587271568303256f4376bc6e2e60584ab82b4e42c3a54f5f3c35fea" Mar 14 09:37:54 crc kubenswrapper[4886]: I0314 09:37:54.711290 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dmbqb"] Mar 14 09:37:54 crc kubenswrapper[4886]: I0314 09:37:54.724696 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dmbqb"] Mar 14 09:37:54 crc kubenswrapper[4886]: I0314 09:37:54.742343 4886 scope.go:117] "RemoveContainer" containerID="a3a655b87030e02434c53fd3c6280add9bb878b9bac5adda7b4d7a32ba3b28f6" Mar 14 09:37:54 crc kubenswrapper[4886]: I0314 09:37:54.787465 4886 scope.go:117] "RemoveContainer" 
containerID="83151493899ef8249ca65e1ce2bb54a905632c3b76c9661451e5b39ae33ce8c2" Mar 14 09:37:54 crc kubenswrapper[4886]: E0314 09:37:54.787947 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83151493899ef8249ca65e1ce2bb54a905632c3b76c9661451e5b39ae33ce8c2\": container with ID starting with 83151493899ef8249ca65e1ce2bb54a905632c3b76c9661451e5b39ae33ce8c2 not found: ID does not exist" containerID="83151493899ef8249ca65e1ce2bb54a905632c3b76c9661451e5b39ae33ce8c2" Mar 14 09:37:54 crc kubenswrapper[4886]: I0314 09:37:54.787981 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83151493899ef8249ca65e1ce2bb54a905632c3b76c9661451e5b39ae33ce8c2"} err="failed to get container status \"83151493899ef8249ca65e1ce2bb54a905632c3b76c9661451e5b39ae33ce8c2\": rpc error: code = NotFound desc = could not find container \"83151493899ef8249ca65e1ce2bb54a905632c3b76c9661451e5b39ae33ce8c2\": container with ID starting with 83151493899ef8249ca65e1ce2bb54a905632c3b76c9661451e5b39ae33ce8c2 not found: ID does not exist" Mar 14 09:37:54 crc kubenswrapper[4886]: I0314 09:37:54.788000 4886 scope.go:117] "RemoveContainer" containerID="8cb98c402587271568303256f4376bc6e2e60584ab82b4e42c3a54f5f3c35fea" Mar 14 09:37:54 crc kubenswrapper[4886]: E0314 09:37:54.788402 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cb98c402587271568303256f4376bc6e2e60584ab82b4e42c3a54f5f3c35fea\": container with ID starting with 8cb98c402587271568303256f4376bc6e2e60584ab82b4e42c3a54f5f3c35fea not found: ID does not exist" containerID="8cb98c402587271568303256f4376bc6e2e60584ab82b4e42c3a54f5f3c35fea" Mar 14 09:37:54 crc kubenswrapper[4886]: I0314 09:37:54.788523 4886 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8cb98c402587271568303256f4376bc6e2e60584ab82b4e42c3a54f5f3c35fea"} err="failed to get container status \"8cb98c402587271568303256f4376bc6e2e60584ab82b4e42c3a54f5f3c35fea\": rpc error: code = NotFound desc = could not find container \"8cb98c402587271568303256f4376bc6e2e60584ab82b4e42c3a54f5f3c35fea\": container with ID starting with 8cb98c402587271568303256f4376bc6e2e60584ab82b4e42c3a54f5f3c35fea not found: ID does not exist" Mar 14 09:37:54 crc kubenswrapper[4886]: I0314 09:37:54.788625 4886 scope.go:117] "RemoveContainer" containerID="a3a655b87030e02434c53fd3c6280add9bb878b9bac5adda7b4d7a32ba3b28f6" Mar 14 09:37:54 crc kubenswrapper[4886]: E0314 09:37:54.788990 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3a655b87030e02434c53fd3c6280add9bb878b9bac5adda7b4d7a32ba3b28f6\": container with ID starting with a3a655b87030e02434c53fd3c6280add9bb878b9bac5adda7b4d7a32ba3b28f6 not found: ID does not exist" containerID="a3a655b87030e02434c53fd3c6280add9bb878b9bac5adda7b4d7a32ba3b28f6" Mar 14 09:37:54 crc kubenswrapper[4886]: I0314 09:37:54.789019 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3a655b87030e02434c53fd3c6280add9bb878b9bac5adda7b4d7a32ba3b28f6"} err="failed to get container status \"a3a655b87030e02434c53fd3c6280add9bb878b9bac5adda7b4d7a32ba3b28f6\": rpc error: code = NotFound desc = could not find container \"a3a655b87030e02434c53fd3c6280add9bb878b9bac5adda7b4d7a32ba3b28f6\": container with ID starting with a3a655b87030e02434c53fd3c6280add9bb878b9bac5adda7b4d7a32ba3b28f6 not found: ID does not exist" Mar 14 09:37:55 crc kubenswrapper[4886]: I0314 09:37:55.441253 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf22d552-c0e3-4a20-8da7-5247add98e45" path="/var/lib/kubelet/pods/bf22d552-c0e3-4a20-8da7-5247add98e45/volumes" Mar 14 09:38:00 crc kubenswrapper[4886]: I0314 
09:38:00.156251 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558018-lnv6t"] Mar 14 09:38:00 crc kubenswrapper[4886]: E0314 09:38:00.157183 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf22d552-c0e3-4a20-8da7-5247add98e45" containerName="extract-utilities" Mar 14 09:38:00 crc kubenswrapper[4886]: I0314 09:38:00.157198 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf22d552-c0e3-4a20-8da7-5247add98e45" containerName="extract-utilities" Mar 14 09:38:00 crc kubenswrapper[4886]: E0314 09:38:00.157212 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf22d552-c0e3-4a20-8da7-5247add98e45" containerName="extract-content" Mar 14 09:38:00 crc kubenswrapper[4886]: I0314 09:38:00.157218 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf22d552-c0e3-4a20-8da7-5247add98e45" containerName="extract-content" Mar 14 09:38:00 crc kubenswrapper[4886]: E0314 09:38:00.157235 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf22d552-c0e3-4a20-8da7-5247add98e45" containerName="registry-server" Mar 14 09:38:00 crc kubenswrapper[4886]: I0314 09:38:00.157241 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf22d552-c0e3-4a20-8da7-5247add98e45" containerName="registry-server" Mar 14 09:38:00 crc kubenswrapper[4886]: I0314 09:38:00.157422 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf22d552-c0e3-4a20-8da7-5247add98e45" containerName="registry-server" Mar 14 09:38:00 crc kubenswrapper[4886]: I0314 09:38:00.158085 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558018-lnv6t" Mar 14 09:38:00 crc kubenswrapper[4886]: I0314 09:38:00.160620 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:38:00 crc kubenswrapper[4886]: I0314 09:38:00.161745 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 09:38:00 crc kubenswrapper[4886]: I0314 09:38:00.163846 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:38:00 crc kubenswrapper[4886]: I0314 09:38:00.167654 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558018-lnv6t"] Mar 14 09:38:00 crc kubenswrapper[4886]: I0314 09:38:00.272636 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zc4x\" (UniqueName: \"kubernetes.io/projected/bd596c5c-1c7d-4937-8597-1b17084831b1-kube-api-access-7zc4x\") pod \"auto-csr-approver-29558018-lnv6t\" (UID: \"bd596c5c-1c7d-4937-8597-1b17084831b1\") " pod="openshift-infra/auto-csr-approver-29558018-lnv6t" Mar 14 09:38:00 crc kubenswrapper[4886]: I0314 09:38:00.374292 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zc4x\" (UniqueName: \"kubernetes.io/projected/bd596c5c-1c7d-4937-8597-1b17084831b1-kube-api-access-7zc4x\") pod \"auto-csr-approver-29558018-lnv6t\" (UID: \"bd596c5c-1c7d-4937-8597-1b17084831b1\") " pod="openshift-infra/auto-csr-approver-29558018-lnv6t" Mar 14 09:38:00 crc kubenswrapper[4886]: I0314 09:38:00.394781 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zc4x\" (UniqueName: \"kubernetes.io/projected/bd596c5c-1c7d-4937-8597-1b17084831b1-kube-api-access-7zc4x\") pod \"auto-csr-approver-29558018-lnv6t\" (UID: \"bd596c5c-1c7d-4937-8597-1b17084831b1\") " 
pod="openshift-infra/auto-csr-approver-29558018-lnv6t" Mar 14 09:38:00 crc kubenswrapper[4886]: I0314 09:38:00.510662 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558018-lnv6t" Mar 14 09:38:00 crc kubenswrapper[4886]: I0314 09:38:00.984147 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558018-lnv6t"] Mar 14 09:38:01 crc kubenswrapper[4886]: I0314 09:38:01.745961 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558018-lnv6t" event={"ID":"bd596c5c-1c7d-4937-8597-1b17084831b1","Type":"ContainerStarted","Data":"3675b5cd34745f509a2cc62e3587cc2664f5f988e753f7694db61b4b6e4c2f59"} Mar 14 09:38:02 crc kubenswrapper[4886]: I0314 09:38:02.764584 4886 generic.go:334] "Generic (PLEG): container finished" podID="bd596c5c-1c7d-4937-8597-1b17084831b1" containerID="24d3977b769b29ed5d0411dbac3ea84d790cbdded345cafabda6a2e3c47eb24c" exitCode=0 Mar 14 09:38:02 crc kubenswrapper[4886]: I0314 09:38:02.764941 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558018-lnv6t" event={"ID":"bd596c5c-1c7d-4937-8597-1b17084831b1","Type":"ContainerDied","Data":"24d3977b769b29ed5d0411dbac3ea84d790cbdded345cafabda6a2e3c47eb24c"} Mar 14 09:38:04 crc kubenswrapper[4886]: I0314 09:38:04.248726 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558018-lnv6t" Mar 14 09:38:04 crc kubenswrapper[4886]: I0314 09:38:04.364883 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zc4x\" (UniqueName: \"kubernetes.io/projected/bd596c5c-1c7d-4937-8597-1b17084831b1-kube-api-access-7zc4x\") pod \"bd596c5c-1c7d-4937-8597-1b17084831b1\" (UID: \"bd596c5c-1c7d-4937-8597-1b17084831b1\") " Mar 14 09:38:04 crc kubenswrapper[4886]: I0314 09:38:04.371778 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd596c5c-1c7d-4937-8597-1b17084831b1-kube-api-access-7zc4x" (OuterVolumeSpecName: "kube-api-access-7zc4x") pod "bd596c5c-1c7d-4937-8597-1b17084831b1" (UID: "bd596c5c-1c7d-4937-8597-1b17084831b1"). InnerVolumeSpecName "kube-api-access-7zc4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:38:04 crc kubenswrapper[4886]: I0314 09:38:04.467773 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zc4x\" (UniqueName: \"kubernetes.io/projected/bd596c5c-1c7d-4937-8597-1b17084831b1-kube-api-access-7zc4x\") on node \"crc\" DevicePath \"\"" Mar 14 09:38:04 crc kubenswrapper[4886]: I0314 09:38:04.789515 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558018-lnv6t" event={"ID":"bd596c5c-1c7d-4937-8597-1b17084831b1","Type":"ContainerDied","Data":"3675b5cd34745f509a2cc62e3587cc2664f5f988e753f7694db61b4b6e4c2f59"} Mar 14 09:38:04 crc kubenswrapper[4886]: I0314 09:38:04.789556 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3675b5cd34745f509a2cc62e3587cc2664f5f988e753f7694db61b4b6e4c2f59" Mar 14 09:38:04 crc kubenswrapper[4886]: I0314 09:38:04.789595 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558018-lnv6t" Mar 14 09:38:05 crc kubenswrapper[4886]: I0314 09:38:05.345703 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558012-wmtwb"] Mar 14 09:38:05 crc kubenswrapper[4886]: I0314 09:38:05.354077 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558012-wmtwb"] Mar 14 09:38:05 crc kubenswrapper[4886]: I0314 09:38:05.433717 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4dcb604-bc45-46f3-8ee2-756f85d73578" path="/var/lib/kubelet/pods/a4dcb604-bc45-46f3-8ee2-756f85d73578/volumes" Mar 14 09:38:37 crc kubenswrapper[4886]: I0314 09:38:37.013452 4886 scope.go:117] "RemoveContainer" containerID="f710aa97aa39d3693b0a54f9550a6705ac43b4e4b5494d27c8627f81ffb95267" Mar 14 09:38:56 crc kubenswrapper[4886]: I0314 09:38:56.066688 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:38:56 crc kubenswrapper[4886]: I0314 09:38:56.067272 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:39:26 crc kubenswrapper[4886]: I0314 09:39:26.066738 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:39:26 crc kubenswrapper[4886]: 
I0314 09:39:26.067649 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:39:56 crc kubenswrapper[4886]: I0314 09:39:56.066705 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:39:56 crc kubenswrapper[4886]: I0314 09:39:56.067544 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:39:56 crc kubenswrapper[4886]: I0314 09:39:56.067600 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 09:39:56 crc kubenswrapper[4886]: I0314 09:39:56.068543 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52"} pod="openshift-machine-config-operator/machine-config-daemon-ddctv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 09:39:56 crc kubenswrapper[4886]: I0314 09:39:56.068610 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" 
containerName="machine-config-daemon" containerID="cri-o://5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52" gracePeriod=600 Mar 14 09:39:56 crc kubenswrapper[4886]: E0314 09:39:56.696309 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:39:57 crc kubenswrapper[4886]: I0314 09:39:57.095661 4886 generic.go:334] "Generic (PLEG): container finished" podID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerID="5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52" exitCode=0 Mar 14 09:39:57 crc kubenswrapper[4886]: I0314 09:39:57.095738 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerDied","Data":"5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52"} Mar 14 09:39:57 crc kubenswrapper[4886]: I0314 09:39:57.095998 4886 scope.go:117] "RemoveContainer" containerID="e3dc0eaae2ec184f8b13c1cba1382bc451adfeb6aa8462bb92a2a5518a28a453" Mar 14 09:39:57 crc kubenswrapper[4886]: I0314 09:39:57.097387 4886 scope.go:117] "RemoveContainer" containerID="5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52" Mar 14 09:39:57 crc kubenswrapper[4886]: E0314 09:39:57.098160 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:40:00 crc kubenswrapper[4886]: I0314 09:40:00.180656 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558020-ddgfg"] Mar 14 09:40:00 crc kubenswrapper[4886]: E0314 09:40:00.181989 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd596c5c-1c7d-4937-8597-1b17084831b1" containerName="oc" Mar 14 09:40:00 crc kubenswrapper[4886]: I0314 09:40:00.182435 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd596c5c-1c7d-4937-8597-1b17084831b1" containerName="oc" Mar 14 09:40:00 crc kubenswrapper[4886]: I0314 09:40:00.183034 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd596c5c-1c7d-4937-8597-1b17084831b1" containerName="oc" Mar 14 09:40:00 crc kubenswrapper[4886]: I0314 09:40:00.184727 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558020-ddgfg" Mar 14 09:40:00 crc kubenswrapper[4886]: I0314 09:40:00.187861 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 09:40:00 crc kubenswrapper[4886]: I0314 09:40:00.188052 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:40:00 crc kubenswrapper[4886]: I0314 09:40:00.188434 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:40:00 crc kubenswrapper[4886]: I0314 09:40:00.193870 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558020-ddgfg"] Mar 14 09:40:00 crc kubenswrapper[4886]: I0314 09:40:00.320329 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkzbg\" (UniqueName: 
\"kubernetes.io/projected/b8ceb013-9172-4aef-a7fd-2cebdf3f7d04-kube-api-access-rkzbg\") pod \"auto-csr-approver-29558020-ddgfg\" (UID: \"b8ceb013-9172-4aef-a7fd-2cebdf3f7d04\") " pod="openshift-infra/auto-csr-approver-29558020-ddgfg" Mar 14 09:40:00 crc kubenswrapper[4886]: I0314 09:40:00.421764 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkzbg\" (UniqueName: \"kubernetes.io/projected/b8ceb013-9172-4aef-a7fd-2cebdf3f7d04-kube-api-access-rkzbg\") pod \"auto-csr-approver-29558020-ddgfg\" (UID: \"b8ceb013-9172-4aef-a7fd-2cebdf3f7d04\") " pod="openshift-infra/auto-csr-approver-29558020-ddgfg" Mar 14 09:40:00 crc kubenswrapper[4886]: I0314 09:40:00.443893 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkzbg\" (UniqueName: \"kubernetes.io/projected/b8ceb013-9172-4aef-a7fd-2cebdf3f7d04-kube-api-access-rkzbg\") pod \"auto-csr-approver-29558020-ddgfg\" (UID: \"b8ceb013-9172-4aef-a7fd-2cebdf3f7d04\") " pod="openshift-infra/auto-csr-approver-29558020-ddgfg" Mar 14 09:40:00 crc kubenswrapper[4886]: I0314 09:40:00.525078 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558020-ddgfg" Mar 14 09:40:01 crc kubenswrapper[4886]: I0314 09:40:01.025388 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558020-ddgfg"] Mar 14 09:40:01 crc kubenswrapper[4886]: I0314 09:40:01.030219 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 09:40:01 crc kubenswrapper[4886]: I0314 09:40:01.159651 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558020-ddgfg" event={"ID":"b8ceb013-9172-4aef-a7fd-2cebdf3f7d04","Type":"ContainerStarted","Data":"870a5a278f19501b104c5ab84d78d6970285f0804508b3c4d15f5b261eaaaf8e"} Mar 14 09:40:03 crc kubenswrapper[4886]: I0314 09:40:03.209261 4886 generic.go:334] "Generic (PLEG): container finished" podID="b8ceb013-9172-4aef-a7fd-2cebdf3f7d04" containerID="e7006e1c2a7612d626f174bb6e1137c96ae29c0e177b327bd0873ca31d7c9c7a" exitCode=0 Mar 14 09:40:03 crc kubenswrapper[4886]: I0314 09:40:03.209402 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558020-ddgfg" event={"ID":"b8ceb013-9172-4aef-a7fd-2cebdf3f7d04","Type":"ContainerDied","Data":"e7006e1c2a7612d626f174bb6e1137c96ae29c0e177b327bd0873ca31d7c9c7a"} Mar 14 09:40:04 crc kubenswrapper[4886]: I0314 09:40:04.677103 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558020-ddgfg" Mar 14 09:40:04 crc kubenswrapper[4886]: I0314 09:40:04.820868 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkzbg\" (UniqueName: \"kubernetes.io/projected/b8ceb013-9172-4aef-a7fd-2cebdf3f7d04-kube-api-access-rkzbg\") pod \"b8ceb013-9172-4aef-a7fd-2cebdf3f7d04\" (UID: \"b8ceb013-9172-4aef-a7fd-2cebdf3f7d04\") " Mar 14 09:40:04 crc kubenswrapper[4886]: I0314 09:40:04.833393 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8ceb013-9172-4aef-a7fd-2cebdf3f7d04-kube-api-access-rkzbg" (OuterVolumeSpecName: "kube-api-access-rkzbg") pod "b8ceb013-9172-4aef-a7fd-2cebdf3f7d04" (UID: "b8ceb013-9172-4aef-a7fd-2cebdf3f7d04"). InnerVolumeSpecName "kube-api-access-rkzbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:40:04 crc kubenswrapper[4886]: I0314 09:40:04.925030 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkzbg\" (UniqueName: \"kubernetes.io/projected/b8ceb013-9172-4aef-a7fd-2cebdf3f7d04-kube-api-access-rkzbg\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:05 crc kubenswrapper[4886]: I0314 09:40:05.229972 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558020-ddgfg" event={"ID":"b8ceb013-9172-4aef-a7fd-2cebdf3f7d04","Type":"ContainerDied","Data":"870a5a278f19501b104c5ab84d78d6970285f0804508b3c4d15f5b261eaaaf8e"} Mar 14 09:40:05 crc kubenswrapper[4886]: I0314 09:40:05.230030 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="870a5a278f19501b104c5ab84d78d6970285f0804508b3c4d15f5b261eaaaf8e" Mar 14 09:40:05 crc kubenswrapper[4886]: I0314 09:40:05.230069 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558020-ddgfg" Mar 14 09:40:05 crc kubenswrapper[4886]: I0314 09:40:05.763957 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558014-l2v7s"] Mar 14 09:40:05 crc kubenswrapper[4886]: I0314 09:40:05.773780 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558014-l2v7s"] Mar 14 09:40:07 crc kubenswrapper[4886]: I0314 09:40:07.439319 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e1418c9-f78d-4282-be56-cbc538dbb115" path="/var/lib/kubelet/pods/1e1418c9-f78d-4282-be56-cbc538dbb115/volumes" Mar 14 09:40:11 crc kubenswrapper[4886]: I0314 09:40:11.420772 4886 scope.go:117] "RemoveContainer" containerID="5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52" Mar 14 09:40:11 crc kubenswrapper[4886]: E0314 09:40:11.421805 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:40:18 crc kubenswrapper[4886]: I0314 09:40:18.719892 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qrdnv"] Mar 14 09:40:18 crc kubenswrapper[4886]: E0314 09:40:18.721043 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ceb013-9172-4aef-a7fd-2cebdf3f7d04" containerName="oc" Mar 14 09:40:18 crc kubenswrapper[4886]: I0314 09:40:18.721061 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ceb013-9172-4aef-a7fd-2cebdf3f7d04" containerName="oc" Mar 14 09:40:18 crc kubenswrapper[4886]: I0314 09:40:18.721313 4886 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="b8ceb013-9172-4aef-a7fd-2cebdf3f7d04" containerName="oc" Mar 14 09:40:18 crc kubenswrapper[4886]: I0314 09:40:18.723031 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qrdnv" Mar 14 09:40:18 crc kubenswrapper[4886]: I0314 09:40:18.755300 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qrdnv"] Mar 14 09:40:18 crc kubenswrapper[4886]: I0314 09:40:18.854462 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbfd5efb-b236-41e5-b7f9-727bce87c3d7-utilities\") pod \"redhat-operators-qrdnv\" (UID: \"fbfd5efb-b236-41e5-b7f9-727bce87c3d7\") " pod="openshift-marketplace/redhat-operators-qrdnv" Mar 14 09:40:18 crc kubenswrapper[4886]: I0314 09:40:18.854569 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbfd5efb-b236-41e5-b7f9-727bce87c3d7-catalog-content\") pod \"redhat-operators-qrdnv\" (UID: \"fbfd5efb-b236-41e5-b7f9-727bce87c3d7\") " pod="openshift-marketplace/redhat-operators-qrdnv" Mar 14 09:40:18 crc kubenswrapper[4886]: I0314 09:40:18.854783 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6t22\" (UniqueName: \"kubernetes.io/projected/fbfd5efb-b236-41e5-b7f9-727bce87c3d7-kube-api-access-m6t22\") pod \"redhat-operators-qrdnv\" (UID: \"fbfd5efb-b236-41e5-b7f9-727bce87c3d7\") " pod="openshift-marketplace/redhat-operators-qrdnv" Mar 14 09:40:18 crc kubenswrapper[4886]: I0314 09:40:18.956626 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6t22\" (UniqueName: \"kubernetes.io/projected/fbfd5efb-b236-41e5-b7f9-727bce87c3d7-kube-api-access-m6t22\") pod \"redhat-operators-qrdnv\" (UID: 
\"fbfd5efb-b236-41e5-b7f9-727bce87c3d7\") " pod="openshift-marketplace/redhat-operators-qrdnv" Mar 14 09:40:18 crc kubenswrapper[4886]: I0314 09:40:18.956748 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbfd5efb-b236-41e5-b7f9-727bce87c3d7-utilities\") pod \"redhat-operators-qrdnv\" (UID: \"fbfd5efb-b236-41e5-b7f9-727bce87c3d7\") " pod="openshift-marketplace/redhat-operators-qrdnv" Mar 14 09:40:18 crc kubenswrapper[4886]: I0314 09:40:18.956785 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbfd5efb-b236-41e5-b7f9-727bce87c3d7-catalog-content\") pod \"redhat-operators-qrdnv\" (UID: \"fbfd5efb-b236-41e5-b7f9-727bce87c3d7\") " pod="openshift-marketplace/redhat-operators-qrdnv" Mar 14 09:40:18 crc kubenswrapper[4886]: I0314 09:40:18.957309 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbfd5efb-b236-41e5-b7f9-727bce87c3d7-utilities\") pod \"redhat-operators-qrdnv\" (UID: \"fbfd5efb-b236-41e5-b7f9-727bce87c3d7\") " pod="openshift-marketplace/redhat-operators-qrdnv" Mar 14 09:40:18 crc kubenswrapper[4886]: I0314 09:40:18.957329 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbfd5efb-b236-41e5-b7f9-727bce87c3d7-catalog-content\") pod \"redhat-operators-qrdnv\" (UID: \"fbfd5efb-b236-41e5-b7f9-727bce87c3d7\") " pod="openshift-marketplace/redhat-operators-qrdnv" Mar 14 09:40:18 crc kubenswrapper[4886]: I0314 09:40:18.974888 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6t22\" (UniqueName: \"kubernetes.io/projected/fbfd5efb-b236-41e5-b7f9-727bce87c3d7-kube-api-access-m6t22\") pod \"redhat-operators-qrdnv\" (UID: \"fbfd5efb-b236-41e5-b7f9-727bce87c3d7\") " 
pod="openshift-marketplace/redhat-operators-qrdnv" Mar 14 09:40:19 crc kubenswrapper[4886]: I0314 09:40:19.053205 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qrdnv" Mar 14 09:40:19 crc kubenswrapper[4886]: I0314 09:40:19.533963 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qrdnv"] Mar 14 09:40:20 crc kubenswrapper[4886]: I0314 09:40:20.391835 4886 generic.go:334] "Generic (PLEG): container finished" podID="fbfd5efb-b236-41e5-b7f9-727bce87c3d7" containerID="2fe0cb0ffe92376226ac38433038872a5055b8f04fb271e2e91da8391fb1f32c" exitCode=0 Mar 14 09:40:20 crc kubenswrapper[4886]: I0314 09:40:20.391899 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrdnv" event={"ID":"fbfd5efb-b236-41e5-b7f9-727bce87c3d7","Type":"ContainerDied","Data":"2fe0cb0ffe92376226ac38433038872a5055b8f04fb271e2e91da8391fb1f32c"} Mar 14 09:40:20 crc kubenswrapper[4886]: I0314 09:40:20.392442 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrdnv" event={"ID":"fbfd5efb-b236-41e5-b7f9-727bce87c3d7","Type":"ContainerStarted","Data":"ed3e1ddbcc17c9a303e416b308ebb3f24fe5e72d34b94c716334d9020bbca641"} Mar 14 09:40:21 crc kubenswrapper[4886]: I0314 09:40:21.405293 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrdnv" event={"ID":"fbfd5efb-b236-41e5-b7f9-727bce87c3d7","Type":"ContainerStarted","Data":"ce4665964e9600c389442e8d618f0ead72cc171f435c6f0045fd401ace29d492"} Mar 14 09:40:24 crc kubenswrapper[4886]: I0314 09:40:24.434819 4886 generic.go:334] "Generic (PLEG): container finished" podID="fbfd5efb-b236-41e5-b7f9-727bce87c3d7" containerID="ce4665964e9600c389442e8d618f0ead72cc171f435c6f0045fd401ace29d492" exitCode=0 Mar 14 09:40:24 crc kubenswrapper[4886]: I0314 09:40:24.434888 4886 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-qrdnv" event={"ID":"fbfd5efb-b236-41e5-b7f9-727bce87c3d7","Type":"ContainerDied","Data":"ce4665964e9600c389442e8d618f0ead72cc171f435c6f0045fd401ace29d492"} Mar 14 09:40:25 crc kubenswrapper[4886]: I0314 09:40:25.446714 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrdnv" event={"ID":"fbfd5efb-b236-41e5-b7f9-727bce87c3d7","Type":"ContainerStarted","Data":"81f42c342e38118fbc4b1446d52222159feb36043de676cb900f62be9ae65823"} Mar 14 09:40:25 crc kubenswrapper[4886]: I0314 09:40:25.479820 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qrdnv" podStartSLOduration=2.954397561 podStartE2EDuration="7.479796114s" podCreationTimestamp="2026-03-14 09:40:18 +0000 UTC" firstStartedPulling="2026-03-14 09:40:20.393892642 +0000 UTC m=+4355.642344279" lastFinishedPulling="2026-03-14 09:40:24.919291185 +0000 UTC m=+4360.167742832" observedRunningTime="2026-03-14 09:40:25.469309435 +0000 UTC m=+4360.717761072" watchObservedRunningTime="2026-03-14 09:40:25.479796114 +0000 UTC m=+4360.728247751" Mar 14 09:40:26 crc kubenswrapper[4886]: I0314 09:40:26.421165 4886 scope.go:117] "RemoveContainer" containerID="5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52" Mar 14 09:40:26 crc kubenswrapper[4886]: E0314 09:40:26.421814 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:40:29 crc kubenswrapper[4886]: I0314 09:40:29.053778 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-qrdnv" Mar 14 09:40:29 crc kubenswrapper[4886]: I0314 09:40:29.054474 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qrdnv" Mar 14 09:40:30 crc kubenswrapper[4886]: I0314 09:40:30.114557 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qrdnv" podUID="fbfd5efb-b236-41e5-b7f9-727bce87c3d7" containerName="registry-server" probeResult="failure" output=< Mar 14 09:40:30 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s Mar 14 09:40:30 crc kubenswrapper[4886]: > Mar 14 09:40:37 crc kubenswrapper[4886]: I0314 09:40:37.146990 4886 scope.go:117] "RemoveContainer" containerID="d317db713a12017b52600eff37d971f6da1e2753be214230cb8580d1826634a8" Mar 14 09:40:38 crc kubenswrapper[4886]: I0314 09:40:38.420640 4886 scope.go:117] "RemoveContainer" containerID="5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52" Mar 14 09:40:38 crc kubenswrapper[4886]: E0314 09:40:38.420912 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:40:39 crc kubenswrapper[4886]: I0314 09:40:39.104528 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qrdnv" Mar 14 09:40:39 crc kubenswrapper[4886]: I0314 09:40:39.171224 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qrdnv" Mar 14 09:40:39 crc kubenswrapper[4886]: I0314 09:40:39.342558 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-qrdnv"] Mar 14 09:40:40 crc kubenswrapper[4886]: I0314 09:40:40.617805 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qrdnv" podUID="fbfd5efb-b236-41e5-b7f9-727bce87c3d7" containerName="registry-server" containerID="cri-o://81f42c342e38118fbc4b1446d52222159feb36043de676cb900f62be9ae65823" gracePeriod=2 Mar 14 09:40:41 crc kubenswrapper[4886]: I0314 09:40:41.094146 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qrdnv" Mar 14 09:40:41 crc kubenswrapper[4886]: I0314 09:40:41.124649 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbfd5efb-b236-41e5-b7f9-727bce87c3d7-catalog-content\") pod \"fbfd5efb-b236-41e5-b7f9-727bce87c3d7\" (UID: \"fbfd5efb-b236-41e5-b7f9-727bce87c3d7\") " Mar 14 09:40:41 crc kubenswrapper[4886]: I0314 09:40:41.124986 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6t22\" (UniqueName: \"kubernetes.io/projected/fbfd5efb-b236-41e5-b7f9-727bce87c3d7-kube-api-access-m6t22\") pod \"fbfd5efb-b236-41e5-b7f9-727bce87c3d7\" (UID: \"fbfd5efb-b236-41e5-b7f9-727bce87c3d7\") " Mar 14 09:40:41 crc kubenswrapper[4886]: I0314 09:40:41.125060 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbfd5efb-b236-41e5-b7f9-727bce87c3d7-utilities\") pod \"fbfd5efb-b236-41e5-b7f9-727bce87c3d7\" (UID: \"fbfd5efb-b236-41e5-b7f9-727bce87c3d7\") " Mar 14 09:40:41 crc kubenswrapper[4886]: I0314 09:40:41.126477 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbfd5efb-b236-41e5-b7f9-727bce87c3d7-utilities" (OuterVolumeSpecName: "utilities") pod "fbfd5efb-b236-41e5-b7f9-727bce87c3d7" (UID: 
"fbfd5efb-b236-41e5-b7f9-727bce87c3d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:40:41 crc kubenswrapper[4886]: I0314 09:40:41.140378 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbfd5efb-b236-41e5-b7f9-727bce87c3d7-kube-api-access-m6t22" (OuterVolumeSpecName: "kube-api-access-m6t22") pod "fbfd5efb-b236-41e5-b7f9-727bce87c3d7" (UID: "fbfd5efb-b236-41e5-b7f9-727bce87c3d7"). InnerVolumeSpecName "kube-api-access-m6t22". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:40:41 crc kubenswrapper[4886]: I0314 09:40:41.227412 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6t22\" (UniqueName: \"kubernetes.io/projected/fbfd5efb-b236-41e5-b7f9-727bce87c3d7-kube-api-access-m6t22\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:41 crc kubenswrapper[4886]: I0314 09:40:41.227832 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbfd5efb-b236-41e5-b7f9-727bce87c3d7-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:41 crc kubenswrapper[4886]: I0314 09:40:41.259532 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbfd5efb-b236-41e5-b7f9-727bce87c3d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fbfd5efb-b236-41e5-b7f9-727bce87c3d7" (UID: "fbfd5efb-b236-41e5-b7f9-727bce87c3d7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:40:41 crc kubenswrapper[4886]: I0314 09:40:41.331215 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbfd5efb-b236-41e5-b7f9-727bce87c3d7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:40:41 crc kubenswrapper[4886]: I0314 09:40:41.632611 4886 generic.go:334] "Generic (PLEG): container finished" podID="fbfd5efb-b236-41e5-b7f9-727bce87c3d7" containerID="81f42c342e38118fbc4b1446d52222159feb36043de676cb900f62be9ae65823" exitCode=0 Mar 14 09:40:41 crc kubenswrapper[4886]: I0314 09:40:41.632664 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrdnv" event={"ID":"fbfd5efb-b236-41e5-b7f9-727bce87c3d7","Type":"ContainerDied","Data":"81f42c342e38118fbc4b1446d52222159feb36043de676cb900f62be9ae65823"} Mar 14 09:40:41 crc kubenswrapper[4886]: I0314 09:40:41.632698 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrdnv" event={"ID":"fbfd5efb-b236-41e5-b7f9-727bce87c3d7","Type":"ContainerDied","Data":"ed3e1ddbcc17c9a303e416b308ebb3f24fe5e72d34b94c716334d9020bbca641"} Mar 14 09:40:41 crc kubenswrapper[4886]: I0314 09:40:41.632723 4886 scope.go:117] "RemoveContainer" containerID="81f42c342e38118fbc4b1446d52222159feb36043de676cb900f62be9ae65823" Mar 14 09:40:41 crc kubenswrapper[4886]: I0314 09:40:41.632742 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qrdnv" Mar 14 09:40:41 crc kubenswrapper[4886]: I0314 09:40:41.666453 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qrdnv"] Mar 14 09:40:41 crc kubenswrapper[4886]: I0314 09:40:41.668975 4886 scope.go:117] "RemoveContainer" containerID="ce4665964e9600c389442e8d618f0ead72cc171f435c6f0045fd401ace29d492" Mar 14 09:40:41 crc kubenswrapper[4886]: I0314 09:40:41.678274 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qrdnv"] Mar 14 09:40:41 crc kubenswrapper[4886]: I0314 09:40:41.698403 4886 scope.go:117] "RemoveContainer" containerID="2fe0cb0ffe92376226ac38433038872a5055b8f04fb271e2e91da8391fb1f32c" Mar 14 09:40:41 crc kubenswrapper[4886]: I0314 09:40:41.748065 4886 scope.go:117] "RemoveContainer" containerID="81f42c342e38118fbc4b1446d52222159feb36043de676cb900f62be9ae65823" Mar 14 09:40:41 crc kubenswrapper[4886]: E0314 09:40:41.748866 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81f42c342e38118fbc4b1446d52222159feb36043de676cb900f62be9ae65823\": container with ID starting with 81f42c342e38118fbc4b1446d52222159feb36043de676cb900f62be9ae65823 not found: ID does not exist" containerID="81f42c342e38118fbc4b1446d52222159feb36043de676cb900f62be9ae65823" Mar 14 09:40:41 crc kubenswrapper[4886]: I0314 09:40:41.748946 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81f42c342e38118fbc4b1446d52222159feb36043de676cb900f62be9ae65823"} err="failed to get container status \"81f42c342e38118fbc4b1446d52222159feb36043de676cb900f62be9ae65823\": rpc error: code = NotFound desc = could not find container \"81f42c342e38118fbc4b1446d52222159feb36043de676cb900f62be9ae65823\": container with ID starting with 81f42c342e38118fbc4b1446d52222159feb36043de676cb900f62be9ae65823 not found: ID does 
not exist" Mar 14 09:40:41 crc kubenswrapper[4886]: I0314 09:40:41.748988 4886 scope.go:117] "RemoveContainer" containerID="ce4665964e9600c389442e8d618f0ead72cc171f435c6f0045fd401ace29d492" Mar 14 09:40:41 crc kubenswrapper[4886]: E0314 09:40:41.749503 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce4665964e9600c389442e8d618f0ead72cc171f435c6f0045fd401ace29d492\": container with ID starting with ce4665964e9600c389442e8d618f0ead72cc171f435c6f0045fd401ace29d492 not found: ID does not exist" containerID="ce4665964e9600c389442e8d618f0ead72cc171f435c6f0045fd401ace29d492" Mar 14 09:40:41 crc kubenswrapper[4886]: I0314 09:40:41.749656 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce4665964e9600c389442e8d618f0ead72cc171f435c6f0045fd401ace29d492"} err="failed to get container status \"ce4665964e9600c389442e8d618f0ead72cc171f435c6f0045fd401ace29d492\": rpc error: code = NotFound desc = could not find container \"ce4665964e9600c389442e8d618f0ead72cc171f435c6f0045fd401ace29d492\": container with ID starting with ce4665964e9600c389442e8d618f0ead72cc171f435c6f0045fd401ace29d492 not found: ID does not exist" Mar 14 09:40:41 crc kubenswrapper[4886]: I0314 09:40:41.749767 4886 scope.go:117] "RemoveContainer" containerID="2fe0cb0ffe92376226ac38433038872a5055b8f04fb271e2e91da8391fb1f32c" Mar 14 09:40:41 crc kubenswrapper[4886]: E0314 09:40:41.750488 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fe0cb0ffe92376226ac38433038872a5055b8f04fb271e2e91da8391fb1f32c\": container with ID starting with 2fe0cb0ffe92376226ac38433038872a5055b8f04fb271e2e91da8391fb1f32c not found: ID does not exist" containerID="2fe0cb0ffe92376226ac38433038872a5055b8f04fb271e2e91da8391fb1f32c" Mar 14 09:40:41 crc kubenswrapper[4886]: I0314 09:40:41.750547 4886 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fe0cb0ffe92376226ac38433038872a5055b8f04fb271e2e91da8391fb1f32c"} err="failed to get container status \"2fe0cb0ffe92376226ac38433038872a5055b8f04fb271e2e91da8391fb1f32c\": rpc error: code = NotFound desc = could not find container \"2fe0cb0ffe92376226ac38433038872a5055b8f04fb271e2e91da8391fb1f32c\": container with ID starting with 2fe0cb0ffe92376226ac38433038872a5055b8f04fb271e2e91da8391fb1f32c not found: ID does not exist" Mar 14 09:40:43 crc kubenswrapper[4886]: I0314 09:40:43.434015 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbfd5efb-b236-41e5-b7f9-727bce87c3d7" path="/var/lib/kubelet/pods/fbfd5efb-b236-41e5-b7f9-727bce87c3d7/volumes" Mar 14 09:40:52 crc kubenswrapper[4886]: I0314 09:40:52.421645 4886 scope.go:117] "RemoveContainer" containerID="5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52" Mar 14 09:40:52 crc kubenswrapper[4886]: E0314 09:40:52.422787 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:41:07 crc kubenswrapper[4886]: I0314 09:41:07.421046 4886 scope.go:117] "RemoveContainer" containerID="5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52" Mar 14 09:41:07 crc kubenswrapper[4886]: E0314 09:41:07.422152 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:41:21 crc kubenswrapper[4886]: I0314 09:41:21.421768 4886 scope.go:117] "RemoveContainer" containerID="5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52" Mar 14 09:41:21 crc kubenswrapper[4886]: E0314 09:41:21.422660 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:41:33 crc kubenswrapper[4886]: I0314 09:41:33.421249 4886 scope.go:117] "RemoveContainer" containerID="5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52" Mar 14 09:41:33 crc kubenswrapper[4886]: E0314 09:41:33.421916 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:41:44 crc kubenswrapper[4886]: I0314 09:41:44.420288 4886 scope.go:117] "RemoveContainer" containerID="5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52" Mar 14 09:41:44 crc kubenswrapper[4886]: E0314 09:41:44.420901 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:41:49 crc kubenswrapper[4886]: I0314 09:41:49.158082 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-npjpk"] Mar 14 09:41:49 crc kubenswrapper[4886]: E0314 09:41:49.159562 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbfd5efb-b236-41e5-b7f9-727bce87c3d7" containerName="extract-utilities" Mar 14 09:41:49 crc kubenswrapper[4886]: I0314 09:41:49.159580 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbfd5efb-b236-41e5-b7f9-727bce87c3d7" containerName="extract-utilities" Mar 14 09:41:49 crc kubenswrapper[4886]: E0314 09:41:49.159605 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbfd5efb-b236-41e5-b7f9-727bce87c3d7" containerName="registry-server" Mar 14 09:41:49 crc kubenswrapper[4886]: I0314 09:41:49.159614 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbfd5efb-b236-41e5-b7f9-727bce87c3d7" containerName="registry-server" Mar 14 09:41:49 crc kubenswrapper[4886]: E0314 09:41:49.159638 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbfd5efb-b236-41e5-b7f9-727bce87c3d7" containerName="extract-content" Mar 14 09:41:49 crc kubenswrapper[4886]: I0314 09:41:49.159646 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbfd5efb-b236-41e5-b7f9-727bce87c3d7" containerName="extract-content" Mar 14 09:41:49 crc kubenswrapper[4886]: I0314 09:41:49.159921 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbfd5efb-b236-41e5-b7f9-727bce87c3d7" containerName="registry-server" Mar 14 09:41:49 crc kubenswrapper[4886]: I0314 09:41:49.161695 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-npjpk" Mar 14 09:41:49 crc kubenswrapper[4886]: I0314 09:41:49.169509 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-npjpk"] Mar 14 09:41:49 crc kubenswrapper[4886]: I0314 09:41:49.199250 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3-catalog-content\") pod \"community-operators-npjpk\" (UID: \"4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3\") " pod="openshift-marketplace/community-operators-npjpk" Mar 14 09:41:49 crc kubenswrapper[4886]: I0314 09:41:49.199328 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg7cc\" (UniqueName: \"kubernetes.io/projected/4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3-kube-api-access-gg7cc\") pod \"community-operators-npjpk\" (UID: \"4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3\") " pod="openshift-marketplace/community-operators-npjpk" Mar 14 09:41:49 crc kubenswrapper[4886]: I0314 09:41:49.199423 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3-utilities\") pod \"community-operators-npjpk\" (UID: \"4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3\") " pod="openshift-marketplace/community-operators-npjpk" Mar 14 09:41:49 crc kubenswrapper[4886]: I0314 09:41:49.301373 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3-utilities\") pod \"community-operators-npjpk\" (UID: \"4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3\") " pod="openshift-marketplace/community-operators-npjpk" Mar 14 09:41:49 crc kubenswrapper[4886]: I0314 09:41:49.301519 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3-catalog-content\") pod \"community-operators-npjpk\" (UID: \"4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3\") " pod="openshift-marketplace/community-operators-npjpk" Mar 14 09:41:49 crc kubenswrapper[4886]: I0314 09:41:49.301580 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg7cc\" (UniqueName: \"kubernetes.io/projected/4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3-kube-api-access-gg7cc\") pod \"community-operators-npjpk\" (UID: \"4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3\") " pod="openshift-marketplace/community-operators-npjpk" Mar 14 09:41:49 crc kubenswrapper[4886]: I0314 09:41:49.302212 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3-catalog-content\") pod \"community-operators-npjpk\" (UID: \"4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3\") " pod="openshift-marketplace/community-operators-npjpk" Mar 14 09:41:49 crc kubenswrapper[4886]: I0314 09:41:49.302995 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3-utilities\") pod \"community-operators-npjpk\" (UID: \"4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3\") " pod="openshift-marketplace/community-operators-npjpk" Mar 14 09:41:49 crc kubenswrapper[4886]: I0314 09:41:49.356991 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg7cc\" (UniqueName: \"kubernetes.io/projected/4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3-kube-api-access-gg7cc\") pod \"community-operators-npjpk\" (UID: \"4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3\") " pod="openshift-marketplace/community-operators-npjpk" Mar 14 09:41:49 crc kubenswrapper[4886]: I0314 09:41:49.498180 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-npjpk" Mar 14 09:41:50 crc kubenswrapper[4886]: I0314 09:41:50.030959 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-npjpk"] Mar 14 09:41:50 crc kubenswrapper[4886]: I0314 09:41:50.375209 4886 generic.go:334] "Generic (PLEG): container finished" podID="4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3" containerID="6b4976e86d39512df2a1e4f2a0aec903c50e58e9409d8f5d4b2bf3c04e580ca4" exitCode=0 Mar 14 09:41:50 crc kubenswrapper[4886]: I0314 09:41:50.375322 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npjpk" event={"ID":"4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3","Type":"ContainerDied","Data":"6b4976e86d39512df2a1e4f2a0aec903c50e58e9409d8f5d4b2bf3c04e580ca4"} Mar 14 09:41:50 crc kubenswrapper[4886]: I0314 09:41:50.375501 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npjpk" event={"ID":"4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3","Type":"ContainerStarted","Data":"0096e956b6de92b12c7d103d7f48999a8d7fe59a2af08696275f4c2541391b3e"} Mar 14 09:41:51 crc kubenswrapper[4886]: I0314 09:41:51.386603 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npjpk" event={"ID":"4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3","Type":"ContainerStarted","Data":"34952165d98ecd57fb14ee3c83a74d07da6f8b158634bf4449e40a33cbb88167"} Mar 14 09:41:52 crc kubenswrapper[4886]: I0314 09:41:52.398412 4886 generic.go:334] "Generic (PLEG): container finished" podID="4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3" containerID="34952165d98ecd57fb14ee3c83a74d07da6f8b158634bf4449e40a33cbb88167" exitCode=0 Mar 14 09:41:52 crc kubenswrapper[4886]: I0314 09:41:52.398481 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npjpk" 
event={"ID":"4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3","Type":"ContainerDied","Data":"34952165d98ecd57fb14ee3c83a74d07da6f8b158634bf4449e40a33cbb88167"} Mar 14 09:41:53 crc kubenswrapper[4886]: I0314 09:41:53.412677 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npjpk" event={"ID":"4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3","Type":"ContainerStarted","Data":"85201ece789776f4c5f65dcf2d73da0354667a551d69174eb2e35813ba0b0d0f"} Mar 14 09:41:53 crc kubenswrapper[4886]: I0314 09:41:53.474464 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-npjpk" podStartSLOduration=1.985784447 podStartE2EDuration="4.474439467s" podCreationTimestamp="2026-03-14 09:41:49 +0000 UTC" firstStartedPulling="2026-03-14 09:41:50.376838819 +0000 UTC m=+4445.625290456" lastFinishedPulling="2026-03-14 09:41:52.865493839 +0000 UTC m=+4448.113945476" observedRunningTime="2026-03-14 09:41:53.468275762 +0000 UTC m=+4448.716727429" watchObservedRunningTime="2026-03-14 09:41:53.474439467 +0000 UTC m=+4448.722891104" Mar 14 09:41:55 crc kubenswrapper[4886]: I0314 09:41:55.428217 4886 scope.go:117] "RemoveContainer" containerID="5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52" Mar 14 09:41:55 crc kubenswrapper[4886]: E0314 09:41:55.429938 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:41:59 crc kubenswrapper[4886]: I0314 09:41:59.498448 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-npjpk" Mar 14 09:41:59 crc 
kubenswrapper[4886]: I0314 09:41:59.499110 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-npjpk" Mar 14 09:41:59 crc kubenswrapper[4886]: I0314 09:41:59.541254 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-npjpk" Mar 14 09:42:00 crc kubenswrapper[4886]: I0314 09:42:00.155415 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558022-nfts2"] Mar 14 09:42:00 crc kubenswrapper[4886]: I0314 09:42:00.157640 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558022-nfts2" Mar 14 09:42:00 crc kubenswrapper[4886]: I0314 09:42:00.161076 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 09:42:00 crc kubenswrapper[4886]: I0314 09:42:00.161450 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:42:00 crc kubenswrapper[4886]: I0314 09:42:00.163378 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:42:00 crc kubenswrapper[4886]: I0314 09:42:00.179726 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558022-nfts2"] Mar 14 09:42:00 crc kubenswrapper[4886]: I0314 09:42:00.267999 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kznmn\" (UniqueName: \"kubernetes.io/projected/a189f4a3-867c-46bb-9aa7-b4a434917b12-kube-api-access-kznmn\") pod \"auto-csr-approver-29558022-nfts2\" (UID: \"a189f4a3-867c-46bb-9aa7-b4a434917b12\") " pod="openshift-infra/auto-csr-approver-29558022-nfts2" Mar 14 09:42:00 crc kubenswrapper[4886]: I0314 09:42:00.370897 4886 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-kznmn\" (UniqueName: \"kubernetes.io/projected/a189f4a3-867c-46bb-9aa7-b4a434917b12-kube-api-access-kznmn\") pod \"auto-csr-approver-29558022-nfts2\" (UID: \"a189f4a3-867c-46bb-9aa7-b4a434917b12\") " pod="openshift-infra/auto-csr-approver-29558022-nfts2" Mar 14 09:42:00 crc kubenswrapper[4886]: I0314 09:42:00.394923 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kznmn\" (UniqueName: \"kubernetes.io/projected/a189f4a3-867c-46bb-9aa7-b4a434917b12-kube-api-access-kznmn\") pod \"auto-csr-approver-29558022-nfts2\" (UID: \"a189f4a3-867c-46bb-9aa7-b4a434917b12\") " pod="openshift-infra/auto-csr-approver-29558022-nfts2" Mar 14 09:42:00 crc kubenswrapper[4886]: I0314 09:42:00.482084 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558022-nfts2" Mar 14 09:42:00 crc kubenswrapper[4886]: I0314 09:42:00.537320 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-npjpk" Mar 14 09:42:00 crc kubenswrapper[4886]: I0314 09:42:00.608975 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-npjpk"] Mar 14 09:42:00 crc kubenswrapper[4886]: I0314 09:42:00.968317 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558022-nfts2"] Mar 14 09:42:01 crc kubenswrapper[4886]: I0314 09:42:01.489153 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558022-nfts2" event={"ID":"a189f4a3-867c-46bb-9aa7-b4a434917b12","Type":"ContainerStarted","Data":"5372d0e7234a75194ac42b95055d26b24aa97cb75056c484378e5ade920f7c4d"} Mar 14 09:42:02 crc kubenswrapper[4886]: I0314 09:42:02.500439 4886 generic.go:334] "Generic (PLEG): container finished" podID="a189f4a3-867c-46bb-9aa7-b4a434917b12" 
containerID="99ac067a5e543929b824d308c26a57bd05a594e91bd658d2c97b1ce004a10aec" exitCode=0 Mar 14 09:42:02 crc kubenswrapper[4886]: I0314 09:42:02.501213 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-npjpk" podUID="4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3" containerName="registry-server" containerID="cri-o://85201ece789776f4c5f65dcf2d73da0354667a551d69174eb2e35813ba0b0d0f" gracePeriod=2 Mar 14 09:42:02 crc kubenswrapper[4886]: I0314 09:42:02.501616 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558022-nfts2" event={"ID":"a189f4a3-867c-46bb-9aa7-b4a434917b12","Type":"ContainerDied","Data":"99ac067a5e543929b824d308c26a57bd05a594e91bd658d2c97b1ce004a10aec"} Mar 14 09:42:03 crc kubenswrapper[4886]: I0314 09:42:03.038964 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-npjpk" Mar 14 09:42:03 crc kubenswrapper[4886]: I0314 09:42:03.132197 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg7cc\" (UniqueName: \"kubernetes.io/projected/4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3-kube-api-access-gg7cc\") pod \"4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3\" (UID: \"4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3\") " Mar 14 09:42:03 crc kubenswrapper[4886]: I0314 09:42:03.132266 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3-catalog-content\") pod \"4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3\" (UID: \"4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3\") " Mar 14 09:42:03 crc kubenswrapper[4886]: I0314 09:42:03.132439 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3-utilities\") pod \"4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3\" 
(UID: \"4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3\") " Mar 14 09:42:03 crc kubenswrapper[4886]: I0314 09:42:03.133881 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3-utilities" (OuterVolumeSpecName: "utilities") pod "4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3" (UID: "4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:42:03 crc kubenswrapper[4886]: I0314 09:42:03.140173 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3-kube-api-access-gg7cc" (OuterVolumeSpecName: "kube-api-access-gg7cc") pod "4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3" (UID: "4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3"). InnerVolumeSpecName "kube-api-access-gg7cc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:42:03 crc kubenswrapper[4886]: I0314 09:42:03.190903 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3" (UID: "4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:42:03 crc kubenswrapper[4886]: I0314 09:42:03.233888 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:03 crc kubenswrapper[4886]: I0314 09:42:03.233928 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg7cc\" (UniqueName: \"kubernetes.io/projected/4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3-kube-api-access-gg7cc\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:03 crc kubenswrapper[4886]: I0314 09:42:03.233939 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:03 crc kubenswrapper[4886]: I0314 09:42:03.517654 4886 generic.go:334] "Generic (PLEG): container finished" podID="4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3" containerID="85201ece789776f4c5f65dcf2d73da0354667a551d69174eb2e35813ba0b0d0f" exitCode=0 Mar 14 09:42:03 crc kubenswrapper[4886]: I0314 09:42:03.517733 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npjpk" event={"ID":"4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3","Type":"ContainerDied","Data":"85201ece789776f4c5f65dcf2d73da0354667a551d69174eb2e35813ba0b0d0f"} Mar 14 09:42:03 crc kubenswrapper[4886]: I0314 09:42:03.517780 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npjpk" event={"ID":"4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3","Type":"ContainerDied","Data":"0096e956b6de92b12c7d103d7f48999a8d7fe59a2af08696275f4c2541391b3e"} Mar 14 09:42:03 crc kubenswrapper[4886]: I0314 09:42:03.517805 4886 scope.go:117] "RemoveContainer" containerID="85201ece789776f4c5f65dcf2d73da0354667a551d69174eb2e35813ba0b0d0f" Mar 14 09:42:03 crc kubenswrapper[4886]: I0314 
09:42:03.517820 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-npjpk" Mar 14 09:42:03 crc kubenswrapper[4886]: I0314 09:42:03.549330 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-npjpk"] Mar 14 09:42:03 crc kubenswrapper[4886]: I0314 09:42:03.561038 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-npjpk"] Mar 14 09:42:03 crc kubenswrapper[4886]: I0314 09:42:03.564312 4886 scope.go:117] "RemoveContainer" containerID="34952165d98ecd57fb14ee3c83a74d07da6f8b158634bf4449e40a33cbb88167" Mar 14 09:42:03 crc kubenswrapper[4886]: I0314 09:42:03.597497 4886 scope.go:117] "RemoveContainer" containerID="6b4976e86d39512df2a1e4f2a0aec903c50e58e9409d8f5d4b2bf3c04e580ca4" Mar 14 09:42:03 crc kubenswrapper[4886]: I0314 09:42:03.634058 4886 scope.go:117] "RemoveContainer" containerID="85201ece789776f4c5f65dcf2d73da0354667a551d69174eb2e35813ba0b0d0f" Mar 14 09:42:03 crc kubenswrapper[4886]: E0314 09:42:03.634640 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85201ece789776f4c5f65dcf2d73da0354667a551d69174eb2e35813ba0b0d0f\": container with ID starting with 85201ece789776f4c5f65dcf2d73da0354667a551d69174eb2e35813ba0b0d0f not found: ID does not exist" containerID="85201ece789776f4c5f65dcf2d73da0354667a551d69174eb2e35813ba0b0d0f" Mar 14 09:42:03 crc kubenswrapper[4886]: I0314 09:42:03.634665 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85201ece789776f4c5f65dcf2d73da0354667a551d69174eb2e35813ba0b0d0f"} err="failed to get container status \"85201ece789776f4c5f65dcf2d73da0354667a551d69174eb2e35813ba0b0d0f\": rpc error: code = NotFound desc = could not find container \"85201ece789776f4c5f65dcf2d73da0354667a551d69174eb2e35813ba0b0d0f\": container with ID starting with 
85201ece789776f4c5f65dcf2d73da0354667a551d69174eb2e35813ba0b0d0f not found: ID does not exist" Mar 14 09:42:03 crc kubenswrapper[4886]: I0314 09:42:03.634685 4886 scope.go:117] "RemoveContainer" containerID="34952165d98ecd57fb14ee3c83a74d07da6f8b158634bf4449e40a33cbb88167" Mar 14 09:42:03 crc kubenswrapper[4886]: E0314 09:42:03.634976 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34952165d98ecd57fb14ee3c83a74d07da6f8b158634bf4449e40a33cbb88167\": container with ID starting with 34952165d98ecd57fb14ee3c83a74d07da6f8b158634bf4449e40a33cbb88167 not found: ID does not exist" containerID="34952165d98ecd57fb14ee3c83a74d07da6f8b158634bf4449e40a33cbb88167" Mar 14 09:42:03 crc kubenswrapper[4886]: I0314 09:42:03.634992 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34952165d98ecd57fb14ee3c83a74d07da6f8b158634bf4449e40a33cbb88167"} err="failed to get container status \"34952165d98ecd57fb14ee3c83a74d07da6f8b158634bf4449e40a33cbb88167\": rpc error: code = NotFound desc = could not find container \"34952165d98ecd57fb14ee3c83a74d07da6f8b158634bf4449e40a33cbb88167\": container with ID starting with 34952165d98ecd57fb14ee3c83a74d07da6f8b158634bf4449e40a33cbb88167 not found: ID does not exist" Mar 14 09:42:03 crc kubenswrapper[4886]: I0314 09:42:03.635004 4886 scope.go:117] "RemoveContainer" containerID="6b4976e86d39512df2a1e4f2a0aec903c50e58e9409d8f5d4b2bf3c04e580ca4" Mar 14 09:42:03 crc kubenswrapper[4886]: E0314 09:42:03.635231 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b4976e86d39512df2a1e4f2a0aec903c50e58e9409d8f5d4b2bf3c04e580ca4\": container with ID starting with 6b4976e86d39512df2a1e4f2a0aec903c50e58e9409d8f5d4b2bf3c04e580ca4 not found: ID does not exist" containerID="6b4976e86d39512df2a1e4f2a0aec903c50e58e9409d8f5d4b2bf3c04e580ca4" Mar 14 09:42:03 crc 
kubenswrapper[4886]: I0314 09:42:03.635270 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b4976e86d39512df2a1e4f2a0aec903c50e58e9409d8f5d4b2bf3c04e580ca4"} err="failed to get container status \"6b4976e86d39512df2a1e4f2a0aec903c50e58e9409d8f5d4b2bf3c04e580ca4\": rpc error: code = NotFound desc = could not find container \"6b4976e86d39512df2a1e4f2a0aec903c50e58e9409d8f5d4b2bf3c04e580ca4\": container with ID starting with 6b4976e86d39512df2a1e4f2a0aec903c50e58e9409d8f5d4b2bf3c04e580ca4 not found: ID does not exist" Mar 14 09:42:03 crc kubenswrapper[4886]: I0314 09:42:03.848633 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558022-nfts2" Mar 14 09:42:03 crc kubenswrapper[4886]: I0314 09:42:03.949304 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kznmn\" (UniqueName: \"kubernetes.io/projected/a189f4a3-867c-46bb-9aa7-b4a434917b12-kube-api-access-kznmn\") pod \"a189f4a3-867c-46bb-9aa7-b4a434917b12\" (UID: \"a189f4a3-867c-46bb-9aa7-b4a434917b12\") " Mar 14 09:42:04 crc kubenswrapper[4886]: I0314 09:42:04.528438 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558022-nfts2" event={"ID":"a189f4a3-867c-46bb-9aa7-b4a434917b12","Type":"ContainerDied","Data":"5372d0e7234a75194ac42b95055d26b24aa97cb75056c484378e5ade920f7c4d"} Mar 14 09:42:04 crc kubenswrapper[4886]: I0314 09:42:04.528497 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5372d0e7234a75194ac42b95055d26b24aa97cb75056c484378e5ade920f7c4d" Mar 14 09:42:04 crc kubenswrapper[4886]: I0314 09:42:04.528577 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558022-nfts2" Mar 14 09:42:04 crc kubenswrapper[4886]: I0314 09:42:04.577949 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a189f4a3-867c-46bb-9aa7-b4a434917b12-kube-api-access-kznmn" (OuterVolumeSpecName: "kube-api-access-kznmn") pod "a189f4a3-867c-46bb-9aa7-b4a434917b12" (UID: "a189f4a3-867c-46bb-9aa7-b4a434917b12"). InnerVolumeSpecName "kube-api-access-kznmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:42:04 crc kubenswrapper[4886]: I0314 09:42:04.665302 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kznmn\" (UniqueName: \"kubernetes.io/projected/a189f4a3-867c-46bb-9aa7-b4a434917b12-kube-api-access-kznmn\") on node \"crc\" DevicePath \"\"" Mar 14 09:42:04 crc kubenswrapper[4886]: I0314 09:42:04.925828 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558016-p5c7v"] Mar 14 09:42:04 crc kubenswrapper[4886]: I0314 09:42:04.936649 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558016-p5c7v"] Mar 14 09:42:05 crc kubenswrapper[4886]: I0314 09:42:05.435508 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0717bf13-48a0-403f-b4b5-9dbe68619319" path="/var/lib/kubelet/pods/0717bf13-48a0-403f-b4b5-9dbe68619319/volumes" Mar 14 09:42:05 crc kubenswrapper[4886]: I0314 09:42:05.436392 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3" path="/var/lib/kubelet/pods/4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3/volumes" Mar 14 09:42:06 crc kubenswrapper[4886]: I0314 09:42:06.420925 4886 scope.go:117] "RemoveContainer" containerID="5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52" Mar 14 09:42:06 crc kubenswrapper[4886]: E0314 09:42:06.421310 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:42:18 crc kubenswrapper[4886]: I0314 09:42:18.420766 4886 scope.go:117] "RemoveContainer" containerID="5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52" Mar 14 09:42:18 crc kubenswrapper[4886]: E0314 09:42:18.421520 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:42:30 crc kubenswrapper[4886]: I0314 09:42:30.421549 4886 scope.go:117] "RemoveContainer" containerID="5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52" Mar 14 09:42:30 crc kubenswrapper[4886]: E0314 09:42:30.422873 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:42:37 crc kubenswrapper[4886]: I0314 09:42:37.283481 4886 scope.go:117] "RemoveContainer" containerID="39056d6a10b78586b8207c24d576fbd2c758ff6c3b22b2bae058db21dfdf0f4d" Mar 14 09:42:42 crc kubenswrapper[4886]: I0314 09:42:42.420666 4886 scope.go:117] "RemoveContainer" 
containerID="5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52" Mar 14 09:42:42 crc kubenswrapper[4886]: E0314 09:42:42.423171 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:42:53 crc kubenswrapper[4886]: I0314 09:42:53.422153 4886 scope.go:117] "RemoveContainer" containerID="5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52" Mar 14 09:42:53 crc kubenswrapper[4886]: E0314 09:42:53.422941 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:43:04 crc kubenswrapper[4886]: I0314 09:43:04.420783 4886 scope.go:117] "RemoveContainer" containerID="5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52" Mar 14 09:43:04 crc kubenswrapper[4886]: E0314 09:43:04.421908 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:43:18 crc kubenswrapper[4886]: I0314 09:43:18.421791 4886 scope.go:117] 
"RemoveContainer" containerID="5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52" Mar 14 09:43:18 crc kubenswrapper[4886]: E0314 09:43:18.422790 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:43:29 crc kubenswrapper[4886]: I0314 09:43:29.421884 4886 scope.go:117] "RemoveContainer" containerID="5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52" Mar 14 09:43:29 crc kubenswrapper[4886]: E0314 09:43:29.422701 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:43:41 crc kubenswrapper[4886]: I0314 09:43:41.421830 4886 scope.go:117] "RemoveContainer" containerID="5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52" Mar 14 09:43:41 crc kubenswrapper[4886]: E0314 09:43:41.422858 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:43:52 crc kubenswrapper[4886]: I0314 09:43:52.421054 
4886 scope.go:117] "RemoveContainer" containerID="5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52" Mar 14 09:43:52 crc kubenswrapper[4886]: E0314 09:43:52.421803 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:44:00 crc kubenswrapper[4886]: I0314 09:44:00.157058 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558024-m7zjx"] Mar 14 09:44:00 crc kubenswrapper[4886]: E0314 09:44:00.157974 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a189f4a3-867c-46bb-9aa7-b4a434917b12" containerName="oc" Mar 14 09:44:00 crc kubenswrapper[4886]: I0314 09:44:00.157987 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a189f4a3-867c-46bb-9aa7-b4a434917b12" containerName="oc" Mar 14 09:44:00 crc kubenswrapper[4886]: E0314 09:44:00.158005 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3" containerName="registry-server" Mar 14 09:44:00 crc kubenswrapper[4886]: I0314 09:44:00.158011 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3" containerName="registry-server" Mar 14 09:44:00 crc kubenswrapper[4886]: E0314 09:44:00.158038 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3" containerName="extract-utilities" Mar 14 09:44:00 crc kubenswrapper[4886]: I0314 09:44:00.158044 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3" containerName="extract-utilities" Mar 14 09:44:00 crc kubenswrapper[4886]: E0314 
09:44:00.158057 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3" containerName="extract-content" Mar 14 09:44:00 crc kubenswrapper[4886]: I0314 09:44:00.158062 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3" containerName="extract-content" Mar 14 09:44:00 crc kubenswrapper[4886]: I0314 09:44:00.158250 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ce4890b-0ea6-4f52-b4ec-6fb9af599ed3" containerName="registry-server" Mar 14 09:44:00 crc kubenswrapper[4886]: I0314 09:44:00.158271 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="a189f4a3-867c-46bb-9aa7-b4a434917b12" containerName="oc" Mar 14 09:44:00 crc kubenswrapper[4886]: I0314 09:44:00.159888 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558024-m7zjx" Mar 14 09:44:00 crc kubenswrapper[4886]: I0314 09:44:00.163401 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:44:00 crc kubenswrapper[4886]: I0314 09:44:00.163478 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:44:00 crc kubenswrapper[4886]: I0314 09:44:00.167247 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 09:44:00 crc kubenswrapper[4886]: I0314 09:44:00.179282 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558024-m7zjx"] Mar 14 09:44:00 crc kubenswrapper[4886]: I0314 09:44:00.332071 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj6bh\" (UniqueName: \"kubernetes.io/projected/c7334611-e628-4d27-b713-df373d7f6918-kube-api-access-nj6bh\") pod \"auto-csr-approver-29558024-m7zjx\" (UID: 
\"c7334611-e628-4d27-b713-df373d7f6918\") " pod="openshift-infra/auto-csr-approver-29558024-m7zjx" Mar 14 09:44:00 crc kubenswrapper[4886]: I0314 09:44:00.434181 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj6bh\" (UniqueName: \"kubernetes.io/projected/c7334611-e628-4d27-b713-df373d7f6918-kube-api-access-nj6bh\") pod \"auto-csr-approver-29558024-m7zjx\" (UID: \"c7334611-e628-4d27-b713-df373d7f6918\") " pod="openshift-infra/auto-csr-approver-29558024-m7zjx" Mar 14 09:44:00 crc kubenswrapper[4886]: I0314 09:44:00.458732 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj6bh\" (UniqueName: \"kubernetes.io/projected/c7334611-e628-4d27-b713-df373d7f6918-kube-api-access-nj6bh\") pod \"auto-csr-approver-29558024-m7zjx\" (UID: \"c7334611-e628-4d27-b713-df373d7f6918\") " pod="openshift-infra/auto-csr-approver-29558024-m7zjx" Mar 14 09:44:00 crc kubenswrapper[4886]: I0314 09:44:00.486993 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558024-m7zjx" Mar 14 09:44:01 crc kubenswrapper[4886]: I0314 09:44:01.062275 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558024-m7zjx"] Mar 14 09:44:01 crc kubenswrapper[4886]: I0314 09:44:01.774704 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558024-m7zjx" event={"ID":"c7334611-e628-4d27-b713-df373d7f6918","Type":"ContainerStarted","Data":"ed84dfc33cf3fd70b895b4fcad9318139388fe8dc1cb298001878b0b3c493308"} Mar 14 09:44:02 crc kubenswrapper[4886]: I0314 09:44:02.785652 4886 generic.go:334] "Generic (PLEG): container finished" podID="c7334611-e628-4d27-b713-df373d7f6918" containerID="b23ac0625d3d787e3a1eec3ab7236f8e7d96e6a1cdf4eee4275102b49b4c9bab" exitCode=0 Mar 14 09:44:02 crc kubenswrapper[4886]: I0314 09:44:02.785885 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558024-m7zjx" event={"ID":"c7334611-e628-4d27-b713-df373d7f6918","Type":"ContainerDied","Data":"b23ac0625d3d787e3a1eec3ab7236f8e7d96e6a1cdf4eee4275102b49b4c9bab"} Mar 14 09:44:04 crc kubenswrapper[4886]: I0314 09:44:04.221625 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558024-m7zjx" Mar 14 09:44:04 crc kubenswrapper[4886]: I0314 09:44:04.331549 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj6bh\" (UniqueName: \"kubernetes.io/projected/c7334611-e628-4d27-b713-df373d7f6918-kube-api-access-nj6bh\") pod \"c7334611-e628-4d27-b713-df373d7f6918\" (UID: \"c7334611-e628-4d27-b713-df373d7f6918\") " Mar 14 09:44:04 crc kubenswrapper[4886]: I0314 09:44:04.346657 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7334611-e628-4d27-b713-df373d7f6918-kube-api-access-nj6bh" (OuterVolumeSpecName: "kube-api-access-nj6bh") pod "c7334611-e628-4d27-b713-df373d7f6918" (UID: "c7334611-e628-4d27-b713-df373d7f6918"). InnerVolumeSpecName "kube-api-access-nj6bh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:44:04 crc kubenswrapper[4886]: I0314 09:44:04.435046 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nj6bh\" (UniqueName: \"kubernetes.io/projected/c7334611-e628-4d27-b713-df373d7f6918-kube-api-access-nj6bh\") on node \"crc\" DevicePath \"\"" Mar 14 09:44:04 crc kubenswrapper[4886]: I0314 09:44:04.810914 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558024-m7zjx" event={"ID":"c7334611-e628-4d27-b713-df373d7f6918","Type":"ContainerDied","Data":"ed84dfc33cf3fd70b895b4fcad9318139388fe8dc1cb298001878b0b3c493308"} Mar 14 09:44:04 crc kubenswrapper[4886]: I0314 09:44:04.811004 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558024-m7zjx" Mar 14 09:44:04 crc kubenswrapper[4886]: I0314 09:44:04.811013 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed84dfc33cf3fd70b895b4fcad9318139388fe8dc1cb298001878b0b3c493308" Mar 14 09:44:05 crc kubenswrapper[4886]: I0314 09:44:05.308072 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558018-lnv6t"] Mar 14 09:44:05 crc kubenswrapper[4886]: I0314 09:44:05.316689 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558018-lnv6t"] Mar 14 09:44:05 crc kubenswrapper[4886]: I0314 09:44:05.431280 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd596c5c-1c7d-4937-8597-1b17084831b1" path="/var/lib/kubelet/pods/bd596c5c-1c7d-4937-8597-1b17084831b1/volumes" Mar 14 09:44:07 crc kubenswrapper[4886]: I0314 09:44:07.421429 4886 scope.go:117] "RemoveContainer" containerID="5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52" Mar 14 09:44:07 crc kubenswrapper[4886]: E0314 09:44:07.422040 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:44:18 crc kubenswrapper[4886]: I0314 09:44:18.421242 4886 scope.go:117] "RemoveContainer" containerID="5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52" Mar 14 09:44:18 crc kubenswrapper[4886]: E0314 09:44:18.422088 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:44:32 crc kubenswrapper[4886]: I0314 09:44:32.421908 4886 scope.go:117] "RemoveContainer" containerID="5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52" Mar 14 09:44:32 crc kubenswrapper[4886]: E0314 09:44:32.423055 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:44:37 crc kubenswrapper[4886]: I0314 09:44:37.422041 4886 scope.go:117] "RemoveContainer" containerID="24d3977b769b29ed5d0411dbac3ea84d790cbdded345cafabda6a2e3c47eb24c" Mar 14 09:44:45 crc kubenswrapper[4886]: I0314 09:44:45.429396 4886 scope.go:117] "RemoveContainer" containerID="5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52" Mar 14 09:44:45 crc kubenswrapper[4886]: E0314 09:44:45.430439 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:44:57 crc kubenswrapper[4886]: I0314 09:44:57.420355 4886 scope.go:117] "RemoveContainer" containerID="5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52" Mar 14 09:44:58 crc kubenswrapper[4886]: 
I0314 09:44:58.393569 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerStarted","Data":"7d0163d62d22d8130619611a4bc49f9ddca1c38d62e0385537e914e59680ef97"} Mar 14 09:45:00 crc kubenswrapper[4886]: I0314 09:45:00.150175 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558025-9bkgj"] Mar 14 09:45:00 crc kubenswrapper[4886]: E0314 09:45:00.151507 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7334611-e628-4d27-b713-df373d7f6918" containerName="oc" Mar 14 09:45:00 crc kubenswrapper[4886]: I0314 09:45:00.151528 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7334611-e628-4d27-b713-df373d7f6918" containerName="oc" Mar 14 09:45:00 crc kubenswrapper[4886]: I0314 09:45:00.151789 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7334611-e628-4d27-b713-df373d7f6918" containerName="oc" Mar 14 09:45:00 crc kubenswrapper[4886]: I0314 09:45:00.152680 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-9bkgj" Mar 14 09:45:00 crc kubenswrapper[4886]: I0314 09:45:00.154345 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 09:45:00 crc kubenswrapper[4886]: I0314 09:45:00.154442 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 09:45:00 crc kubenswrapper[4886]: I0314 09:45:00.164831 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558025-9bkgj"] Mar 14 09:45:00 crc kubenswrapper[4886]: I0314 09:45:00.291687 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb4de91b-9f87-463a-bc4e-41203e9eb3b8-secret-volume\") pod \"collect-profiles-29558025-9bkgj\" (UID: \"cb4de91b-9f87-463a-bc4e-41203e9eb3b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-9bkgj" Mar 14 09:45:00 crc kubenswrapper[4886]: I0314 09:45:00.292188 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb4de91b-9f87-463a-bc4e-41203e9eb3b8-config-volume\") pod \"collect-profiles-29558025-9bkgj\" (UID: \"cb4de91b-9f87-463a-bc4e-41203e9eb3b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-9bkgj" Mar 14 09:45:00 crc kubenswrapper[4886]: I0314 09:45:00.292326 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vw4q\" (UniqueName: \"kubernetes.io/projected/cb4de91b-9f87-463a-bc4e-41203e9eb3b8-kube-api-access-7vw4q\") pod \"collect-profiles-29558025-9bkgj\" (UID: \"cb4de91b-9f87-463a-bc4e-41203e9eb3b8\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-9bkgj" Mar 14 09:45:00 crc kubenswrapper[4886]: I0314 09:45:00.394149 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb4de91b-9f87-463a-bc4e-41203e9eb3b8-secret-volume\") pod \"collect-profiles-29558025-9bkgj\" (UID: \"cb4de91b-9f87-463a-bc4e-41203e9eb3b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-9bkgj" Mar 14 09:45:00 crc kubenswrapper[4886]: I0314 09:45:00.394565 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb4de91b-9f87-463a-bc4e-41203e9eb3b8-config-volume\") pod \"collect-profiles-29558025-9bkgj\" (UID: \"cb4de91b-9f87-463a-bc4e-41203e9eb3b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-9bkgj" Mar 14 09:45:00 crc kubenswrapper[4886]: I0314 09:45:00.394596 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vw4q\" (UniqueName: \"kubernetes.io/projected/cb4de91b-9f87-463a-bc4e-41203e9eb3b8-kube-api-access-7vw4q\") pod \"collect-profiles-29558025-9bkgj\" (UID: \"cb4de91b-9f87-463a-bc4e-41203e9eb3b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-9bkgj" Mar 14 09:45:00 crc kubenswrapper[4886]: I0314 09:45:00.395851 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb4de91b-9f87-463a-bc4e-41203e9eb3b8-config-volume\") pod \"collect-profiles-29558025-9bkgj\" (UID: \"cb4de91b-9f87-463a-bc4e-41203e9eb3b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-9bkgj" Mar 14 09:45:00 crc kubenswrapper[4886]: I0314 09:45:00.401485 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/cb4de91b-9f87-463a-bc4e-41203e9eb3b8-secret-volume\") pod \"collect-profiles-29558025-9bkgj\" (UID: \"cb4de91b-9f87-463a-bc4e-41203e9eb3b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-9bkgj" Mar 14 09:45:00 crc kubenswrapper[4886]: I0314 09:45:00.428901 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vw4q\" (UniqueName: \"kubernetes.io/projected/cb4de91b-9f87-463a-bc4e-41203e9eb3b8-kube-api-access-7vw4q\") pod \"collect-profiles-29558025-9bkgj\" (UID: \"cb4de91b-9f87-463a-bc4e-41203e9eb3b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-9bkgj" Mar 14 09:45:00 crc kubenswrapper[4886]: I0314 09:45:00.472928 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-9bkgj" Mar 14 09:45:00 crc kubenswrapper[4886]: I0314 09:45:00.915443 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558025-9bkgj"] Mar 14 09:45:01 crc kubenswrapper[4886]: I0314 09:45:01.424425 4886 generic.go:334] "Generic (PLEG): container finished" podID="cb4de91b-9f87-463a-bc4e-41203e9eb3b8" containerID="d103f7bcf4306c47dd9779e4969611e3b6d9dbc95559cf6c62f35fd3ba47a377" exitCode=0 Mar 14 09:45:01 crc kubenswrapper[4886]: I0314 09:45:01.433001 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-9bkgj" event={"ID":"cb4de91b-9f87-463a-bc4e-41203e9eb3b8","Type":"ContainerDied","Data":"d103f7bcf4306c47dd9779e4969611e3b6d9dbc95559cf6c62f35fd3ba47a377"} Mar 14 09:45:01 crc kubenswrapper[4886]: I0314 09:45:01.433070 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-9bkgj" 
event={"ID":"cb4de91b-9f87-463a-bc4e-41203e9eb3b8","Type":"ContainerStarted","Data":"300c5c6d654f1f4c75237d4a7a94c52f0ca598c0a1ba0858484f881719859b31"} Mar 14 09:45:02 crc kubenswrapper[4886]: I0314 09:45:02.843763 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-9bkgj" Mar 14 09:45:02 crc kubenswrapper[4886]: I0314 09:45:02.957898 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vw4q\" (UniqueName: \"kubernetes.io/projected/cb4de91b-9f87-463a-bc4e-41203e9eb3b8-kube-api-access-7vw4q\") pod \"cb4de91b-9f87-463a-bc4e-41203e9eb3b8\" (UID: \"cb4de91b-9f87-463a-bc4e-41203e9eb3b8\") " Mar 14 09:45:02 crc kubenswrapper[4886]: I0314 09:45:02.958321 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb4de91b-9f87-463a-bc4e-41203e9eb3b8-config-volume\") pod \"cb4de91b-9f87-463a-bc4e-41203e9eb3b8\" (UID: \"cb4de91b-9f87-463a-bc4e-41203e9eb3b8\") " Mar 14 09:45:02 crc kubenswrapper[4886]: I0314 09:45:02.958508 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb4de91b-9f87-463a-bc4e-41203e9eb3b8-secret-volume\") pod \"cb4de91b-9f87-463a-bc4e-41203e9eb3b8\" (UID: \"cb4de91b-9f87-463a-bc4e-41203e9eb3b8\") " Mar 14 09:45:02 crc kubenswrapper[4886]: I0314 09:45:02.959603 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb4de91b-9f87-463a-bc4e-41203e9eb3b8-config-volume" (OuterVolumeSpecName: "config-volume") pod "cb4de91b-9f87-463a-bc4e-41203e9eb3b8" (UID: "cb4de91b-9f87-463a-bc4e-41203e9eb3b8"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:45:02 crc kubenswrapper[4886]: I0314 09:45:02.970501 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb4de91b-9f87-463a-bc4e-41203e9eb3b8-kube-api-access-7vw4q" (OuterVolumeSpecName: "kube-api-access-7vw4q") pod "cb4de91b-9f87-463a-bc4e-41203e9eb3b8" (UID: "cb4de91b-9f87-463a-bc4e-41203e9eb3b8"). InnerVolumeSpecName "kube-api-access-7vw4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:45:02 crc kubenswrapper[4886]: I0314 09:45:02.983286 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb4de91b-9f87-463a-bc4e-41203e9eb3b8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cb4de91b-9f87-463a-bc4e-41203e9eb3b8" (UID: "cb4de91b-9f87-463a-bc4e-41203e9eb3b8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:45:03 crc kubenswrapper[4886]: I0314 09:45:03.060746 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vw4q\" (UniqueName: \"kubernetes.io/projected/cb4de91b-9f87-463a-bc4e-41203e9eb3b8-kube-api-access-7vw4q\") on node \"crc\" DevicePath \"\"" Mar 14 09:45:03 crc kubenswrapper[4886]: I0314 09:45:03.060786 4886 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb4de91b-9f87-463a-bc4e-41203e9eb3b8-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 09:45:03 crc kubenswrapper[4886]: I0314 09:45:03.060798 4886 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb4de91b-9f87-463a-bc4e-41203e9eb3b8-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 09:45:03 crc kubenswrapper[4886]: I0314 09:45:03.451475 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-9bkgj" 
event={"ID":"cb4de91b-9f87-463a-bc4e-41203e9eb3b8","Type":"ContainerDied","Data":"300c5c6d654f1f4c75237d4a7a94c52f0ca598c0a1ba0858484f881719859b31"} Mar 14 09:45:03 crc kubenswrapper[4886]: I0314 09:45:03.451537 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="300c5c6d654f1f4c75237d4a7a94c52f0ca598c0a1ba0858484f881719859b31" Mar 14 09:45:03 crc kubenswrapper[4886]: I0314 09:45:03.451540 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558025-9bkgj" Mar 14 09:45:03 crc kubenswrapper[4886]: I0314 09:45:03.931940 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557980-6cddc"] Mar 14 09:45:03 crc kubenswrapper[4886]: I0314 09:45:03.947622 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557980-6cddc"] Mar 14 09:45:05 crc kubenswrapper[4886]: I0314 09:45:05.445406 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3c9a265-82e9-4726-bda9-f6c6111dc1dc" path="/var/lib/kubelet/pods/e3c9a265-82e9-4726-bda9-f6c6111dc1dc/volumes" Mar 14 09:45:37 crc kubenswrapper[4886]: I0314 09:45:37.517701 4886 scope.go:117] "RemoveContainer" containerID="973350fda5424a10b0f4c4bb8ac5973c1b6c73e577bfc8465011f28d3db4f4f6" Mar 14 09:46:00 crc kubenswrapper[4886]: I0314 09:46:00.151223 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558026-zkkts"] Mar 14 09:46:00 crc kubenswrapper[4886]: E0314 09:46:00.152308 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb4de91b-9f87-463a-bc4e-41203e9eb3b8" containerName="collect-profiles" Mar 14 09:46:00 crc kubenswrapper[4886]: I0314 09:46:00.152324 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb4de91b-9f87-463a-bc4e-41203e9eb3b8" containerName="collect-profiles" Mar 14 09:46:00 crc 
kubenswrapper[4886]: I0314 09:46:00.152570 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb4de91b-9f87-463a-bc4e-41203e9eb3b8" containerName="collect-profiles" Mar 14 09:46:00 crc kubenswrapper[4886]: I0314 09:46:00.153413 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558026-zkkts" Mar 14 09:46:00 crc kubenswrapper[4886]: I0314 09:46:00.156089 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 09:46:00 crc kubenswrapper[4886]: I0314 09:46:00.156131 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:46:00 crc kubenswrapper[4886]: I0314 09:46:00.156159 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:46:00 crc kubenswrapper[4886]: I0314 09:46:00.157609 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558026-zkkts"] Mar 14 09:46:00 crc kubenswrapper[4886]: I0314 09:46:00.218104 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkh75\" (UniqueName: \"kubernetes.io/projected/742dc010-adf1-4c5a-ade4-dd961148b6b7-kube-api-access-pkh75\") pod \"auto-csr-approver-29558026-zkkts\" (UID: \"742dc010-adf1-4c5a-ade4-dd961148b6b7\") " pod="openshift-infra/auto-csr-approver-29558026-zkkts" Mar 14 09:46:00 crc kubenswrapper[4886]: I0314 09:46:00.319347 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkh75\" (UniqueName: \"kubernetes.io/projected/742dc010-adf1-4c5a-ade4-dd961148b6b7-kube-api-access-pkh75\") pod \"auto-csr-approver-29558026-zkkts\" (UID: \"742dc010-adf1-4c5a-ade4-dd961148b6b7\") " pod="openshift-infra/auto-csr-approver-29558026-zkkts" Mar 14 09:46:00 crc kubenswrapper[4886]: I0314 09:46:00.339882 
4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkh75\" (UniqueName: \"kubernetes.io/projected/742dc010-adf1-4c5a-ade4-dd961148b6b7-kube-api-access-pkh75\") pod \"auto-csr-approver-29558026-zkkts\" (UID: \"742dc010-adf1-4c5a-ade4-dd961148b6b7\") " pod="openshift-infra/auto-csr-approver-29558026-zkkts" Mar 14 09:46:00 crc kubenswrapper[4886]: I0314 09:46:00.475385 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558026-zkkts" Mar 14 09:46:00 crc kubenswrapper[4886]: I0314 09:46:00.937984 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558026-zkkts"] Mar 14 09:46:00 crc kubenswrapper[4886]: I0314 09:46:00.948086 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 09:46:00 crc kubenswrapper[4886]: I0314 09:46:00.982782 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558026-zkkts" event={"ID":"742dc010-adf1-4c5a-ade4-dd961148b6b7","Type":"ContainerStarted","Data":"d0609f96ed5ccc2fdb155a5330cc9b685833ed3c2a527672273eb773a36e0032"} Mar 14 09:46:03 crc kubenswrapper[4886]: I0314 09:46:03.000765 4886 generic.go:334] "Generic (PLEG): container finished" podID="742dc010-adf1-4c5a-ade4-dd961148b6b7" containerID="d31c5bf949c0cf07947cf473a65c5ebb6e510151260cf0439cbec88718175683" exitCode=0 Mar 14 09:46:03 crc kubenswrapper[4886]: I0314 09:46:03.000830 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558026-zkkts" event={"ID":"742dc010-adf1-4c5a-ade4-dd961148b6b7","Type":"ContainerDied","Data":"d31c5bf949c0cf07947cf473a65c5ebb6e510151260cf0439cbec88718175683"} Mar 14 09:46:04 crc kubenswrapper[4886]: I0314 09:46:04.441553 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558026-zkkts" Mar 14 09:46:04 crc kubenswrapper[4886]: I0314 09:46:04.506900 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkh75\" (UniqueName: \"kubernetes.io/projected/742dc010-adf1-4c5a-ade4-dd961148b6b7-kube-api-access-pkh75\") pod \"742dc010-adf1-4c5a-ade4-dd961148b6b7\" (UID: \"742dc010-adf1-4c5a-ade4-dd961148b6b7\") " Mar 14 09:46:04 crc kubenswrapper[4886]: I0314 09:46:04.512488 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/742dc010-adf1-4c5a-ade4-dd961148b6b7-kube-api-access-pkh75" (OuterVolumeSpecName: "kube-api-access-pkh75") pod "742dc010-adf1-4c5a-ade4-dd961148b6b7" (UID: "742dc010-adf1-4c5a-ade4-dd961148b6b7"). InnerVolumeSpecName "kube-api-access-pkh75". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:46:04 crc kubenswrapper[4886]: I0314 09:46:04.608808 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkh75\" (UniqueName: \"kubernetes.io/projected/742dc010-adf1-4c5a-ade4-dd961148b6b7-kube-api-access-pkh75\") on node \"crc\" DevicePath \"\"" Mar 14 09:46:05 crc kubenswrapper[4886]: I0314 09:46:05.026251 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558026-zkkts" event={"ID":"742dc010-adf1-4c5a-ade4-dd961148b6b7","Type":"ContainerDied","Data":"d0609f96ed5ccc2fdb155a5330cc9b685833ed3c2a527672273eb773a36e0032"} Mar 14 09:46:05 crc kubenswrapper[4886]: I0314 09:46:05.026549 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0609f96ed5ccc2fdb155a5330cc9b685833ed3c2a527672273eb773a36e0032" Mar 14 09:46:05 crc kubenswrapper[4886]: I0314 09:46:05.026604 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558026-zkkts" Mar 14 09:46:05 crc kubenswrapper[4886]: I0314 09:46:05.525555 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558020-ddgfg"] Mar 14 09:46:05 crc kubenswrapper[4886]: I0314 09:46:05.533419 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558020-ddgfg"] Mar 14 09:46:07 crc kubenswrapper[4886]: I0314 09:46:07.431691 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8ceb013-9172-4aef-a7fd-2cebdf3f7d04" path="/var/lib/kubelet/pods/b8ceb013-9172-4aef-a7fd-2cebdf3f7d04/volumes" Mar 14 09:46:37 crc kubenswrapper[4886]: I0314 09:46:37.630623 4886 scope.go:117] "RemoveContainer" containerID="e7006e1c2a7612d626f174bb6e1137c96ae29c0e177b327bd0873ca31d7c9c7a" Mar 14 09:47:14 crc kubenswrapper[4886]: I0314 09:47:14.687180 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z4xzv"] Mar 14 09:47:14 crc kubenswrapper[4886]: E0314 09:47:14.688735 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="742dc010-adf1-4c5a-ade4-dd961148b6b7" containerName="oc" Mar 14 09:47:14 crc kubenswrapper[4886]: I0314 09:47:14.688754 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="742dc010-adf1-4c5a-ade4-dd961148b6b7" containerName="oc" Mar 14 09:47:14 crc kubenswrapper[4886]: I0314 09:47:14.689030 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="742dc010-adf1-4c5a-ade4-dd961148b6b7" containerName="oc" Mar 14 09:47:14 crc kubenswrapper[4886]: I0314 09:47:14.690938 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z4xzv" Mar 14 09:47:14 crc kubenswrapper[4886]: I0314 09:47:14.706842 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z4xzv"] Mar 14 09:47:14 crc kubenswrapper[4886]: I0314 09:47:14.760655 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1b3bdb5-fa04-42ce-bea0-c6597e8cd980-catalog-content\") pod \"certified-operators-z4xzv\" (UID: \"a1b3bdb5-fa04-42ce-bea0-c6597e8cd980\") " pod="openshift-marketplace/certified-operators-z4xzv" Mar 14 09:47:14 crc kubenswrapper[4886]: I0314 09:47:14.760731 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tmq2\" (UniqueName: \"kubernetes.io/projected/a1b3bdb5-fa04-42ce-bea0-c6597e8cd980-kube-api-access-6tmq2\") pod \"certified-operators-z4xzv\" (UID: \"a1b3bdb5-fa04-42ce-bea0-c6597e8cd980\") " pod="openshift-marketplace/certified-operators-z4xzv" Mar 14 09:47:14 crc kubenswrapper[4886]: I0314 09:47:14.760864 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1b3bdb5-fa04-42ce-bea0-c6597e8cd980-utilities\") pod \"certified-operators-z4xzv\" (UID: \"a1b3bdb5-fa04-42ce-bea0-c6597e8cd980\") " pod="openshift-marketplace/certified-operators-z4xzv" Mar 14 09:47:14 crc kubenswrapper[4886]: I0314 09:47:14.862344 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1b3bdb5-fa04-42ce-bea0-c6597e8cd980-utilities\") pod \"certified-operators-z4xzv\" (UID: \"a1b3bdb5-fa04-42ce-bea0-c6597e8cd980\") " pod="openshift-marketplace/certified-operators-z4xzv" Mar 14 09:47:14 crc kubenswrapper[4886]: I0314 09:47:14.862549 4886 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1b3bdb5-fa04-42ce-bea0-c6597e8cd980-catalog-content\") pod \"certified-operators-z4xzv\" (UID: \"a1b3bdb5-fa04-42ce-bea0-c6597e8cd980\") " pod="openshift-marketplace/certified-operators-z4xzv" Mar 14 09:47:14 crc kubenswrapper[4886]: I0314 09:47:14.862576 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tmq2\" (UniqueName: \"kubernetes.io/projected/a1b3bdb5-fa04-42ce-bea0-c6597e8cd980-kube-api-access-6tmq2\") pod \"certified-operators-z4xzv\" (UID: \"a1b3bdb5-fa04-42ce-bea0-c6597e8cd980\") " pod="openshift-marketplace/certified-operators-z4xzv" Mar 14 09:47:14 crc kubenswrapper[4886]: I0314 09:47:14.862852 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1b3bdb5-fa04-42ce-bea0-c6597e8cd980-utilities\") pod \"certified-operators-z4xzv\" (UID: \"a1b3bdb5-fa04-42ce-bea0-c6597e8cd980\") " pod="openshift-marketplace/certified-operators-z4xzv" Mar 14 09:47:14 crc kubenswrapper[4886]: I0314 09:47:14.863054 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1b3bdb5-fa04-42ce-bea0-c6597e8cd980-catalog-content\") pod \"certified-operators-z4xzv\" (UID: \"a1b3bdb5-fa04-42ce-bea0-c6597e8cd980\") " pod="openshift-marketplace/certified-operators-z4xzv" Mar 14 09:47:14 crc kubenswrapper[4886]: I0314 09:47:14.880813 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tmq2\" (UniqueName: \"kubernetes.io/projected/a1b3bdb5-fa04-42ce-bea0-c6597e8cd980-kube-api-access-6tmq2\") pod \"certified-operators-z4xzv\" (UID: \"a1b3bdb5-fa04-42ce-bea0-c6597e8cd980\") " pod="openshift-marketplace/certified-operators-z4xzv" Mar 14 09:47:15 crc kubenswrapper[4886]: I0314 09:47:15.014305 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z4xzv" Mar 14 09:47:15 crc kubenswrapper[4886]: I0314 09:47:15.622235 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z4xzv"] Mar 14 09:47:15 crc kubenswrapper[4886]: I0314 09:47:15.735801 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4xzv" event={"ID":"a1b3bdb5-fa04-42ce-bea0-c6597e8cd980","Type":"ContainerStarted","Data":"78c9bc9ce5ea931ee69b536923b9b72988883c1d3e235e2731d99658444a3ae8"} Mar 14 09:47:16 crc kubenswrapper[4886]: I0314 09:47:16.750353 4886 generic.go:334] "Generic (PLEG): container finished" podID="a1b3bdb5-fa04-42ce-bea0-c6597e8cd980" containerID="4ea2d2678db3ea1ce3f934d3d3964644ec0d1b03544364324a8bc033db8a605f" exitCode=0 Mar 14 09:47:16 crc kubenswrapper[4886]: I0314 09:47:16.750428 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4xzv" event={"ID":"a1b3bdb5-fa04-42ce-bea0-c6597e8cd980","Type":"ContainerDied","Data":"4ea2d2678db3ea1ce3f934d3d3964644ec0d1b03544364324a8bc033db8a605f"} Mar 14 09:47:17 crc kubenswrapper[4886]: I0314 09:47:17.777102 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4xzv" event={"ID":"a1b3bdb5-fa04-42ce-bea0-c6597e8cd980","Type":"ContainerStarted","Data":"72c6918f5a81a48321f5207657d16cc64d9cffa1bde91d129e462ce698a000e6"} Mar 14 09:47:18 crc kubenswrapper[4886]: I0314 09:47:18.788676 4886 generic.go:334] "Generic (PLEG): container finished" podID="a1b3bdb5-fa04-42ce-bea0-c6597e8cd980" containerID="72c6918f5a81a48321f5207657d16cc64d9cffa1bde91d129e462ce698a000e6" exitCode=0 Mar 14 09:47:18 crc kubenswrapper[4886]: I0314 09:47:18.788736 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4xzv" 
event={"ID":"a1b3bdb5-fa04-42ce-bea0-c6597e8cd980","Type":"ContainerDied","Data":"72c6918f5a81a48321f5207657d16cc64d9cffa1bde91d129e462ce698a000e6"} Mar 14 09:47:20 crc kubenswrapper[4886]: I0314 09:47:20.813022 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4xzv" event={"ID":"a1b3bdb5-fa04-42ce-bea0-c6597e8cd980","Type":"ContainerStarted","Data":"7628857e7703629f796283831b1892d24234b3ee6227e70bc1837422ea3a2308"} Mar 14 09:47:20 crc kubenswrapper[4886]: I0314 09:47:20.841135 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z4xzv" podStartSLOduration=4.397347244 podStartE2EDuration="6.841097707s" podCreationTimestamp="2026-03-14 09:47:14 +0000 UTC" firstStartedPulling="2026-03-14 09:47:16.753047566 +0000 UTC m=+4772.001499203" lastFinishedPulling="2026-03-14 09:47:19.196797989 +0000 UTC m=+4774.445249666" observedRunningTime="2026-03-14 09:47:20.831571406 +0000 UTC m=+4776.080023063" watchObservedRunningTime="2026-03-14 09:47:20.841097707 +0000 UTC m=+4776.089549344" Mar 14 09:47:25 crc kubenswrapper[4886]: I0314 09:47:25.014666 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z4xzv" Mar 14 09:47:25 crc kubenswrapper[4886]: I0314 09:47:25.015372 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-z4xzv" Mar 14 09:47:25 crc kubenswrapper[4886]: I0314 09:47:25.064349 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z4xzv" Mar 14 09:47:25 crc kubenswrapper[4886]: I0314 09:47:25.905990 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z4xzv" Mar 14 09:47:25 crc kubenswrapper[4886]: I0314 09:47:25.957411 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-z4xzv"] Mar 14 09:47:26 crc kubenswrapper[4886]: I0314 09:47:26.065718 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:47:26 crc kubenswrapper[4886]: I0314 09:47:26.066078 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:47:27 crc kubenswrapper[4886]: I0314 09:47:27.874051 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-z4xzv" podUID="a1b3bdb5-fa04-42ce-bea0-c6597e8cd980" containerName="registry-server" containerID="cri-o://7628857e7703629f796283831b1892d24234b3ee6227e70bc1837422ea3a2308" gracePeriod=2 Mar 14 09:47:28 crc kubenswrapper[4886]: I0314 09:47:28.392164 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z4xzv" Mar 14 09:47:28 crc kubenswrapper[4886]: I0314 09:47:28.551793 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1b3bdb5-fa04-42ce-bea0-c6597e8cd980-catalog-content\") pod \"a1b3bdb5-fa04-42ce-bea0-c6597e8cd980\" (UID: \"a1b3bdb5-fa04-42ce-bea0-c6597e8cd980\") " Mar 14 09:47:28 crc kubenswrapper[4886]: I0314 09:47:28.551860 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tmq2\" (UniqueName: \"kubernetes.io/projected/a1b3bdb5-fa04-42ce-bea0-c6597e8cd980-kube-api-access-6tmq2\") pod \"a1b3bdb5-fa04-42ce-bea0-c6597e8cd980\" (UID: \"a1b3bdb5-fa04-42ce-bea0-c6597e8cd980\") " Mar 14 09:47:28 crc kubenswrapper[4886]: I0314 09:47:28.551933 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1b3bdb5-fa04-42ce-bea0-c6597e8cd980-utilities\") pod \"a1b3bdb5-fa04-42ce-bea0-c6597e8cd980\" (UID: \"a1b3bdb5-fa04-42ce-bea0-c6597e8cd980\") " Mar 14 09:47:28 crc kubenswrapper[4886]: I0314 09:47:28.553102 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1b3bdb5-fa04-42ce-bea0-c6597e8cd980-utilities" (OuterVolumeSpecName: "utilities") pod "a1b3bdb5-fa04-42ce-bea0-c6597e8cd980" (UID: "a1b3bdb5-fa04-42ce-bea0-c6597e8cd980"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:47:28 crc kubenswrapper[4886]: I0314 09:47:28.562375 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1b3bdb5-fa04-42ce-bea0-c6597e8cd980-kube-api-access-6tmq2" (OuterVolumeSpecName: "kube-api-access-6tmq2") pod "a1b3bdb5-fa04-42ce-bea0-c6597e8cd980" (UID: "a1b3bdb5-fa04-42ce-bea0-c6597e8cd980"). InnerVolumeSpecName "kube-api-access-6tmq2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:47:28 crc kubenswrapper[4886]: I0314 09:47:28.598419 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1b3bdb5-fa04-42ce-bea0-c6597e8cd980-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a1b3bdb5-fa04-42ce-bea0-c6597e8cd980" (UID: "a1b3bdb5-fa04-42ce-bea0-c6597e8cd980"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:47:28 crc kubenswrapper[4886]: I0314 09:47:28.654020 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1b3bdb5-fa04-42ce-bea0-c6597e8cd980-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:47:28 crc kubenswrapper[4886]: I0314 09:47:28.654334 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tmq2\" (UniqueName: \"kubernetes.io/projected/a1b3bdb5-fa04-42ce-bea0-c6597e8cd980-kube-api-access-6tmq2\") on node \"crc\" DevicePath \"\"" Mar 14 09:47:28 crc kubenswrapper[4886]: I0314 09:47:28.654348 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1b3bdb5-fa04-42ce-bea0-c6597e8cd980-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:47:28 crc kubenswrapper[4886]: I0314 09:47:28.885461 4886 generic.go:334] "Generic (PLEG): container finished" podID="a1b3bdb5-fa04-42ce-bea0-c6597e8cd980" containerID="7628857e7703629f796283831b1892d24234b3ee6227e70bc1837422ea3a2308" exitCode=0 Mar 14 09:47:28 crc kubenswrapper[4886]: I0314 09:47:28.885507 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4xzv" event={"ID":"a1b3bdb5-fa04-42ce-bea0-c6597e8cd980","Type":"ContainerDied","Data":"7628857e7703629f796283831b1892d24234b3ee6227e70bc1837422ea3a2308"} Mar 14 09:47:28 crc kubenswrapper[4886]: I0314 09:47:28.885534 4886 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-z4xzv" event={"ID":"a1b3bdb5-fa04-42ce-bea0-c6597e8cd980","Type":"ContainerDied","Data":"78c9bc9ce5ea931ee69b536923b9b72988883c1d3e235e2731d99658444a3ae8"} Mar 14 09:47:28 crc kubenswrapper[4886]: I0314 09:47:28.885533 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z4xzv" Mar 14 09:47:28 crc kubenswrapper[4886]: I0314 09:47:28.885558 4886 scope.go:117] "RemoveContainer" containerID="7628857e7703629f796283831b1892d24234b3ee6227e70bc1837422ea3a2308" Mar 14 09:47:28 crc kubenswrapper[4886]: I0314 09:47:28.924418 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z4xzv"] Mar 14 09:47:28 crc kubenswrapper[4886]: I0314 09:47:28.933607 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-z4xzv"] Mar 14 09:47:28 crc kubenswrapper[4886]: I0314 09:47:28.935966 4886 scope.go:117] "RemoveContainer" containerID="72c6918f5a81a48321f5207657d16cc64d9cffa1bde91d129e462ce698a000e6" Mar 14 09:47:28 crc kubenswrapper[4886]: I0314 09:47:28.975539 4886 scope.go:117] "RemoveContainer" containerID="4ea2d2678db3ea1ce3f934d3d3964644ec0d1b03544364324a8bc033db8a605f" Mar 14 09:47:29 crc kubenswrapper[4886]: I0314 09:47:29.011460 4886 scope.go:117] "RemoveContainer" containerID="7628857e7703629f796283831b1892d24234b3ee6227e70bc1837422ea3a2308" Mar 14 09:47:29 crc kubenswrapper[4886]: E0314 09:47:29.011944 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7628857e7703629f796283831b1892d24234b3ee6227e70bc1837422ea3a2308\": container with ID starting with 7628857e7703629f796283831b1892d24234b3ee6227e70bc1837422ea3a2308 not found: ID does not exist" containerID="7628857e7703629f796283831b1892d24234b3ee6227e70bc1837422ea3a2308" Mar 14 09:47:29 crc kubenswrapper[4886]: I0314 
09:47:29.011996 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7628857e7703629f796283831b1892d24234b3ee6227e70bc1837422ea3a2308"} err="failed to get container status \"7628857e7703629f796283831b1892d24234b3ee6227e70bc1837422ea3a2308\": rpc error: code = NotFound desc = could not find container \"7628857e7703629f796283831b1892d24234b3ee6227e70bc1837422ea3a2308\": container with ID starting with 7628857e7703629f796283831b1892d24234b3ee6227e70bc1837422ea3a2308 not found: ID does not exist" Mar 14 09:47:29 crc kubenswrapper[4886]: I0314 09:47:29.012030 4886 scope.go:117] "RemoveContainer" containerID="72c6918f5a81a48321f5207657d16cc64d9cffa1bde91d129e462ce698a000e6" Mar 14 09:47:29 crc kubenswrapper[4886]: E0314 09:47:29.012584 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72c6918f5a81a48321f5207657d16cc64d9cffa1bde91d129e462ce698a000e6\": container with ID starting with 72c6918f5a81a48321f5207657d16cc64d9cffa1bde91d129e462ce698a000e6 not found: ID does not exist" containerID="72c6918f5a81a48321f5207657d16cc64d9cffa1bde91d129e462ce698a000e6" Mar 14 09:47:29 crc kubenswrapper[4886]: I0314 09:47:29.012651 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72c6918f5a81a48321f5207657d16cc64d9cffa1bde91d129e462ce698a000e6"} err="failed to get container status \"72c6918f5a81a48321f5207657d16cc64d9cffa1bde91d129e462ce698a000e6\": rpc error: code = NotFound desc = could not find container \"72c6918f5a81a48321f5207657d16cc64d9cffa1bde91d129e462ce698a000e6\": container with ID starting with 72c6918f5a81a48321f5207657d16cc64d9cffa1bde91d129e462ce698a000e6 not found: ID does not exist" Mar 14 09:47:29 crc kubenswrapper[4886]: I0314 09:47:29.012684 4886 scope.go:117] "RemoveContainer" containerID="4ea2d2678db3ea1ce3f934d3d3964644ec0d1b03544364324a8bc033db8a605f" Mar 14 09:47:29 crc 
kubenswrapper[4886]: E0314 09:47:29.013154 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ea2d2678db3ea1ce3f934d3d3964644ec0d1b03544364324a8bc033db8a605f\": container with ID starting with 4ea2d2678db3ea1ce3f934d3d3964644ec0d1b03544364324a8bc033db8a605f not found: ID does not exist" containerID="4ea2d2678db3ea1ce3f934d3d3964644ec0d1b03544364324a8bc033db8a605f" Mar 14 09:47:29 crc kubenswrapper[4886]: I0314 09:47:29.013173 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ea2d2678db3ea1ce3f934d3d3964644ec0d1b03544364324a8bc033db8a605f"} err="failed to get container status \"4ea2d2678db3ea1ce3f934d3d3964644ec0d1b03544364324a8bc033db8a605f\": rpc error: code = NotFound desc = could not find container \"4ea2d2678db3ea1ce3f934d3d3964644ec0d1b03544364324a8bc033db8a605f\": container with ID starting with 4ea2d2678db3ea1ce3f934d3d3964644ec0d1b03544364324a8bc033db8a605f not found: ID does not exist" Mar 14 09:47:29 crc kubenswrapper[4886]: I0314 09:47:29.439648 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1b3bdb5-fa04-42ce-bea0-c6597e8cd980" path="/var/lib/kubelet/pods/a1b3bdb5-fa04-42ce-bea0-c6597e8cd980/volumes" Mar 14 09:47:56 crc kubenswrapper[4886]: I0314 09:47:56.065995 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:47:56 crc kubenswrapper[4886]: I0314 09:47:56.066743 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused"
Mar 14 09:48:00 crc kubenswrapper[4886]: I0314 09:48:00.164784 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558028-v7hqx"]
Mar 14 09:48:00 crc kubenswrapper[4886]: E0314 09:48:00.165899 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1b3bdb5-fa04-42ce-bea0-c6597e8cd980" containerName="extract-utilities"
Mar 14 09:48:00 crc kubenswrapper[4886]: I0314 09:48:00.165914 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1b3bdb5-fa04-42ce-bea0-c6597e8cd980" containerName="extract-utilities"
Mar 14 09:48:00 crc kubenswrapper[4886]: E0314 09:48:00.165932 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1b3bdb5-fa04-42ce-bea0-c6597e8cd980" containerName="registry-server"
Mar 14 09:48:00 crc kubenswrapper[4886]: I0314 09:48:00.165940 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1b3bdb5-fa04-42ce-bea0-c6597e8cd980" containerName="registry-server"
Mar 14 09:48:00 crc kubenswrapper[4886]: E0314 09:48:00.165966 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1b3bdb5-fa04-42ce-bea0-c6597e8cd980" containerName="extract-content"
Mar 14 09:48:00 crc kubenswrapper[4886]: I0314 09:48:00.165973 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1b3bdb5-fa04-42ce-bea0-c6597e8cd980" containerName="extract-content"
Mar 14 09:48:00 crc kubenswrapper[4886]: I0314 09:48:00.166177 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1b3bdb5-fa04-42ce-bea0-c6597e8cd980" containerName="registry-server"
Mar 14 09:48:00 crc kubenswrapper[4886]: I0314 09:48:00.166920 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558028-v7hqx"
Mar 14 09:48:00 crc kubenswrapper[4886]: I0314 09:48:00.169557 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 09:48:00 crc kubenswrapper[4886]: I0314 09:48:00.169837 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 09:48:00 crc kubenswrapper[4886]: I0314 09:48:00.171020 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp"
Mar 14 09:48:00 crc kubenswrapper[4886]: I0314 09:48:00.182357 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558028-v7hqx"]
Mar 14 09:48:00 crc kubenswrapper[4886]: I0314 09:48:00.266914 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2jzm\" (UniqueName: \"kubernetes.io/projected/5bcf77a7-3bd8-492a-bed5-e65b2f8311e1-kube-api-access-v2jzm\") pod \"auto-csr-approver-29558028-v7hqx\" (UID: \"5bcf77a7-3bd8-492a-bed5-e65b2f8311e1\") " pod="openshift-infra/auto-csr-approver-29558028-v7hqx"
Mar 14 09:48:00 crc kubenswrapper[4886]: I0314 09:48:00.369831 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2jzm\" (UniqueName: \"kubernetes.io/projected/5bcf77a7-3bd8-492a-bed5-e65b2f8311e1-kube-api-access-v2jzm\") pod \"auto-csr-approver-29558028-v7hqx\" (UID: \"5bcf77a7-3bd8-492a-bed5-e65b2f8311e1\") " pod="openshift-infra/auto-csr-approver-29558028-v7hqx"
Mar 14 09:48:00 crc kubenswrapper[4886]: I0314 09:48:00.389660 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2jzm\" (UniqueName: \"kubernetes.io/projected/5bcf77a7-3bd8-492a-bed5-e65b2f8311e1-kube-api-access-v2jzm\") pod \"auto-csr-approver-29558028-v7hqx\" (UID: \"5bcf77a7-3bd8-492a-bed5-e65b2f8311e1\") " pod="openshift-infra/auto-csr-approver-29558028-v7hqx"
Mar 14 09:48:00 crc kubenswrapper[4886]: I0314 09:48:00.488478 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558028-v7hqx"
Mar 14 09:48:00 crc kubenswrapper[4886]: I0314 09:48:00.946099 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558028-v7hqx"]
Mar 14 09:48:01 crc kubenswrapper[4886]: I0314 09:48:01.229345 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558028-v7hqx" event={"ID":"5bcf77a7-3bd8-492a-bed5-e65b2f8311e1","Type":"ContainerStarted","Data":"2484277cb36e20d4198ecd03bf2bdc220f1751837575cd81a80265b64c285267"}
Mar 14 09:48:02 crc kubenswrapper[4886]: I0314 09:48:02.240776 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558028-v7hqx" event={"ID":"5bcf77a7-3bd8-492a-bed5-e65b2f8311e1","Type":"ContainerStarted","Data":"7fc74959576499b1935c562a3461ec76a62eae9c5ee822da85b97a77ae866a33"}
Mar 14 09:48:02 crc kubenswrapper[4886]: I0314 09:48:02.265625 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29558028-v7hqx" podStartSLOduration=1.472401778 podStartE2EDuration="2.265603773s" podCreationTimestamp="2026-03-14 09:48:00 +0000 UTC" firstStartedPulling="2026-03-14 09:48:00.958555626 +0000 UTC m=+4816.207007253" lastFinishedPulling="2026-03-14 09:48:01.751757611 +0000 UTC m=+4817.000209248" observedRunningTime="2026-03-14 09:48:02.25673947 +0000 UTC m=+4817.505191137" watchObservedRunningTime="2026-03-14 09:48:02.265603773 +0000 UTC m=+4817.514055430"
Mar 14 09:48:03 crc kubenswrapper[4886]: I0314 09:48:03.253614 4886 generic.go:334] "Generic (PLEG): container finished" podID="5bcf77a7-3bd8-492a-bed5-e65b2f8311e1" containerID="7fc74959576499b1935c562a3461ec76a62eae9c5ee822da85b97a77ae866a33" exitCode=0
Mar 14 09:48:03 crc kubenswrapper[4886]: I0314 09:48:03.253684 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558028-v7hqx" event={"ID":"5bcf77a7-3bd8-492a-bed5-e65b2f8311e1","Type":"ContainerDied","Data":"7fc74959576499b1935c562a3461ec76a62eae9c5ee822da85b97a77ae866a33"}
Mar 14 09:48:04 crc kubenswrapper[4886]: I0314 09:48:04.716760 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558028-v7hqx"
Mar 14 09:48:04 crc kubenswrapper[4886]: I0314 09:48:04.784084 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2jzm\" (UniqueName: \"kubernetes.io/projected/5bcf77a7-3bd8-492a-bed5-e65b2f8311e1-kube-api-access-v2jzm\") pod \"5bcf77a7-3bd8-492a-bed5-e65b2f8311e1\" (UID: \"5bcf77a7-3bd8-492a-bed5-e65b2f8311e1\") "
Mar 14 09:48:04 crc kubenswrapper[4886]: I0314 09:48:04.790432 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bcf77a7-3bd8-492a-bed5-e65b2f8311e1-kube-api-access-v2jzm" (OuterVolumeSpecName: "kube-api-access-v2jzm") pod "5bcf77a7-3bd8-492a-bed5-e65b2f8311e1" (UID: "5bcf77a7-3bd8-492a-bed5-e65b2f8311e1"). InnerVolumeSpecName "kube-api-access-v2jzm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 09:48:04 crc kubenswrapper[4886]: I0314 09:48:04.887344 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2jzm\" (UniqueName: \"kubernetes.io/projected/5bcf77a7-3bd8-492a-bed5-e65b2f8311e1-kube-api-access-v2jzm\") on node \"crc\" DevicePath \"\""
Mar 14 09:48:05 crc kubenswrapper[4886]: I0314 09:48:05.277927 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558028-v7hqx" event={"ID":"5bcf77a7-3bd8-492a-bed5-e65b2f8311e1","Type":"ContainerDied","Data":"2484277cb36e20d4198ecd03bf2bdc220f1751837575cd81a80265b64c285267"}
Mar 14 09:48:05 crc kubenswrapper[4886]: I0314 09:48:05.278541 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2484277cb36e20d4198ecd03bf2bdc220f1751837575cd81a80265b64c285267"
Mar 14 09:48:05 crc kubenswrapper[4886]: I0314 09:48:05.277968 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558028-v7hqx"
Mar 14 09:48:05 crc kubenswrapper[4886]: I0314 09:48:05.372657 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558022-nfts2"]
Mar 14 09:48:05 crc kubenswrapper[4886]: I0314 09:48:05.380513 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558022-nfts2"]
Mar 14 09:48:05 crc kubenswrapper[4886]: I0314 09:48:05.431452 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a189f4a3-867c-46bb-9aa7-b4a434917b12" path="/var/lib/kubelet/pods/a189f4a3-867c-46bb-9aa7-b4a434917b12/volumes"
Mar 14 09:48:20 crc kubenswrapper[4886]: I0314 09:48:20.204257 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-27t7b"]
Mar 14 09:48:20 crc kubenswrapper[4886]: E0314 09:48:20.205103 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bcf77a7-3bd8-492a-bed5-e65b2f8311e1" containerName="oc"
Mar 14 09:48:20 crc kubenswrapper[4886]: I0314 09:48:20.205138 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bcf77a7-3bd8-492a-bed5-e65b2f8311e1" containerName="oc"
Mar 14 09:48:20 crc kubenswrapper[4886]: I0314 09:48:20.205353 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bcf77a7-3bd8-492a-bed5-e65b2f8311e1" containerName="oc"
Mar 14 09:48:20 crc kubenswrapper[4886]: I0314 09:48:20.206755 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27t7b"
Mar 14 09:48:20 crc kubenswrapper[4886]: I0314 09:48:20.227033 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-27t7b"]
Mar 14 09:48:20 crc kubenswrapper[4886]: I0314 09:48:20.236368 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aab0d41d-2ce0-4565-afd2-47259a2c232b-utilities\") pod \"redhat-marketplace-27t7b\" (UID: \"aab0d41d-2ce0-4565-afd2-47259a2c232b\") " pod="openshift-marketplace/redhat-marketplace-27t7b"
Mar 14 09:48:20 crc kubenswrapper[4886]: I0314 09:48:20.236506 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aab0d41d-2ce0-4565-afd2-47259a2c232b-catalog-content\") pod \"redhat-marketplace-27t7b\" (UID: \"aab0d41d-2ce0-4565-afd2-47259a2c232b\") " pod="openshift-marketplace/redhat-marketplace-27t7b"
Mar 14 09:48:20 crc kubenswrapper[4886]: I0314 09:48:20.236540 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phbs4\" (UniqueName: \"kubernetes.io/projected/aab0d41d-2ce0-4565-afd2-47259a2c232b-kube-api-access-phbs4\") pod \"redhat-marketplace-27t7b\" (UID: \"aab0d41d-2ce0-4565-afd2-47259a2c232b\") " pod="openshift-marketplace/redhat-marketplace-27t7b"
Mar 14 09:48:20 crc kubenswrapper[4886]: I0314 09:48:20.338141 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aab0d41d-2ce0-4565-afd2-47259a2c232b-catalog-content\") pod \"redhat-marketplace-27t7b\" (UID: \"aab0d41d-2ce0-4565-afd2-47259a2c232b\") " pod="openshift-marketplace/redhat-marketplace-27t7b"
Mar 14 09:48:20 crc kubenswrapper[4886]: I0314 09:48:20.338198 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phbs4\" (UniqueName: \"kubernetes.io/projected/aab0d41d-2ce0-4565-afd2-47259a2c232b-kube-api-access-phbs4\") pod \"redhat-marketplace-27t7b\" (UID: \"aab0d41d-2ce0-4565-afd2-47259a2c232b\") " pod="openshift-marketplace/redhat-marketplace-27t7b"
Mar 14 09:48:20 crc kubenswrapper[4886]: I0314 09:48:20.338282 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aab0d41d-2ce0-4565-afd2-47259a2c232b-utilities\") pod \"redhat-marketplace-27t7b\" (UID: \"aab0d41d-2ce0-4565-afd2-47259a2c232b\") " pod="openshift-marketplace/redhat-marketplace-27t7b"
Mar 14 09:48:20 crc kubenswrapper[4886]: I0314 09:48:20.338665 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aab0d41d-2ce0-4565-afd2-47259a2c232b-catalog-content\") pod \"redhat-marketplace-27t7b\" (UID: \"aab0d41d-2ce0-4565-afd2-47259a2c232b\") " pod="openshift-marketplace/redhat-marketplace-27t7b"
Mar 14 09:48:20 crc kubenswrapper[4886]: I0314 09:48:20.338737 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aab0d41d-2ce0-4565-afd2-47259a2c232b-utilities\") pod \"redhat-marketplace-27t7b\" (UID: \"aab0d41d-2ce0-4565-afd2-47259a2c232b\") " pod="openshift-marketplace/redhat-marketplace-27t7b"
Mar 14 09:48:20 crc kubenswrapper[4886]: I0314 09:48:20.357548 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phbs4\" (UniqueName: \"kubernetes.io/projected/aab0d41d-2ce0-4565-afd2-47259a2c232b-kube-api-access-phbs4\") pod \"redhat-marketplace-27t7b\" (UID: \"aab0d41d-2ce0-4565-afd2-47259a2c232b\") " pod="openshift-marketplace/redhat-marketplace-27t7b"
Mar 14 09:48:20 crc kubenswrapper[4886]: I0314 09:48:20.526228 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27t7b"
Mar 14 09:48:21 crc kubenswrapper[4886]: I0314 09:48:21.013933 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-27t7b"]
Mar 14 09:48:21 crc kubenswrapper[4886]: I0314 09:48:21.451142 4886 generic.go:334] "Generic (PLEG): container finished" podID="aab0d41d-2ce0-4565-afd2-47259a2c232b" containerID="442266d3aea855505d707ede041569b256c221cde5d5ae1c824d8a252d5f0528" exitCode=0
Mar 14 09:48:21 crc kubenswrapper[4886]: I0314 09:48:21.453489 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27t7b" event={"ID":"aab0d41d-2ce0-4565-afd2-47259a2c232b","Type":"ContainerDied","Data":"442266d3aea855505d707ede041569b256c221cde5d5ae1c824d8a252d5f0528"}
Mar 14 09:48:21 crc kubenswrapper[4886]: I0314 09:48:21.453652 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27t7b" event={"ID":"aab0d41d-2ce0-4565-afd2-47259a2c232b","Type":"ContainerStarted","Data":"05c86cac2adbfc1923b8a8d3eabf7c4fd8c810d7ba870797448521fdb3764635"}
Mar 14 09:48:22 crc kubenswrapper[4886]: I0314 09:48:22.467697 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27t7b" event={"ID":"aab0d41d-2ce0-4565-afd2-47259a2c232b","Type":"ContainerStarted","Data":"d1c100ef0af725c178e06a0d2fda6ac5e14d3c254a8ee2804ccad66c28e7c011"}
Mar 14 09:48:23 crc kubenswrapper[4886]: I0314 09:48:23.483236 4886 generic.go:334] "Generic (PLEG): container finished" podID="aab0d41d-2ce0-4565-afd2-47259a2c232b" containerID="d1c100ef0af725c178e06a0d2fda6ac5e14d3c254a8ee2804ccad66c28e7c011" exitCode=0
Mar 14 09:48:23 crc kubenswrapper[4886]: I0314 09:48:23.483318 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27t7b" event={"ID":"aab0d41d-2ce0-4565-afd2-47259a2c232b","Type":"ContainerDied","Data":"d1c100ef0af725c178e06a0d2fda6ac5e14d3c254a8ee2804ccad66c28e7c011"}
Mar 14 09:48:24 crc kubenswrapper[4886]: I0314 09:48:24.499335 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27t7b" event={"ID":"aab0d41d-2ce0-4565-afd2-47259a2c232b","Type":"ContainerStarted","Data":"3190e76da91f763622ce49a229ff7a2e41a4383a20d6c0fc0c53035d1e608184"}
Mar 14 09:48:24 crc kubenswrapper[4886]: I0314 09:48:24.528698 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-27t7b" podStartSLOduration=2.088354213 podStartE2EDuration="4.528673528s" podCreationTimestamp="2026-03-14 09:48:20 +0000 UTC" firstStartedPulling="2026-03-14 09:48:21.455990487 +0000 UTC m=+4836.704442134" lastFinishedPulling="2026-03-14 09:48:23.896309812 +0000 UTC m=+4839.144761449" observedRunningTime="2026-03-14 09:48:24.520640499 +0000 UTC m=+4839.769092136" watchObservedRunningTime="2026-03-14 09:48:24.528673528 +0000 UTC m=+4839.777125165"
Mar 14 09:48:26 crc kubenswrapper[4886]: I0314 09:48:26.066472 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 09:48:26 crc kubenswrapper[4886]: I0314 09:48:26.066857 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 09:48:26 crc kubenswrapper[4886]: I0314 09:48:26.066904 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ddctv"
Mar 14 09:48:26 crc kubenswrapper[4886]: I0314 09:48:26.067746 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7d0163d62d22d8130619611a4bc49f9ddca1c38d62e0385537e914e59680ef97"} pod="openshift-machine-config-operator/machine-config-daemon-ddctv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 14 09:48:26 crc kubenswrapper[4886]: I0314 09:48:26.067797 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" containerID="cri-o://7d0163d62d22d8130619611a4bc49f9ddca1c38d62e0385537e914e59680ef97" gracePeriod=600
Mar 14 09:48:26 crc kubenswrapper[4886]: I0314 09:48:26.524001 4886 generic.go:334] "Generic (PLEG): container finished" podID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerID="7d0163d62d22d8130619611a4bc49f9ddca1c38d62e0385537e914e59680ef97" exitCode=0
Mar 14 09:48:26 crc kubenswrapper[4886]: I0314 09:48:26.524073 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerDied","Data":"7d0163d62d22d8130619611a4bc49f9ddca1c38d62e0385537e914e59680ef97"}
Mar 14 09:48:26 crc kubenswrapper[4886]: I0314 09:48:26.524741 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerStarted","Data":"a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f"}
Mar 14 09:48:26 crc kubenswrapper[4886]: I0314 09:48:26.524828 4886 scope.go:117] "RemoveContainer" containerID="5ed5b5f60a1a4c5ad91e86e1ead33b0b8fe14161be79a015dfb47d44289e7b52"
Mar 14 09:48:30 crc kubenswrapper[4886]: I0314 09:48:30.527172 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-27t7b"
Mar 14 09:48:30 crc kubenswrapper[4886]: I0314 09:48:30.528039 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-27t7b"
Mar 14 09:48:30 crc kubenswrapper[4886]: I0314 09:48:30.612229 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-27t7b"
Mar 14 09:48:32 crc kubenswrapper[4886]: I0314 09:48:32.437020 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-27t7b"
Mar 14 09:48:32 crc kubenswrapper[4886]: I0314 09:48:32.506662 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-27t7b"]
Mar 14 09:48:33 crc kubenswrapper[4886]: I0314 09:48:33.610838 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-27t7b" podUID="aab0d41d-2ce0-4565-afd2-47259a2c232b" containerName="registry-server" containerID="cri-o://3190e76da91f763622ce49a229ff7a2e41a4383a20d6c0fc0c53035d1e608184" gracePeriod=2
Mar 14 09:48:34 crc kubenswrapper[4886]: I0314 09:48:34.111018 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27t7b"
Mar 14 09:48:34 crc kubenswrapper[4886]: I0314 09:48:34.278234 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phbs4\" (UniqueName: \"kubernetes.io/projected/aab0d41d-2ce0-4565-afd2-47259a2c232b-kube-api-access-phbs4\") pod \"aab0d41d-2ce0-4565-afd2-47259a2c232b\" (UID: \"aab0d41d-2ce0-4565-afd2-47259a2c232b\") "
Mar 14 09:48:34 crc kubenswrapper[4886]: I0314 09:48:34.278500 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aab0d41d-2ce0-4565-afd2-47259a2c232b-catalog-content\") pod \"aab0d41d-2ce0-4565-afd2-47259a2c232b\" (UID: \"aab0d41d-2ce0-4565-afd2-47259a2c232b\") "
Mar 14 09:48:34 crc kubenswrapper[4886]: I0314 09:48:34.278545 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aab0d41d-2ce0-4565-afd2-47259a2c232b-utilities\") pod \"aab0d41d-2ce0-4565-afd2-47259a2c232b\" (UID: \"aab0d41d-2ce0-4565-afd2-47259a2c232b\") "
Mar 14 09:48:34 crc kubenswrapper[4886]: I0314 09:48:34.279581 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aab0d41d-2ce0-4565-afd2-47259a2c232b-utilities" (OuterVolumeSpecName: "utilities") pod "aab0d41d-2ce0-4565-afd2-47259a2c232b" (UID: "aab0d41d-2ce0-4565-afd2-47259a2c232b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 09:48:34 crc kubenswrapper[4886]: I0314 09:48:34.285340 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aab0d41d-2ce0-4565-afd2-47259a2c232b-kube-api-access-phbs4" (OuterVolumeSpecName: "kube-api-access-phbs4") pod "aab0d41d-2ce0-4565-afd2-47259a2c232b" (UID: "aab0d41d-2ce0-4565-afd2-47259a2c232b"). InnerVolumeSpecName "kube-api-access-phbs4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 09:48:34 crc kubenswrapper[4886]: I0314 09:48:34.305378 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aab0d41d-2ce0-4565-afd2-47259a2c232b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aab0d41d-2ce0-4565-afd2-47259a2c232b" (UID: "aab0d41d-2ce0-4565-afd2-47259a2c232b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 09:48:34 crc kubenswrapper[4886]: I0314 09:48:34.381102 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aab0d41d-2ce0-4565-afd2-47259a2c232b-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 09:48:34 crc kubenswrapper[4886]: I0314 09:48:34.381158 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aab0d41d-2ce0-4565-afd2-47259a2c232b-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 09:48:34 crc kubenswrapper[4886]: I0314 09:48:34.381173 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phbs4\" (UniqueName: \"kubernetes.io/projected/aab0d41d-2ce0-4565-afd2-47259a2c232b-kube-api-access-phbs4\") on node \"crc\" DevicePath \"\""
Mar 14 09:48:34 crc kubenswrapper[4886]: I0314 09:48:34.625009 4886 generic.go:334] "Generic (PLEG): container finished" podID="aab0d41d-2ce0-4565-afd2-47259a2c232b" containerID="3190e76da91f763622ce49a229ff7a2e41a4383a20d6c0fc0c53035d1e608184" exitCode=0
Mar 14 09:48:34 crc kubenswrapper[4886]: I0314 09:48:34.625062 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27t7b" event={"ID":"aab0d41d-2ce0-4565-afd2-47259a2c232b","Type":"ContainerDied","Data":"3190e76da91f763622ce49a229ff7a2e41a4383a20d6c0fc0c53035d1e608184"}
Mar 14 09:48:34 crc kubenswrapper[4886]: I0314 09:48:34.625092 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27t7b" event={"ID":"aab0d41d-2ce0-4565-afd2-47259a2c232b","Type":"ContainerDied","Data":"05c86cac2adbfc1923b8a8d3eabf7c4fd8c810d7ba870797448521fdb3764635"}
Mar 14 09:48:34 crc kubenswrapper[4886]: I0314 09:48:34.625114 4886 scope.go:117] "RemoveContainer" containerID="3190e76da91f763622ce49a229ff7a2e41a4383a20d6c0fc0c53035d1e608184"
Mar 14 09:48:34 crc kubenswrapper[4886]: I0314 09:48:34.626685 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27t7b"
Mar 14 09:48:34 crc kubenswrapper[4886]: I0314 09:48:34.652417 4886 scope.go:117] "RemoveContainer" containerID="d1c100ef0af725c178e06a0d2fda6ac5e14d3c254a8ee2804ccad66c28e7c011"
Mar 14 09:48:34 crc kubenswrapper[4886]: I0314 09:48:34.681352 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-27t7b"]
Mar 14 09:48:34 crc kubenswrapper[4886]: I0314 09:48:34.695816 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-27t7b"]
Mar 14 09:48:34 crc kubenswrapper[4886]: I0314 09:48:34.709511 4886 scope.go:117] "RemoveContainer" containerID="442266d3aea855505d707ede041569b256c221cde5d5ae1c824d8a252d5f0528"
Mar 14 09:48:34 crc kubenswrapper[4886]: I0314 09:48:34.765261 4886 scope.go:117] "RemoveContainer" containerID="3190e76da91f763622ce49a229ff7a2e41a4383a20d6c0fc0c53035d1e608184"
Mar 14 09:48:34 crc kubenswrapper[4886]: E0314 09:48:34.765878 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3190e76da91f763622ce49a229ff7a2e41a4383a20d6c0fc0c53035d1e608184\": container with ID starting with 3190e76da91f763622ce49a229ff7a2e41a4383a20d6c0fc0c53035d1e608184 not found: ID does not exist" containerID="3190e76da91f763622ce49a229ff7a2e41a4383a20d6c0fc0c53035d1e608184"
Mar 14 09:48:34 crc kubenswrapper[4886]: I0314 09:48:34.765986 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3190e76da91f763622ce49a229ff7a2e41a4383a20d6c0fc0c53035d1e608184"} err="failed to get container status \"3190e76da91f763622ce49a229ff7a2e41a4383a20d6c0fc0c53035d1e608184\": rpc error: code = NotFound desc = could not find container \"3190e76da91f763622ce49a229ff7a2e41a4383a20d6c0fc0c53035d1e608184\": container with ID starting with 3190e76da91f763622ce49a229ff7a2e41a4383a20d6c0fc0c53035d1e608184 not found: ID does not exist"
Mar 14 09:48:34 crc kubenswrapper[4886]: I0314 09:48:34.766068 4886 scope.go:117] "RemoveContainer" containerID="d1c100ef0af725c178e06a0d2fda6ac5e14d3c254a8ee2804ccad66c28e7c011"
Mar 14 09:48:34 crc kubenswrapper[4886]: E0314 09:48:34.766421 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1c100ef0af725c178e06a0d2fda6ac5e14d3c254a8ee2804ccad66c28e7c011\": container with ID starting with d1c100ef0af725c178e06a0d2fda6ac5e14d3c254a8ee2804ccad66c28e7c011 not found: ID does not exist" containerID="d1c100ef0af725c178e06a0d2fda6ac5e14d3c254a8ee2804ccad66c28e7c011"
Mar 14 09:48:34 crc kubenswrapper[4886]: I0314 09:48:34.766528 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1c100ef0af725c178e06a0d2fda6ac5e14d3c254a8ee2804ccad66c28e7c011"} err="failed to get container status \"d1c100ef0af725c178e06a0d2fda6ac5e14d3c254a8ee2804ccad66c28e7c011\": rpc error: code = NotFound desc = could not find container \"d1c100ef0af725c178e06a0d2fda6ac5e14d3c254a8ee2804ccad66c28e7c011\": container with ID starting with d1c100ef0af725c178e06a0d2fda6ac5e14d3c254a8ee2804ccad66c28e7c011 not found: ID does not exist"
Mar 14 09:48:34 crc kubenswrapper[4886]: I0314 09:48:34.766604 4886 scope.go:117] "RemoveContainer" containerID="442266d3aea855505d707ede041569b256c221cde5d5ae1c824d8a252d5f0528"
Mar 14 09:48:34 crc kubenswrapper[4886]: E0314 09:48:34.766906 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"442266d3aea855505d707ede041569b256c221cde5d5ae1c824d8a252d5f0528\": container with ID starting with 442266d3aea855505d707ede041569b256c221cde5d5ae1c824d8a252d5f0528 not found: ID does not exist" containerID="442266d3aea855505d707ede041569b256c221cde5d5ae1c824d8a252d5f0528"
Mar 14 09:48:34 crc kubenswrapper[4886]: I0314 09:48:34.767005 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"442266d3aea855505d707ede041569b256c221cde5d5ae1c824d8a252d5f0528"} err="failed to get container status \"442266d3aea855505d707ede041569b256c221cde5d5ae1c824d8a252d5f0528\": rpc error: code = NotFound desc = could not find container \"442266d3aea855505d707ede041569b256c221cde5d5ae1c824d8a252d5f0528\": container with ID starting with 442266d3aea855505d707ede041569b256c221cde5d5ae1c824d8a252d5f0528 not found: ID does not exist"
Mar 14 09:48:35 crc kubenswrapper[4886]: I0314 09:48:35.433731 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aab0d41d-2ce0-4565-afd2-47259a2c232b" path="/var/lib/kubelet/pods/aab0d41d-2ce0-4565-afd2-47259a2c232b/volumes"
Mar 14 09:48:37 crc kubenswrapper[4886]: I0314 09:48:37.746727 4886 scope.go:117] "RemoveContainer" containerID="99ac067a5e543929b824d308c26a57bd05a594e91bd658d2c97b1ce004a10aec"
Mar 14 09:49:09 crc kubenswrapper[4886]: I0314 09:49:09.510541 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-768658d555-ttc2f" podUID="69b57a2a-532a-4728-89d6-090f17edc7a7" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502"
Mar 14 09:50:00 crc kubenswrapper[4886]: I0314 09:50:00.144546 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558030-5fj7d"]
Mar 14 09:50:00 crc kubenswrapper[4886]: E0314 09:50:00.145432 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aab0d41d-2ce0-4565-afd2-47259a2c232b" containerName="registry-server"
Mar 14 09:50:00 crc kubenswrapper[4886]: I0314 09:50:00.145447 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="aab0d41d-2ce0-4565-afd2-47259a2c232b" containerName="registry-server"
Mar 14 09:50:00 crc kubenswrapper[4886]: E0314 09:50:00.145460 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aab0d41d-2ce0-4565-afd2-47259a2c232b" containerName="extract-utilities"
Mar 14 09:50:00 crc kubenswrapper[4886]: I0314 09:50:00.145468 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="aab0d41d-2ce0-4565-afd2-47259a2c232b" containerName="extract-utilities"
Mar 14 09:50:00 crc kubenswrapper[4886]: E0314 09:50:00.145484 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aab0d41d-2ce0-4565-afd2-47259a2c232b" containerName="extract-content"
Mar 14 09:50:00 crc kubenswrapper[4886]: I0314 09:50:00.145491 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="aab0d41d-2ce0-4565-afd2-47259a2c232b" containerName="extract-content"
Mar 14 09:50:00 crc kubenswrapper[4886]: I0314 09:50:00.145661 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="aab0d41d-2ce0-4565-afd2-47259a2c232b" containerName="registry-server"
Mar 14 09:50:00 crc kubenswrapper[4886]: I0314 09:50:00.150651 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558030-5fj7d"
Mar 14 09:50:00 crc kubenswrapper[4886]: I0314 09:50:00.152949 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 14 09:50:00 crc kubenswrapper[4886]: I0314 09:50:00.153018 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp"
Mar 14 09:50:00 crc kubenswrapper[4886]: I0314 09:50:00.153069 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 14 09:50:00 crc kubenswrapper[4886]: I0314 09:50:00.173822 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558030-5fj7d"]
Mar 14 09:50:00 crc kubenswrapper[4886]: I0314 09:50:00.281611 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpc7z\" (UniqueName: \"kubernetes.io/projected/93d25cee-7775-463f-b63f-3bf9e5b91e9b-kube-api-access-bpc7z\") pod \"auto-csr-approver-29558030-5fj7d\" (UID: \"93d25cee-7775-463f-b63f-3bf9e5b91e9b\") " pod="openshift-infra/auto-csr-approver-29558030-5fj7d"
Mar 14 09:50:00 crc kubenswrapper[4886]: I0314 09:50:00.384958 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpc7z\" (UniqueName: \"kubernetes.io/projected/93d25cee-7775-463f-b63f-3bf9e5b91e9b-kube-api-access-bpc7z\") pod \"auto-csr-approver-29558030-5fj7d\" (UID: \"93d25cee-7775-463f-b63f-3bf9e5b91e9b\") " pod="openshift-infra/auto-csr-approver-29558030-5fj7d"
Mar 14 09:50:00 crc kubenswrapper[4886]: I0314 09:50:00.414232 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpc7z\" (UniqueName: \"kubernetes.io/projected/93d25cee-7775-463f-b63f-3bf9e5b91e9b-kube-api-access-bpc7z\") pod \"auto-csr-approver-29558030-5fj7d\" (UID: \"93d25cee-7775-463f-b63f-3bf9e5b91e9b\") " pod="openshift-infra/auto-csr-approver-29558030-5fj7d"
Mar 14 09:50:00 crc kubenswrapper[4886]: I0314 09:50:00.471285 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558030-5fj7d"
Mar 14 09:50:00 crc kubenswrapper[4886]: I0314 09:50:00.919912 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558030-5fj7d"]
Mar 14 09:50:01 crc kubenswrapper[4886]: I0314 09:50:01.580334 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558030-5fj7d" event={"ID":"93d25cee-7775-463f-b63f-3bf9e5b91e9b","Type":"ContainerStarted","Data":"1ca08c51f21444ddd53d96a704c0d150eead028217cd0658b45c7ffbf40e6bbc"}
Mar 14 09:50:02 crc kubenswrapper[4886]: I0314 09:50:02.589754 4886 generic.go:334] "Generic (PLEG): container finished" podID="93d25cee-7775-463f-b63f-3bf9e5b91e9b" containerID="d116af96eb5cceca38eee84a0eac7e8ea5641140a3c915c2304eb79d1806d979" exitCode=0
Mar 14 09:50:02 crc kubenswrapper[4886]: I0314 09:50:02.589863 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558030-5fj7d" event={"ID":"93d25cee-7775-463f-b63f-3bf9e5b91e9b","Type":"ContainerDied","Data":"d116af96eb5cceca38eee84a0eac7e8ea5641140a3c915c2304eb79d1806d979"}
Mar 14 09:50:04 crc kubenswrapper[4886]: I0314 09:50:04.052657 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558030-5fj7d"
Mar 14 09:50:04 crc kubenswrapper[4886]: I0314 09:50:04.155784 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpc7z\" (UniqueName: \"kubernetes.io/projected/93d25cee-7775-463f-b63f-3bf9e5b91e9b-kube-api-access-bpc7z\") pod \"93d25cee-7775-463f-b63f-3bf9e5b91e9b\" (UID: \"93d25cee-7775-463f-b63f-3bf9e5b91e9b\") "
Mar 14 09:50:04 crc kubenswrapper[4886]: I0314 09:50:04.162512 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93d25cee-7775-463f-b63f-3bf9e5b91e9b-kube-api-access-bpc7z" (OuterVolumeSpecName: "kube-api-access-bpc7z") pod "93d25cee-7775-463f-b63f-3bf9e5b91e9b" (UID: "93d25cee-7775-463f-b63f-3bf9e5b91e9b"). InnerVolumeSpecName "kube-api-access-bpc7z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 09:50:04 crc kubenswrapper[4886]: I0314 09:50:04.258187 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpc7z\" (UniqueName: \"kubernetes.io/projected/93d25cee-7775-463f-b63f-3bf9e5b91e9b-kube-api-access-bpc7z\") on node \"crc\" DevicePath \"\""
Mar 14 09:50:04 crc kubenswrapper[4886]: I0314 09:50:04.613826 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558030-5fj7d" event={"ID":"93d25cee-7775-463f-b63f-3bf9e5b91e9b","Type":"ContainerDied","Data":"1ca08c51f21444ddd53d96a704c0d150eead028217cd0658b45c7ffbf40e6bbc"}
Mar 14 09:50:04 crc kubenswrapper[4886]: I0314 09:50:04.614084 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ca08c51f21444ddd53d96a704c0d150eead028217cd0658b45c7ffbf40e6bbc"
Mar 14 09:50:04 crc kubenswrapper[4886]: I0314 09:50:04.613862 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558030-5fj7d"
Mar 14 09:50:05 crc kubenswrapper[4886]: I0314 09:50:05.128498 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558024-m7zjx"]
Mar 14 09:50:05 crc kubenswrapper[4886]: I0314 09:50:05.138276 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558024-m7zjx"]
Mar 14 09:50:05 crc kubenswrapper[4886]: I0314 09:50:05.440917 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7334611-e628-4d27-b713-df373d7f6918" path="/var/lib/kubelet/pods/c7334611-e628-4d27-b713-df373d7f6918/volumes"
Mar 14 09:50:26 crc kubenswrapper[4886]: I0314 09:50:26.065836 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 09:50:26 crc kubenswrapper[4886]: I0314 09:50:26.066437 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 14 09:50:26 crc kubenswrapper[4886]: I0314 09:50:26.122941 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kt9q9"]
Mar 14 09:50:26 crc kubenswrapper[4886]: E0314 09:50:26.123526 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93d25cee-7775-463f-b63f-3bf9e5b91e9b" containerName="oc"
Mar 14 09:50:26 crc kubenswrapper[4886]: I0314 09:50:26.123543 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="93d25cee-7775-463f-b63f-3bf9e5b91e9b" containerName="oc"
Mar 14 09:50:26 crc kubenswrapper[4886]: I0314 09:50:26.123743 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="93d25cee-7775-463f-b63f-3bf9e5b91e9b" containerName="oc"
Mar 14 09:50:26 crc kubenswrapper[4886]: I0314 09:50:26.125173 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kt9q9"
Mar 14 09:50:26 crc kubenswrapper[4886]: I0314 09:50:26.145370 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kt9q9"]
Mar 14 09:50:26 crc kubenswrapper[4886]: I0314 09:50:26.187960 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqd56\" (UniqueName: \"kubernetes.io/projected/de18515b-8fda-40d2-8e69-bebfdb54ec39-kube-api-access-xqd56\") pod \"redhat-operators-kt9q9\" (UID: \"de18515b-8fda-40d2-8e69-bebfdb54ec39\") " pod="openshift-marketplace/redhat-operators-kt9q9"
Mar 14 09:50:26 crc kubenswrapper[4886]: I0314 09:50:26.188044 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de18515b-8fda-40d2-8e69-bebfdb54ec39-utilities\") pod \"redhat-operators-kt9q9\" (UID: \"de18515b-8fda-40d2-8e69-bebfdb54ec39\") " pod="openshift-marketplace/redhat-operators-kt9q9"
Mar 14 09:50:26 crc kubenswrapper[4886]: I0314 09:50:26.188629 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de18515b-8fda-40d2-8e69-bebfdb54ec39-catalog-content\") pod \"redhat-operators-kt9q9\" (UID: \"de18515b-8fda-40d2-8e69-bebfdb54ec39\") " pod="openshift-marketplace/redhat-operators-kt9q9"
Mar 14 09:50:26 crc kubenswrapper[4886]: I0314 09:50:26.290223 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName:
\"kubernetes.io/empty-dir/de18515b-8fda-40d2-8e69-bebfdb54ec39-catalog-content\") pod \"redhat-operators-kt9q9\" (UID: \"de18515b-8fda-40d2-8e69-bebfdb54ec39\") " pod="openshift-marketplace/redhat-operators-kt9q9" Mar 14 09:50:26 crc kubenswrapper[4886]: I0314 09:50:26.290285 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqd56\" (UniqueName: \"kubernetes.io/projected/de18515b-8fda-40d2-8e69-bebfdb54ec39-kube-api-access-xqd56\") pod \"redhat-operators-kt9q9\" (UID: \"de18515b-8fda-40d2-8e69-bebfdb54ec39\") " pod="openshift-marketplace/redhat-operators-kt9q9" Mar 14 09:50:26 crc kubenswrapper[4886]: I0314 09:50:26.290364 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de18515b-8fda-40d2-8e69-bebfdb54ec39-utilities\") pod \"redhat-operators-kt9q9\" (UID: \"de18515b-8fda-40d2-8e69-bebfdb54ec39\") " pod="openshift-marketplace/redhat-operators-kt9q9" Mar 14 09:50:26 crc kubenswrapper[4886]: I0314 09:50:26.291188 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de18515b-8fda-40d2-8e69-bebfdb54ec39-catalog-content\") pod \"redhat-operators-kt9q9\" (UID: \"de18515b-8fda-40d2-8e69-bebfdb54ec39\") " pod="openshift-marketplace/redhat-operators-kt9q9" Mar 14 09:50:26 crc kubenswrapper[4886]: I0314 09:50:26.291224 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de18515b-8fda-40d2-8e69-bebfdb54ec39-utilities\") pod \"redhat-operators-kt9q9\" (UID: \"de18515b-8fda-40d2-8e69-bebfdb54ec39\") " pod="openshift-marketplace/redhat-operators-kt9q9" Mar 14 09:50:26 crc kubenswrapper[4886]: I0314 09:50:26.307436 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqd56\" (UniqueName: 
\"kubernetes.io/projected/de18515b-8fda-40d2-8e69-bebfdb54ec39-kube-api-access-xqd56\") pod \"redhat-operators-kt9q9\" (UID: \"de18515b-8fda-40d2-8e69-bebfdb54ec39\") " pod="openshift-marketplace/redhat-operators-kt9q9" Mar 14 09:50:26 crc kubenswrapper[4886]: I0314 09:50:26.469433 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kt9q9" Mar 14 09:50:26 crc kubenswrapper[4886]: I0314 09:50:26.928802 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kt9q9"] Mar 14 09:50:27 crc kubenswrapper[4886]: I0314 09:50:27.867020 4886 generic.go:334] "Generic (PLEG): container finished" podID="de18515b-8fda-40d2-8e69-bebfdb54ec39" containerID="ed0952b24dc41551baec15c4135885f37078cd2b9be8e808f54c6dc34ecdf952" exitCode=0 Mar 14 09:50:27 crc kubenswrapper[4886]: I0314 09:50:27.867143 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kt9q9" event={"ID":"de18515b-8fda-40d2-8e69-bebfdb54ec39","Type":"ContainerDied","Data":"ed0952b24dc41551baec15c4135885f37078cd2b9be8e808f54c6dc34ecdf952"} Mar 14 09:50:27 crc kubenswrapper[4886]: I0314 09:50:27.867309 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kt9q9" event={"ID":"de18515b-8fda-40d2-8e69-bebfdb54ec39","Type":"ContainerStarted","Data":"f5322f1d8a2d50e784e680984ac871f7bf6e7258fd42abc24e5a99d8b8e0e279"} Mar 14 09:50:28 crc kubenswrapper[4886]: I0314 09:50:28.884037 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kt9q9" event={"ID":"de18515b-8fda-40d2-8e69-bebfdb54ec39","Type":"ContainerStarted","Data":"209fbc3ca4d8e726d62fe1875f19ecfb40bfdfa959f6ab98fe1d56dfc8e4a4ea"} Mar 14 09:50:31 crc kubenswrapper[4886]: I0314 09:50:31.920000 4886 generic.go:334] "Generic (PLEG): container finished" podID="de18515b-8fda-40d2-8e69-bebfdb54ec39" 
containerID="209fbc3ca4d8e726d62fe1875f19ecfb40bfdfa959f6ab98fe1d56dfc8e4a4ea" exitCode=0 Mar 14 09:50:31 crc kubenswrapper[4886]: I0314 09:50:31.920044 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kt9q9" event={"ID":"de18515b-8fda-40d2-8e69-bebfdb54ec39","Type":"ContainerDied","Data":"209fbc3ca4d8e726d62fe1875f19ecfb40bfdfa959f6ab98fe1d56dfc8e4a4ea"} Mar 14 09:50:32 crc kubenswrapper[4886]: I0314 09:50:32.942678 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kt9q9" event={"ID":"de18515b-8fda-40d2-8e69-bebfdb54ec39","Type":"ContainerStarted","Data":"a12c217549a149f8bf8cda9abe96abdb61fa3fbf2eb5ad8a5c10d159c5cf93fe"} Mar 14 09:50:32 crc kubenswrapper[4886]: I0314 09:50:32.962115 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kt9q9" podStartSLOduration=2.524862728 podStartE2EDuration="6.962089321s" podCreationTimestamp="2026-03-14 09:50:26 +0000 UTC" firstStartedPulling="2026-03-14 09:50:27.870396022 +0000 UTC m=+4963.118847659" lastFinishedPulling="2026-03-14 09:50:32.307622575 +0000 UTC m=+4967.556074252" observedRunningTime="2026-03-14 09:50:32.962017508 +0000 UTC m=+4968.210469155" watchObservedRunningTime="2026-03-14 09:50:32.962089321 +0000 UTC m=+4968.210540998" Mar 14 09:50:36 crc kubenswrapper[4886]: I0314 09:50:36.469856 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kt9q9" Mar 14 09:50:36 crc kubenswrapper[4886]: I0314 09:50:36.470402 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kt9q9" Mar 14 09:50:37 crc kubenswrapper[4886]: I0314 09:50:37.535678 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kt9q9" podUID="de18515b-8fda-40d2-8e69-bebfdb54ec39" containerName="registry-server" 
probeResult="failure" output=< Mar 14 09:50:37 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s Mar 14 09:50:37 crc kubenswrapper[4886]: > Mar 14 09:50:37 crc kubenswrapper[4886]: I0314 09:50:37.870459 4886 scope.go:117] "RemoveContainer" containerID="b23ac0625d3d787e3a1eec3ab7236f8e7d96e6a1cdf4eee4275102b49b4c9bab" Mar 14 09:50:46 crc kubenswrapper[4886]: I0314 09:50:46.527813 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kt9q9" Mar 14 09:50:46 crc kubenswrapper[4886]: I0314 09:50:46.611321 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kt9q9" Mar 14 09:50:46 crc kubenswrapper[4886]: I0314 09:50:46.775079 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kt9q9"] Mar 14 09:50:48 crc kubenswrapper[4886]: I0314 09:50:48.131316 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kt9q9" podUID="de18515b-8fda-40d2-8e69-bebfdb54ec39" containerName="registry-server" containerID="cri-o://a12c217549a149f8bf8cda9abe96abdb61fa3fbf2eb5ad8a5c10d159c5cf93fe" gracePeriod=2 Mar 14 09:50:48 crc kubenswrapper[4886]: I0314 09:50:48.633562 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kt9q9" Mar 14 09:50:48 crc kubenswrapper[4886]: I0314 09:50:48.808719 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqd56\" (UniqueName: \"kubernetes.io/projected/de18515b-8fda-40d2-8e69-bebfdb54ec39-kube-api-access-xqd56\") pod \"de18515b-8fda-40d2-8e69-bebfdb54ec39\" (UID: \"de18515b-8fda-40d2-8e69-bebfdb54ec39\") " Mar 14 09:50:48 crc kubenswrapper[4886]: I0314 09:50:48.808814 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de18515b-8fda-40d2-8e69-bebfdb54ec39-catalog-content\") pod \"de18515b-8fda-40d2-8e69-bebfdb54ec39\" (UID: \"de18515b-8fda-40d2-8e69-bebfdb54ec39\") " Mar 14 09:50:48 crc kubenswrapper[4886]: I0314 09:50:48.808932 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de18515b-8fda-40d2-8e69-bebfdb54ec39-utilities\") pod \"de18515b-8fda-40d2-8e69-bebfdb54ec39\" (UID: \"de18515b-8fda-40d2-8e69-bebfdb54ec39\") " Mar 14 09:50:48 crc kubenswrapper[4886]: I0314 09:50:48.810029 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de18515b-8fda-40d2-8e69-bebfdb54ec39-utilities" (OuterVolumeSpecName: "utilities") pod "de18515b-8fda-40d2-8e69-bebfdb54ec39" (UID: "de18515b-8fda-40d2-8e69-bebfdb54ec39"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:50:48 crc kubenswrapper[4886]: I0314 09:50:48.820076 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de18515b-8fda-40d2-8e69-bebfdb54ec39-kube-api-access-xqd56" (OuterVolumeSpecName: "kube-api-access-xqd56") pod "de18515b-8fda-40d2-8e69-bebfdb54ec39" (UID: "de18515b-8fda-40d2-8e69-bebfdb54ec39"). InnerVolumeSpecName "kube-api-access-xqd56". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:50:48 crc kubenswrapper[4886]: I0314 09:50:48.911447 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de18515b-8fda-40d2-8e69-bebfdb54ec39-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:48 crc kubenswrapper[4886]: I0314 09:50:48.911490 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqd56\" (UniqueName: \"kubernetes.io/projected/de18515b-8fda-40d2-8e69-bebfdb54ec39-kube-api-access-xqd56\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:48 crc kubenswrapper[4886]: I0314 09:50:48.935736 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de18515b-8fda-40d2-8e69-bebfdb54ec39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de18515b-8fda-40d2-8e69-bebfdb54ec39" (UID: "de18515b-8fda-40d2-8e69-bebfdb54ec39"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:50:49 crc kubenswrapper[4886]: I0314 09:50:49.012705 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de18515b-8fda-40d2-8e69-bebfdb54ec39-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:49 crc kubenswrapper[4886]: I0314 09:50:49.141597 4886 generic.go:334] "Generic (PLEG): container finished" podID="de18515b-8fda-40d2-8e69-bebfdb54ec39" containerID="a12c217549a149f8bf8cda9abe96abdb61fa3fbf2eb5ad8a5c10d159c5cf93fe" exitCode=0 Mar 14 09:50:49 crc kubenswrapper[4886]: I0314 09:50:49.141642 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kt9q9" event={"ID":"de18515b-8fda-40d2-8e69-bebfdb54ec39","Type":"ContainerDied","Data":"a12c217549a149f8bf8cda9abe96abdb61fa3fbf2eb5ad8a5c10d159c5cf93fe"} Mar 14 09:50:49 crc kubenswrapper[4886]: I0314 09:50:49.141687 4886 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-kt9q9" event={"ID":"de18515b-8fda-40d2-8e69-bebfdb54ec39","Type":"ContainerDied","Data":"f5322f1d8a2d50e784e680984ac871f7bf6e7258fd42abc24e5a99d8b8e0e279"} Mar 14 09:50:49 crc kubenswrapper[4886]: I0314 09:50:49.141687 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kt9q9" Mar 14 09:50:49 crc kubenswrapper[4886]: I0314 09:50:49.141729 4886 scope.go:117] "RemoveContainer" containerID="a12c217549a149f8bf8cda9abe96abdb61fa3fbf2eb5ad8a5c10d159c5cf93fe" Mar 14 09:50:49 crc kubenswrapper[4886]: I0314 09:50:49.178658 4886 scope.go:117] "RemoveContainer" containerID="209fbc3ca4d8e726d62fe1875f19ecfb40bfdfa959f6ab98fe1d56dfc8e4a4ea" Mar 14 09:50:49 crc kubenswrapper[4886]: I0314 09:50:49.178818 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kt9q9"] Mar 14 09:50:49 crc kubenswrapper[4886]: I0314 09:50:49.188992 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kt9q9"] Mar 14 09:50:49 crc kubenswrapper[4886]: I0314 09:50:49.435372 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de18515b-8fda-40d2-8e69-bebfdb54ec39" path="/var/lib/kubelet/pods/de18515b-8fda-40d2-8e69-bebfdb54ec39/volumes" Mar 14 09:50:49 crc kubenswrapper[4886]: I0314 09:50:49.505686 4886 scope.go:117] "RemoveContainer" containerID="ed0952b24dc41551baec15c4135885f37078cd2b9be8e808f54c6dc34ecdf952" Mar 14 09:50:49 crc kubenswrapper[4886]: I0314 09:50:49.541433 4886 scope.go:117] "RemoveContainer" containerID="a12c217549a149f8bf8cda9abe96abdb61fa3fbf2eb5ad8a5c10d159c5cf93fe" Mar 14 09:50:49 crc kubenswrapper[4886]: E0314 09:50:49.541945 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a12c217549a149f8bf8cda9abe96abdb61fa3fbf2eb5ad8a5c10d159c5cf93fe\": container with ID starting with 
a12c217549a149f8bf8cda9abe96abdb61fa3fbf2eb5ad8a5c10d159c5cf93fe not found: ID does not exist" containerID="a12c217549a149f8bf8cda9abe96abdb61fa3fbf2eb5ad8a5c10d159c5cf93fe" Mar 14 09:50:49 crc kubenswrapper[4886]: I0314 09:50:49.542004 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a12c217549a149f8bf8cda9abe96abdb61fa3fbf2eb5ad8a5c10d159c5cf93fe"} err="failed to get container status \"a12c217549a149f8bf8cda9abe96abdb61fa3fbf2eb5ad8a5c10d159c5cf93fe\": rpc error: code = NotFound desc = could not find container \"a12c217549a149f8bf8cda9abe96abdb61fa3fbf2eb5ad8a5c10d159c5cf93fe\": container with ID starting with a12c217549a149f8bf8cda9abe96abdb61fa3fbf2eb5ad8a5c10d159c5cf93fe not found: ID does not exist" Mar 14 09:50:49 crc kubenswrapper[4886]: I0314 09:50:49.542033 4886 scope.go:117] "RemoveContainer" containerID="209fbc3ca4d8e726d62fe1875f19ecfb40bfdfa959f6ab98fe1d56dfc8e4a4ea" Mar 14 09:50:49 crc kubenswrapper[4886]: E0314 09:50:49.542523 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"209fbc3ca4d8e726d62fe1875f19ecfb40bfdfa959f6ab98fe1d56dfc8e4a4ea\": container with ID starting with 209fbc3ca4d8e726d62fe1875f19ecfb40bfdfa959f6ab98fe1d56dfc8e4a4ea not found: ID does not exist" containerID="209fbc3ca4d8e726d62fe1875f19ecfb40bfdfa959f6ab98fe1d56dfc8e4a4ea" Mar 14 09:50:49 crc kubenswrapper[4886]: I0314 09:50:49.542567 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"209fbc3ca4d8e726d62fe1875f19ecfb40bfdfa959f6ab98fe1d56dfc8e4a4ea"} err="failed to get container status \"209fbc3ca4d8e726d62fe1875f19ecfb40bfdfa959f6ab98fe1d56dfc8e4a4ea\": rpc error: code = NotFound desc = could not find container \"209fbc3ca4d8e726d62fe1875f19ecfb40bfdfa959f6ab98fe1d56dfc8e4a4ea\": container with ID starting with 209fbc3ca4d8e726d62fe1875f19ecfb40bfdfa959f6ab98fe1d56dfc8e4a4ea not found: ID does not 
exist" Mar 14 09:50:49 crc kubenswrapper[4886]: I0314 09:50:49.542583 4886 scope.go:117] "RemoveContainer" containerID="ed0952b24dc41551baec15c4135885f37078cd2b9be8e808f54c6dc34ecdf952" Mar 14 09:50:49 crc kubenswrapper[4886]: E0314 09:50:49.542894 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed0952b24dc41551baec15c4135885f37078cd2b9be8e808f54c6dc34ecdf952\": container with ID starting with ed0952b24dc41551baec15c4135885f37078cd2b9be8e808f54c6dc34ecdf952 not found: ID does not exist" containerID="ed0952b24dc41551baec15c4135885f37078cd2b9be8e808f54c6dc34ecdf952" Mar 14 09:50:49 crc kubenswrapper[4886]: I0314 09:50:49.542926 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed0952b24dc41551baec15c4135885f37078cd2b9be8e808f54c6dc34ecdf952"} err="failed to get container status \"ed0952b24dc41551baec15c4135885f37078cd2b9be8e808f54c6dc34ecdf952\": rpc error: code = NotFound desc = could not find container \"ed0952b24dc41551baec15c4135885f37078cd2b9be8e808f54c6dc34ecdf952\": container with ID starting with ed0952b24dc41551baec15c4135885f37078cd2b9be8e808f54c6dc34ecdf952 not found: ID does not exist" Mar 14 09:50:56 crc kubenswrapper[4886]: I0314 09:50:56.066835 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:50:56 crc kubenswrapper[4886]: I0314 09:50:56.067398 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 
09:50:56 crc kubenswrapper[4886]: I0314 09:50:56.224390 4886 generic.go:334] "Generic (PLEG): container finished" podID="c4cedac0-b804-4ea3-b548-f2871b24d70a" containerID="7bee4e63bd472f1fe2446c03706d643787848d50a8b39038a6e614a5cf0639a6" exitCode=1 Mar 14 09:50:56 crc kubenswrapper[4886]: I0314 09:50:56.224768 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"c4cedac0-b804-4ea3-b548-f2871b24d70a","Type":"ContainerDied","Data":"7bee4e63bd472f1fe2446c03706d643787848d50a8b39038a6e614a5cf0639a6"} Mar 14 09:50:57 crc kubenswrapper[4886]: I0314 09:50:57.694581 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 14 09:50:57 crc kubenswrapper[4886]: I0314 09:50:57.885886 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"c4cedac0-b804-4ea3-b548-f2871b24d70a\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " Mar 14 09:50:57 crc kubenswrapper[4886]: I0314 09:50:57.886262 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4cedac0-b804-4ea3-b548-f2871b24d70a-ssh-key\") pod \"c4cedac0-b804-4ea3-b548-f2871b24d70a\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " Mar 14 09:50:57 crc kubenswrapper[4886]: I0314 09:50:57.886378 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c4cedac0-b804-4ea3-b548-f2871b24d70a-ca-certs\") pod \"c4cedac0-b804-4ea3-b548-f2871b24d70a\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " Mar 14 09:50:57 crc kubenswrapper[4886]: I0314 09:50:57.886512 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/c4cedac0-b804-4ea3-b548-f2871b24d70a-openstack-config-secret\") pod \"c4cedac0-b804-4ea3-b548-f2871b24d70a\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " Mar 14 09:50:57 crc kubenswrapper[4886]: I0314 09:50:57.886630 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c4cedac0-b804-4ea3-b548-f2871b24d70a-test-operator-ephemeral-temporary\") pod \"c4cedac0-b804-4ea3-b548-f2871b24d70a\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " Mar 14 09:50:57 crc kubenswrapper[4886]: I0314 09:50:57.886745 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wctpv\" (UniqueName: \"kubernetes.io/projected/c4cedac0-b804-4ea3-b548-f2871b24d70a-kube-api-access-wctpv\") pod \"c4cedac0-b804-4ea3-b548-f2871b24d70a\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " Mar 14 09:50:57 crc kubenswrapper[4886]: I0314 09:50:57.886883 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c4cedac0-b804-4ea3-b548-f2871b24d70a-test-operator-ephemeral-workdir\") pod \"c4cedac0-b804-4ea3-b548-f2871b24d70a\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " Mar 14 09:50:57 crc kubenswrapper[4886]: I0314 09:50:57.886986 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c4cedac0-b804-4ea3-b548-f2871b24d70a-config-data\") pod \"c4cedac0-b804-4ea3-b548-f2871b24d70a\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " Mar 14 09:50:57 crc kubenswrapper[4886]: I0314 09:50:57.887064 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4cedac0-b804-4ea3-b548-f2871b24d70a-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod 
"c4cedac0-b804-4ea3-b548-f2871b24d70a" (UID: "c4cedac0-b804-4ea3-b548-f2871b24d70a"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:50:57 crc kubenswrapper[4886]: I0314 09:50:57.887249 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c4cedac0-b804-4ea3-b548-f2871b24d70a-openstack-config\") pod \"c4cedac0-b804-4ea3-b548-f2871b24d70a\" (UID: \"c4cedac0-b804-4ea3-b548-f2871b24d70a\") " Mar 14 09:50:57 crc kubenswrapper[4886]: I0314 09:50:57.887905 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4cedac0-b804-4ea3-b548-f2871b24d70a-config-data" (OuterVolumeSpecName: "config-data") pod "c4cedac0-b804-4ea3-b548-f2871b24d70a" (UID: "c4cedac0-b804-4ea3-b548-f2871b24d70a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:50:57 crc kubenswrapper[4886]: I0314 09:50:57.888383 4886 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c4cedac0-b804-4ea3-b548-f2871b24d70a-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:57 crc kubenswrapper[4886]: I0314 09:50:57.891811 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4cedac0-b804-4ea3-b548-f2871b24d70a-kube-api-access-wctpv" (OuterVolumeSpecName: "kube-api-access-wctpv") pod "c4cedac0-b804-4ea3-b548-f2871b24d70a" (UID: "c4cedac0-b804-4ea3-b548-f2871b24d70a"). InnerVolumeSpecName "kube-api-access-wctpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:50:57 crc kubenswrapper[4886]: I0314 09:50:57.904054 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "test-operator-logs") pod "c4cedac0-b804-4ea3-b548-f2871b24d70a" (UID: "c4cedac0-b804-4ea3-b548-f2871b24d70a"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 14 09:50:57 crc kubenswrapper[4886]: I0314 09:50:57.921539 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4cedac0-b804-4ea3-b548-f2871b24d70a-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "c4cedac0-b804-4ea3-b548-f2871b24d70a" (UID: "c4cedac0-b804-4ea3-b548-f2871b24d70a"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:50:57 crc kubenswrapper[4886]: I0314 09:50:57.921562 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4cedac0-b804-4ea3-b548-f2871b24d70a-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c4cedac0-b804-4ea3-b548-f2871b24d70a" (UID: "c4cedac0-b804-4ea3-b548-f2871b24d70a"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:50:57 crc kubenswrapper[4886]: I0314 09:50:57.925730 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4cedac0-b804-4ea3-b548-f2871b24d70a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c4cedac0-b804-4ea3-b548-f2871b24d70a" (UID: "c4cedac0-b804-4ea3-b548-f2871b24d70a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 09:50:57 crc kubenswrapper[4886]: I0314 09:50:57.939002 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4cedac0-b804-4ea3-b548-f2871b24d70a-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c4cedac0-b804-4ea3-b548-f2871b24d70a" (UID: "c4cedac0-b804-4ea3-b548-f2871b24d70a"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 09:50:57 crc kubenswrapper[4886]: I0314 09:50:57.974253 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4cedac0-b804-4ea3-b548-f2871b24d70a-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "c4cedac0-b804-4ea3-b548-f2871b24d70a" (UID: "c4cedac0-b804-4ea3-b548-f2871b24d70a"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:50:57 crc kubenswrapper[4886]: I0314 09:50:57.989932 4886 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c4cedac0-b804-4ea3-b548-f2871b24d70a-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:57 crc kubenswrapper[4886]: I0314 09:50:57.989976 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c4cedac0-b804-4ea3-b548-f2871b24d70a-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:57 crc kubenswrapper[4886]: I0314 09:50:57.989991 4886 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c4cedac0-b804-4ea3-b548-f2871b24d70a-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:57 crc kubenswrapper[4886]: I0314 09:50:57.990032 4886 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 14 09:50:57 crc kubenswrapper[4886]: I0314 09:50:57.990045 4886 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4cedac0-b804-4ea3-b548-f2871b24d70a-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:57 crc kubenswrapper[4886]: I0314 09:50:57.990057 4886 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c4cedac0-b804-4ea3-b548-f2871b24d70a-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:57 crc kubenswrapper[4886]: I0314 09:50:57.990067 4886 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c4cedac0-b804-4ea3-b548-f2871b24d70a-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:57 crc kubenswrapper[4886]: I0314 09:50:57.990077 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wctpv\" (UniqueName: \"kubernetes.io/projected/c4cedac0-b804-4ea3-b548-f2871b24d70a-kube-api-access-wctpv\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:58 crc kubenswrapper[4886]: I0314 09:50:58.020252 4886 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 14 09:50:58 crc kubenswrapper[4886]: I0314 09:50:58.092372 4886 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 14 09:50:58 crc kubenswrapper[4886]: I0314 09:50:58.260983 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"c4cedac0-b804-4ea3-b548-f2871b24d70a","Type":"ContainerDied","Data":"c650eabf9ea79bf5035b4105778bd01aa0ce219c69691ec3196ca4f7cb09f4c5"} Mar 14 09:50:58 crc kubenswrapper[4886]: I0314 09:50:58.261095 4886 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c650eabf9ea79bf5035b4105778bd01aa0ce219c69691ec3196ca4f7cb09f4c5" Mar 14 09:50:58 crc kubenswrapper[4886]: I0314 09:50:58.261255 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 14 09:51:03 crc kubenswrapper[4886]: I0314 09:51:03.070228 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 14 09:51:03 crc kubenswrapper[4886]: E0314 09:51:03.071810 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4cedac0-b804-4ea3-b548-f2871b24d70a" containerName="tempest-tests-tempest-tests-runner" Mar 14 09:51:03 crc kubenswrapper[4886]: I0314 09:51:03.071831 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4cedac0-b804-4ea3-b548-f2871b24d70a" containerName="tempest-tests-tempest-tests-runner" Mar 14 09:51:03 crc kubenswrapper[4886]: E0314 09:51:03.071888 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de18515b-8fda-40d2-8e69-bebfdb54ec39" containerName="registry-server" Mar 14 09:51:03 crc kubenswrapper[4886]: I0314 09:51:03.071898 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="de18515b-8fda-40d2-8e69-bebfdb54ec39" containerName="registry-server" Mar 14 09:51:03 crc kubenswrapper[4886]: E0314 09:51:03.071925 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de18515b-8fda-40d2-8e69-bebfdb54ec39" containerName="extract-utilities" Mar 14 09:51:03 crc kubenswrapper[4886]: I0314 09:51:03.071931 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="de18515b-8fda-40d2-8e69-bebfdb54ec39" containerName="extract-utilities" Mar 14 09:51:03 crc kubenswrapper[4886]: E0314 09:51:03.071944 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de18515b-8fda-40d2-8e69-bebfdb54ec39" containerName="extract-content" Mar 14 09:51:03 crc kubenswrapper[4886]: 
I0314 09:51:03.071951 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="de18515b-8fda-40d2-8e69-bebfdb54ec39" containerName="extract-content" Mar 14 09:51:03 crc kubenswrapper[4886]: I0314 09:51:03.072184 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="de18515b-8fda-40d2-8e69-bebfdb54ec39" containerName="registry-server" Mar 14 09:51:03 crc kubenswrapper[4886]: I0314 09:51:03.072207 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4cedac0-b804-4ea3-b548-f2871b24d70a" containerName="tempest-tests-tempest-tests-runner" Mar 14 09:51:03 crc kubenswrapper[4886]: I0314 09:51:03.072882 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 14 09:51:03 crc kubenswrapper[4886]: I0314 09:51:03.083822 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-482rf" Mar 14 09:51:03 crc kubenswrapper[4886]: I0314 09:51:03.087785 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 14 09:51:03 crc kubenswrapper[4886]: I0314 09:51:03.196070 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxtws\" (UniqueName: \"kubernetes.io/projected/0395a385-992c-4531-b626-3f7ad78db060-kube-api-access-kxtws\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0395a385-992c-4531-b626-3f7ad78db060\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 14 09:51:03 crc kubenswrapper[4886]: I0314 09:51:03.196387 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0395a385-992c-4531-b626-3f7ad78db060\") " 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 14 09:51:03 crc kubenswrapper[4886]: I0314 09:51:03.298665 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0395a385-992c-4531-b626-3f7ad78db060\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 14 09:51:03 crc kubenswrapper[4886]: I0314 09:51:03.298871 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxtws\" (UniqueName: \"kubernetes.io/projected/0395a385-992c-4531-b626-3f7ad78db060-kube-api-access-kxtws\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0395a385-992c-4531-b626-3f7ad78db060\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 14 09:51:03 crc kubenswrapper[4886]: I0314 09:51:03.299095 4886 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0395a385-992c-4531-b626-3f7ad78db060\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 14 09:51:03 crc kubenswrapper[4886]: I0314 09:51:03.332800 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxtws\" (UniqueName: \"kubernetes.io/projected/0395a385-992c-4531-b626-3f7ad78db060-kube-api-access-kxtws\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0395a385-992c-4531-b626-3f7ad78db060\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 14 09:51:03 crc kubenswrapper[4886]: I0314 09:51:03.348544 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0395a385-992c-4531-b626-3f7ad78db060\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 14 09:51:03 crc kubenswrapper[4886]: I0314 09:51:03.403959 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 14 09:51:03 crc kubenswrapper[4886]: I0314 09:51:03.941995 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 14 09:51:03 crc kubenswrapper[4886]: I0314 09:51:03.949705 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 09:51:04 crc kubenswrapper[4886]: I0314 09:51:04.318551 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"0395a385-992c-4531-b626-3f7ad78db060","Type":"ContainerStarted","Data":"0ea7dc260f27e63a76459d6e4c77a918261faf723098ccd686a7e220addfa3b7"} Mar 14 09:51:05 crc kubenswrapper[4886]: I0314 09:51:05.343346 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"0395a385-992c-4531-b626-3f7ad78db060","Type":"ContainerStarted","Data":"8a837e05834f0ed5def349398539c946e86bec9550564b560950d99b2bfcd19e"} Mar 14 09:51:05 crc kubenswrapper[4886]: I0314 09:51:05.363766 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.461380022 podStartE2EDuration="2.363749742s" podCreationTimestamp="2026-03-14 09:51:03 +0000 UTC" firstStartedPulling="2026-03-14 09:51:03.949525216 +0000 UTC m=+4999.197976853" lastFinishedPulling="2026-03-14 09:51:04.851894936 +0000 UTC m=+5000.100346573" observedRunningTime="2026-03-14 
09:51:05.358901185 +0000 UTC m=+5000.607352822" watchObservedRunningTime="2026-03-14 09:51:05.363749742 +0000 UTC m=+5000.612201379" Mar 14 09:51:26 crc kubenswrapper[4886]: I0314 09:51:26.065898 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:51:26 crc kubenswrapper[4886]: I0314 09:51:26.066507 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:51:26 crc kubenswrapper[4886]: I0314 09:51:26.066551 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 09:51:26 crc kubenswrapper[4886]: I0314 09:51:26.067280 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f"} pod="openshift-machine-config-operator/machine-config-daemon-ddctv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 14 09:51:26 crc kubenswrapper[4886]: I0314 09:51:26.067334 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" containerID="cri-o://a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f" gracePeriod=600 Mar 14 09:51:26 crc kubenswrapper[4886]: E0314 09:51:26.206001 4886 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:51:26 crc kubenswrapper[4886]: I0314 09:51:26.570808 4886 generic.go:334] "Generic (PLEG): container finished" podID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerID="a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f" exitCode=0 Mar 14 09:51:26 crc kubenswrapper[4886]: I0314 09:51:26.570874 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerDied","Data":"a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f"} Mar 14 09:51:26 crc kubenswrapper[4886]: I0314 09:51:26.571308 4886 scope.go:117] "RemoveContainer" containerID="7d0163d62d22d8130619611a4bc49f9ddca1c38d62e0385537e914e59680ef97" Mar 14 09:51:26 crc kubenswrapper[4886]: I0314 09:51:26.571967 4886 scope.go:117] "RemoveContainer" containerID="a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f" Mar 14 09:51:26 crc kubenswrapper[4886]: E0314 09:51:26.572281 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:51:33 crc kubenswrapper[4886]: I0314 09:51:33.644321 4886 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-tffgj/must-gather-znzwx"] Mar 14 09:51:33 crc kubenswrapper[4886]: I0314 09:51:33.647135 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tffgj/must-gather-znzwx" Mar 14 09:51:33 crc kubenswrapper[4886]: I0314 09:51:33.649026 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-tffgj"/"kube-root-ca.crt" Mar 14 09:51:33 crc kubenswrapper[4886]: I0314 09:51:33.649134 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-tffgj"/"default-dockercfg-cfmtv" Mar 14 09:51:33 crc kubenswrapper[4886]: I0314 09:51:33.651179 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-tffgj"/"openshift-service-ca.crt" Mar 14 09:51:33 crc kubenswrapper[4886]: I0314 09:51:33.664252 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tffgj/must-gather-znzwx"] Mar 14 09:51:33 crc kubenswrapper[4886]: I0314 09:51:33.752256 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5c8h\" (UniqueName: \"kubernetes.io/projected/6265b1bc-308a-49bf-9077-8d14626dc31a-kube-api-access-d5c8h\") pod \"must-gather-znzwx\" (UID: \"6265b1bc-308a-49bf-9077-8d14626dc31a\") " pod="openshift-must-gather-tffgj/must-gather-znzwx" Mar 14 09:51:33 crc kubenswrapper[4886]: I0314 09:51:33.752781 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6265b1bc-308a-49bf-9077-8d14626dc31a-must-gather-output\") pod \"must-gather-znzwx\" (UID: \"6265b1bc-308a-49bf-9077-8d14626dc31a\") " pod="openshift-must-gather-tffgj/must-gather-znzwx" Mar 14 09:51:33 crc kubenswrapper[4886]: I0314 09:51:33.855746 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/6265b1bc-308a-49bf-9077-8d14626dc31a-must-gather-output\") pod \"must-gather-znzwx\" (UID: \"6265b1bc-308a-49bf-9077-8d14626dc31a\") " pod="openshift-must-gather-tffgj/must-gather-znzwx" Mar 14 09:51:33 crc kubenswrapper[4886]: I0314 09:51:33.856049 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5c8h\" (UniqueName: \"kubernetes.io/projected/6265b1bc-308a-49bf-9077-8d14626dc31a-kube-api-access-d5c8h\") pod \"must-gather-znzwx\" (UID: \"6265b1bc-308a-49bf-9077-8d14626dc31a\") " pod="openshift-must-gather-tffgj/must-gather-znzwx" Mar 14 09:51:33 crc kubenswrapper[4886]: I0314 09:51:33.856257 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6265b1bc-308a-49bf-9077-8d14626dc31a-must-gather-output\") pod \"must-gather-znzwx\" (UID: \"6265b1bc-308a-49bf-9077-8d14626dc31a\") " pod="openshift-must-gather-tffgj/must-gather-znzwx" Mar 14 09:51:33 crc kubenswrapper[4886]: I0314 09:51:33.883419 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5c8h\" (UniqueName: \"kubernetes.io/projected/6265b1bc-308a-49bf-9077-8d14626dc31a-kube-api-access-d5c8h\") pod \"must-gather-znzwx\" (UID: \"6265b1bc-308a-49bf-9077-8d14626dc31a\") " pod="openshift-must-gather-tffgj/must-gather-znzwx" Mar 14 09:51:33 crc kubenswrapper[4886]: I0314 09:51:33.965155 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tffgj/must-gather-znzwx" Mar 14 09:51:34 crc kubenswrapper[4886]: I0314 09:51:34.486844 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tffgj/must-gather-znzwx"] Mar 14 09:51:35 crc kubenswrapper[4886]: I0314 09:51:35.691316 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tffgj/must-gather-znzwx" event={"ID":"6265b1bc-308a-49bf-9077-8d14626dc31a","Type":"ContainerStarted","Data":"eedd9d1b820d2989e4b45d0f444f3f8e97888456cacb3290bb10fa5052b27d74"} Mar 14 09:51:38 crc kubenswrapper[4886]: I0314 09:51:38.425096 4886 scope.go:117] "RemoveContainer" containerID="a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f" Mar 14 09:51:38 crc kubenswrapper[4886]: E0314 09:51:38.425583 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:51:41 crc kubenswrapper[4886]: I0314 09:51:41.761969 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tffgj/must-gather-znzwx" event={"ID":"6265b1bc-308a-49bf-9077-8d14626dc31a","Type":"ContainerStarted","Data":"5d08614f38f90ffef832cb2af337dc92add88e064291d5c574d3699fa4351c47"} Mar 14 09:51:41 crc kubenswrapper[4886]: I0314 09:51:41.762529 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tffgj/must-gather-znzwx" event={"ID":"6265b1bc-308a-49bf-9077-8d14626dc31a","Type":"ContainerStarted","Data":"dd2ceb5a1285b2c3176d96a3ab1ab3a68369c87a8d3b736449d70140e379e0d2"} Mar 14 09:51:41 crc kubenswrapper[4886]: I0314 09:51:41.779019 4886 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-tffgj/must-gather-znzwx" podStartSLOduration=2.711645703 podStartE2EDuration="8.779001735s" podCreationTimestamp="2026-03-14 09:51:33 +0000 UTC" firstStartedPulling="2026-03-14 09:51:34.888661209 +0000 UTC m=+5030.137112886" lastFinishedPulling="2026-03-14 09:51:40.956017281 +0000 UTC m=+5036.204468918" observedRunningTime="2026-03-14 09:51:41.775509506 +0000 UTC m=+5037.023961153" watchObservedRunningTime="2026-03-14 09:51:41.779001735 +0000 UTC m=+5037.027453372" Mar 14 09:51:45 crc kubenswrapper[4886]: I0314 09:51:45.054609 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tffgj/crc-debug-lrzng"] Mar 14 09:51:45 crc kubenswrapper[4886]: I0314 09:51:45.057019 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tffgj/crc-debug-lrzng" Mar 14 09:51:45 crc kubenswrapper[4886]: I0314 09:51:45.197331 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j599n\" (UniqueName: \"kubernetes.io/projected/676ef65a-33c2-4cdf-b13c-65e723828011-kube-api-access-j599n\") pod \"crc-debug-lrzng\" (UID: \"676ef65a-33c2-4cdf-b13c-65e723828011\") " pod="openshift-must-gather-tffgj/crc-debug-lrzng" Mar 14 09:51:45 crc kubenswrapper[4886]: I0314 09:51:45.197526 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/676ef65a-33c2-4cdf-b13c-65e723828011-host\") pod \"crc-debug-lrzng\" (UID: \"676ef65a-33c2-4cdf-b13c-65e723828011\") " pod="openshift-must-gather-tffgj/crc-debug-lrzng" Mar 14 09:51:45 crc kubenswrapper[4886]: I0314 09:51:45.299159 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/676ef65a-33c2-4cdf-b13c-65e723828011-host\") pod \"crc-debug-lrzng\" (UID: \"676ef65a-33c2-4cdf-b13c-65e723828011\") " 
pod="openshift-must-gather-tffgj/crc-debug-lrzng" Mar 14 09:51:45 crc kubenswrapper[4886]: I0314 09:51:45.299337 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j599n\" (UniqueName: \"kubernetes.io/projected/676ef65a-33c2-4cdf-b13c-65e723828011-kube-api-access-j599n\") pod \"crc-debug-lrzng\" (UID: \"676ef65a-33c2-4cdf-b13c-65e723828011\") " pod="openshift-must-gather-tffgj/crc-debug-lrzng" Mar 14 09:51:45 crc kubenswrapper[4886]: I0314 09:51:45.299454 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/676ef65a-33c2-4cdf-b13c-65e723828011-host\") pod \"crc-debug-lrzng\" (UID: \"676ef65a-33c2-4cdf-b13c-65e723828011\") " pod="openshift-must-gather-tffgj/crc-debug-lrzng" Mar 14 09:51:45 crc kubenswrapper[4886]: I0314 09:51:45.780637 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j599n\" (UniqueName: \"kubernetes.io/projected/676ef65a-33c2-4cdf-b13c-65e723828011-kube-api-access-j599n\") pod \"crc-debug-lrzng\" (UID: \"676ef65a-33c2-4cdf-b13c-65e723828011\") " pod="openshift-must-gather-tffgj/crc-debug-lrzng" Mar 14 09:51:45 crc kubenswrapper[4886]: I0314 09:51:45.976259 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tffgj/crc-debug-lrzng" Mar 14 09:51:46 crc kubenswrapper[4886]: W0314 09:51:46.056826 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod676ef65a_33c2_4cdf_b13c_65e723828011.slice/crio-6a93f9f0aaede32b80a3932a619c0fffb1d4647761ab64218df95ab1606c2de2 WatchSource:0}: Error finding container 6a93f9f0aaede32b80a3932a619c0fffb1d4647761ab64218df95ab1606c2de2: Status 404 returned error can't find the container with id 6a93f9f0aaede32b80a3932a619c0fffb1d4647761ab64218df95ab1606c2de2 Mar 14 09:51:46 crc kubenswrapper[4886]: I0314 09:51:46.806755 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tffgj/crc-debug-lrzng" event={"ID":"676ef65a-33c2-4cdf-b13c-65e723828011","Type":"ContainerStarted","Data":"6a93f9f0aaede32b80a3932a619c0fffb1d4647761ab64218df95ab1606c2de2"} Mar 14 09:51:52 crc kubenswrapper[4886]: I0314 09:51:52.420746 4886 scope.go:117] "RemoveContainer" containerID="a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f" Mar 14 09:51:52 crc kubenswrapper[4886]: E0314 09:51:52.422709 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:51:58 crc kubenswrapper[4886]: I0314 09:51:58.926271 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tffgj/crc-debug-lrzng" event={"ID":"676ef65a-33c2-4cdf-b13c-65e723828011","Type":"ContainerStarted","Data":"783fd0a1eeac25cf22018a8b750437d3a9371e7233b50896f12ea81e1baf1acd"} Mar 14 09:51:58 crc kubenswrapper[4886]: I0314 09:51:58.956385 4886 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tffgj/crc-debug-lrzng" podStartSLOduration=2.010985171 podStartE2EDuration="13.9563603s" podCreationTimestamp="2026-03-14 09:51:45 +0000 UTC" firstStartedPulling="2026-03-14 09:51:46.060661265 +0000 UTC m=+5041.309112902" lastFinishedPulling="2026-03-14 09:51:58.006036394 +0000 UTC m=+5053.254488031" observedRunningTime="2026-03-14 09:51:58.948044575 +0000 UTC m=+5054.196496212" watchObservedRunningTime="2026-03-14 09:51:58.9563603 +0000 UTC m=+5054.204811938" Mar 14 09:52:00 crc kubenswrapper[4886]: I0314 09:52:00.151609 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558032-mjzmw"] Mar 14 09:52:00 crc kubenswrapper[4886]: I0314 09:52:00.153393 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558032-mjzmw" Mar 14 09:52:00 crc kubenswrapper[4886]: I0314 09:52:00.155438 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:52:00 crc kubenswrapper[4886]: I0314 09:52:00.155503 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:52:00 crc kubenswrapper[4886]: I0314 09:52:00.156391 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 09:52:00 crc kubenswrapper[4886]: I0314 09:52:00.168949 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558032-mjzmw"] Mar 14 09:52:00 crc kubenswrapper[4886]: I0314 09:52:00.256102 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsx6s\" (UniqueName: \"kubernetes.io/projected/6a71a4e8-e737-4e94-aef8-477b1570338a-kube-api-access-vsx6s\") pod \"auto-csr-approver-29558032-mjzmw\" (UID: 
\"6a71a4e8-e737-4e94-aef8-477b1570338a\") " pod="openshift-infra/auto-csr-approver-29558032-mjzmw" Mar 14 09:52:00 crc kubenswrapper[4886]: I0314 09:52:00.358360 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsx6s\" (UniqueName: \"kubernetes.io/projected/6a71a4e8-e737-4e94-aef8-477b1570338a-kube-api-access-vsx6s\") pod \"auto-csr-approver-29558032-mjzmw\" (UID: \"6a71a4e8-e737-4e94-aef8-477b1570338a\") " pod="openshift-infra/auto-csr-approver-29558032-mjzmw" Mar 14 09:52:00 crc kubenswrapper[4886]: I0314 09:52:00.376591 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsx6s\" (UniqueName: \"kubernetes.io/projected/6a71a4e8-e737-4e94-aef8-477b1570338a-kube-api-access-vsx6s\") pod \"auto-csr-approver-29558032-mjzmw\" (UID: \"6a71a4e8-e737-4e94-aef8-477b1570338a\") " pod="openshift-infra/auto-csr-approver-29558032-mjzmw" Mar 14 09:52:00 crc kubenswrapper[4886]: I0314 09:52:00.471368 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558032-mjzmw" Mar 14 09:52:01 crc kubenswrapper[4886]: I0314 09:52:01.605890 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558032-mjzmw"] Mar 14 09:52:01 crc kubenswrapper[4886]: I0314 09:52:01.949641 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558032-mjzmw" event={"ID":"6a71a4e8-e737-4e94-aef8-477b1570338a","Type":"ContainerStarted","Data":"80f86af40fc46bd009a371d9633a1b768074096c2a61efa1b43142cb27f30290"} Mar 14 09:52:02 crc kubenswrapper[4886]: I0314 09:52:02.959434 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558032-mjzmw" event={"ID":"6a71a4e8-e737-4e94-aef8-477b1570338a","Type":"ContainerStarted","Data":"1eb06df1507e9613b384ddf0af995416914ce4ba87044c1bb15066f8eebb46ce"} Mar 14 09:52:02 crc kubenswrapper[4886]: I0314 09:52:02.983619 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29558032-mjzmw" podStartSLOduration=2.175228823 podStartE2EDuration="2.983598405s" podCreationTimestamp="2026-03-14 09:52:00 +0000 UTC" firstStartedPulling="2026-03-14 09:52:01.603234117 +0000 UTC m=+5056.851685754" lastFinishedPulling="2026-03-14 09:52:02.411603699 +0000 UTC m=+5057.660055336" observedRunningTime="2026-03-14 09:52:02.973784058 +0000 UTC m=+5058.222235695" watchObservedRunningTime="2026-03-14 09:52:02.983598405 +0000 UTC m=+5058.232050042" Mar 14 09:52:03 crc kubenswrapper[4886]: I0314 09:52:03.423022 4886 scope.go:117] "RemoveContainer" containerID="a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f" Mar 14 09:52:03 crc kubenswrapper[4886]: E0314 09:52:03.423276 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:52:03 crc kubenswrapper[4886]: I0314 09:52:03.969749 4886 generic.go:334] "Generic (PLEG): container finished" podID="6a71a4e8-e737-4e94-aef8-477b1570338a" containerID="1eb06df1507e9613b384ddf0af995416914ce4ba87044c1bb15066f8eebb46ce" exitCode=0 Mar 14 09:52:03 crc kubenswrapper[4886]: I0314 09:52:03.969855 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558032-mjzmw" event={"ID":"6a71a4e8-e737-4e94-aef8-477b1570338a","Type":"ContainerDied","Data":"1eb06df1507e9613b384ddf0af995416914ce4ba87044c1bb15066f8eebb46ce"} Mar 14 09:52:05 crc kubenswrapper[4886]: I0314 09:52:05.945551 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558032-mjzmw" Mar 14 09:52:05 crc kubenswrapper[4886]: I0314 09:52:05.987924 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558032-mjzmw" event={"ID":"6a71a4e8-e737-4e94-aef8-477b1570338a","Type":"ContainerDied","Data":"80f86af40fc46bd009a371d9633a1b768074096c2a61efa1b43142cb27f30290"} Mar 14 09:52:05 crc kubenswrapper[4886]: I0314 09:52:05.987978 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80f86af40fc46bd009a371d9633a1b768074096c2a61efa1b43142cb27f30290" Mar 14 09:52:05 crc kubenswrapper[4886]: I0314 09:52:05.987995 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558032-mjzmw" Mar 14 09:52:06 crc kubenswrapper[4886]: I0314 09:52:06.070612 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558026-zkkts"] Mar 14 09:52:06 crc kubenswrapper[4886]: I0314 09:52:06.078722 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsx6s\" (UniqueName: \"kubernetes.io/projected/6a71a4e8-e737-4e94-aef8-477b1570338a-kube-api-access-vsx6s\") pod \"6a71a4e8-e737-4e94-aef8-477b1570338a\" (UID: \"6a71a4e8-e737-4e94-aef8-477b1570338a\") " Mar 14 09:52:06 crc kubenswrapper[4886]: I0314 09:52:06.079997 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558026-zkkts"] Mar 14 09:52:06 crc kubenswrapper[4886]: I0314 09:52:06.093373 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a71a4e8-e737-4e94-aef8-477b1570338a-kube-api-access-vsx6s" (OuterVolumeSpecName: "kube-api-access-vsx6s") pod "6a71a4e8-e737-4e94-aef8-477b1570338a" (UID: "6a71a4e8-e737-4e94-aef8-477b1570338a"). InnerVolumeSpecName "kube-api-access-vsx6s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:52:06 crc kubenswrapper[4886]: I0314 09:52:06.182011 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsx6s\" (UniqueName: \"kubernetes.io/projected/6a71a4e8-e737-4e94-aef8-477b1570338a-kube-api-access-vsx6s\") on node \"crc\" DevicePath \"\"" Mar 14 09:52:07 crc kubenswrapper[4886]: I0314 09:52:07.432232 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="742dc010-adf1-4c5a-ade4-dd961148b6b7" path="/var/lib/kubelet/pods/742dc010-adf1-4c5a-ade4-dd961148b6b7/volumes" Mar 14 09:52:15 crc kubenswrapper[4886]: I0314 09:52:15.427349 4886 scope.go:117] "RemoveContainer" containerID="a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f" Mar 14 09:52:15 crc kubenswrapper[4886]: E0314 09:52:15.428165 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:52:20 crc kubenswrapper[4886]: I0314 09:52:20.566711 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-578pr"] Mar 14 09:52:20 crc kubenswrapper[4886]: E0314 09:52:20.567625 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a71a4e8-e737-4e94-aef8-477b1570338a" containerName="oc" Mar 14 09:52:20 crc kubenswrapper[4886]: I0314 09:52:20.567637 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a71a4e8-e737-4e94-aef8-477b1570338a" containerName="oc" Mar 14 09:52:20 crc kubenswrapper[4886]: I0314 09:52:20.567856 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a71a4e8-e737-4e94-aef8-477b1570338a" containerName="oc" Mar 14 
09:52:20 crc kubenswrapper[4886]: I0314 09:52:20.569480 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-578pr" Mar 14 09:52:20 crc kubenswrapper[4886]: I0314 09:52:20.587348 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-578pr"] Mar 14 09:52:20 crc kubenswrapper[4886]: I0314 09:52:20.690841 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5760d866-c45d-4c76-baab-d46409a42ff3-utilities\") pod \"community-operators-578pr\" (UID: \"5760d866-c45d-4c76-baab-d46409a42ff3\") " pod="openshift-marketplace/community-operators-578pr" Mar 14 09:52:20 crc kubenswrapper[4886]: I0314 09:52:20.690908 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6ls9\" (UniqueName: \"kubernetes.io/projected/5760d866-c45d-4c76-baab-d46409a42ff3-kube-api-access-c6ls9\") pod \"community-operators-578pr\" (UID: \"5760d866-c45d-4c76-baab-d46409a42ff3\") " pod="openshift-marketplace/community-operators-578pr" Mar 14 09:52:20 crc kubenswrapper[4886]: I0314 09:52:20.691066 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5760d866-c45d-4c76-baab-d46409a42ff3-catalog-content\") pod \"community-operators-578pr\" (UID: \"5760d866-c45d-4c76-baab-d46409a42ff3\") " pod="openshift-marketplace/community-operators-578pr" Mar 14 09:52:20 crc kubenswrapper[4886]: I0314 09:52:20.792768 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5760d866-c45d-4c76-baab-d46409a42ff3-catalog-content\") pod \"community-operators-578pr\" (UID: \"5760d866-c45d-4c76-baab-d46409a42ff3\") " 
pod="openshift-marketplace/community-operators-578pr" Mar 14 09:52:20 crc kubenswrapper[4886]: I0314 09:52:20.792850 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5760d866-c45d-4c76-baab-d46409a42ff3-utilities\") pod \"community-operators-578pr\" (UID: \"5760d866-c45d-4c76-baab-d46409a42ff3\") " pod="openshift-marketplace/community-operators-578pr" Mar 14 09:52:20 crc kubenswrapper[4886]: I0314 09:52:20.792884 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6ls9\" (UniqueName: \"kubernetes.io/projected/5760d866-c45d-4c76-baab-d46409a42ff3-kube-api-access-c6ls9\") pod \"community-operators-578pr\" (UID: \"5760d866-c45d-4c76-baab-d46409a42ff3\") " pod="openshift-marketplace/community-operators-578pr" Mar 14 09:52:20 crc kubenswrapper[4886]: I0314 09:52:20.793414 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5760d866-c45d-4c76-baab-d46409a42ff3-catalog-content\") pod \"community-operators-578pr\" (UID: \"5760d866-c45d-4c76-baab-d46409a42ff3\") " pod="openshift-marketplace/community-operators-578pr" Mar 14 09:52:20 crc kubenswrapper[4886]: I0314 09:52:20.793573 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5760d866-c45d-4c76-baab-d46409a42ff3-utilities\") pod \"community-operators-578pr\" (UID: \"5760d866-c45d-4c76-baab-d46409a42ff3\") " pod="openshift-marketplace/community-operators-578pr" Mar 14 09:52:20 crc kubenswrapper[4886]: I0314 09:52:20.989269 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6ls9\" (UniqueName: \"kubernetes.io/projected/5760d866-c45d-4c76-baab-d46409a42ff3-kube-api-access-c6ls9\") pod \"community-operators-578pr\" (UID: \"5760d866-c45d-4c76-baab-d46409a42ff3\") " 
pod="openshift-marketplace/community-operators-578pr" Mar 14 09:52:21 crc kubenswrapper[4886]: I0314 09:52:21.200882 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-578pr" Mar 14 09:52:21 crc kubenswrapper[4886]: I0314 09:52:21.766783 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-578pr"] Mar 14 09:52:22 crc kubenswrapper[4886]: I0314 09:52:22.140338 4886 generic.go:334] "Generic (PLEG): container finished" podID="5760d866-c45d-4c76-baab-d46409a42ff3" containerID="274aaaeb35b2c349392c76f9b1057bf0d7792cfc28bd7dd909f6a777e6616118" exitCode=0 Mar 14 09:52:22 crc kubenswrapper[4886]: I0314 09:52:22.140439 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-578pr" event={"ID":"5760d866-c45d-4c76-baab-d46409a42ff3","Type":"ContainerDied","Data":"274aaaeb35b2c349392c76f9b1057bf0d7792cfc28bd7dd909f6a777e6616118"} Mar 14 09:52:22 crc kubenswrapper[4886]: I0314 09:52:22.140721 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-578pr" event={"ID":"5760d866-c45d-4c76-baab-d46409a42ff3","Type":"ContainerStarted","Data":"59286c104891e20349f19b6d4c0e01dcf6060055b689a0455f6c623e9eda2dba"} Mar 14 09:52:23 crc kubenswrapper[4886]: I0314 09:52:23.149703 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-578pr" event={"ID":"5760d866-c45d-4c76-baab-d46409a42ff3","Type":"ContainerStarted","Data":"e2783825ea584c47cf81590d5d94748dd007458e66f47e8a3a82736c0272ee3a"} Mar 14 09:52:25 crc kubenswrapper[4886]: I0314 09:52:25.166774 4886 generic.go:334] "Generic (PLEG): container finished" podID="5760d866-c45d-4c76-baab-d46409a42ff3" containerID="e2783825ea584c47cf81590d5d94748dd007458e66f47e8a3a82736c0272ee3a" exitCode=0 Mar 14 09:52:25 crc kubenswrapper[4886]: I0314 09:52:25.166938 4886 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-578pr" event={"ID":"5760d866-c45d-4c76-baab-d46409a42ff3","Type":"ContainerDied","Data":"e2783825ea584c47cf81590d5d94748dd007458e66f47e8a3a82736c0272ee3a"} Mar 14 09:52:26 crc kubenswrapper[4886]: I0314 09:52:26.191014 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-578pr" event={"ID":"5760d866-c45d-4c76-baab-d46409a42ff3","Type":"ContainerStarted","Data":"87686c6a76cf5e71d960f994824a68bb6f3a5a866765903fcc40911c7601bca0"} Mar 14 09:52:26 crc kubenswrapper[4886]: I0314 09:52:26.217676 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-578pr" podStartSLOduration=2.740648985 podStartE2EDuration="6.217657339s" podCreationTimestamp="2026-03-14 09:52:20 +0000 UTC" firstStartedPulling="2026-03-14 09:52:22.141961124 +0000 UTC m=+5077.390412781" lastFinishedPulling="2026-03-14 09:52:25.618969498 +0000 UTC m=+5080.867421135" observedRunningTime="2026-03-14 09:52:26.208688106 +0000 UTC m=+5081.457139743" watchObservedRunningTime="2026-03-14 09:52:26.217657339 +0000 UTC m=+5081.466108976" Mar 14 09:52:29 crc kubenswrapper[4886]: I0314 09:52:29.421394 4886 scope.go:117] "RemoveContainer" containerID="a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f" Mar 14 09:52:29 crc kubenswrapper[4886]: E0314 09:52:29.422249 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:52:31 crc kubenswrapper[4886]: I0314 09:52:31.201952 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-578pr" Mar 14 09:52:31 crc kubenswrapper[4886]: I0314 09:52:31.202320 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-578pr" Mar 14 09:52:31 crc kubenswrapper[4886]: I0314 09:52:31.251879 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-578pr" Mar 14 09:52:31 crc kubenswrapper[4886]: I0314 09:52:31.317476 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-578pr" Mar 14 09:52:31 crc kubenswrapper[4886]: I0314 09:52:31.492976 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-578pr"] Mar 14 09:52:33 crc kubenswrapper[4886]: I0314 09:52:33.254234 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-578pr" podUID="5760d866-c45d-4c76-baab-d46409a42ff3" containerName="registry-server" containerID="cri-o://87686c6a76cf5e71d960f994824a68bb6f3a5a866765903fcc40911c7601bca0" gracePeriod=2 Mar 14 09:52:33 crc kubenswrapper[4886]: I0314 09:52:33.716092 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-578pr" Mar 14 09:52:33 crc kubenswrapper[4886]: I0314 09:52:33.882236 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5760d866-c45d-4c76-baab-d46409a42ff3-catalog-content\") pod \"5760d866-c45d-4c76-baab-d46409a42ff3\" (UID: \"5760d866-c45d-4c76-baab-d46409a42ff3\") " Mar 14 09:52:33 crc kubenswrapper[4886]: I0314 09:52:33.882422 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6ls9\" (UniqueName: \"kubernetes.io/projected/5760d866-c45d-4c76-baab-d46409a42ff3-kube-api-access-c6ls9\") pod \"5760d866-c45d-4c76-baab-d46409a42ff3\" (UID: \"5760d866-c45d-4c76-baab-d46409a42ff3\") " Mar 14 09:52:33 crc kubenswrapper[4886]: I0314 09:52:33.882561 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5760d866-c45d-4c76-baab-d46409a42ff3-utilities\") pod \"5760d866-c45d-4c76-baab-d46409a42ff3\" (UID: \"5760d866-c45d-4c76-baab-d46409a42ff3\") " Mar 14 09:52:33 crc kubenswrapper[4886]: I0314 09:52:33.883297 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5760d866-c45d-4c76-baab-d46409a42ff3-utilities" (OuterVolumeSpecName: "utilities") pod "5760d866-c45d-4c76-baab-d46409a42ff3" (UID: "5760d866-c45d-4c76-baab-d46409a42ff3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:52:33 crc kubenswrapper[4886]: I0314 09:52:33.889355 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5760d866-c45d-4c76-baab-d46409a42ff3-kube-api-access-c6ls9" (OuterVolumeSpecName: "kube-api-access-c6ls9") pod "5760d866-c45d-4c76-baab-d46409a42ff3" (UID: "5760d866-c45d-4c76-baab-d46409a42ff3"). InnerVolumeSpecName "kube-api-access-c6ls9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:52:33 crc kubenswrapper[4886]: I0314 09:52:33.950779 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5760d866-c45d-4c76-baab-d46409a42ff3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5760d866-c45d-4c76-baab-d46409a42ff3" (UID: "5760d866-c45d-4c76-baab-d46409a42ff3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:52:33 crc kubenswrapper[4886]: I0314 09:52:33.985356 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5760d866-c45d-4c76-baab-d46409a42ff3-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:52:33 crc kubenswrapper[4886]: I0314 09:52:33.985396 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5760d866-c45d-4c76-baab-d46409a42ff3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:52:33 crc kubenswrapper[4886]: I0314 09:52:33.985408 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6ls9\" (UniqueName: \"kubernetes.io/projected/5760d866-c45d-4c76-baab-d46409a42ff3-kube-api-access-c6ls9\") on node \"crc\" DevicePath \"\"" Mar 14 09:52:34 crc kubenswrapper[4886]: I0314 09:52:34.269014 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-578pr" Mar 14 09:52:34 crc kubenswrapper[4886]: I0314 09:52:34.269152 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-578pr" event={"ID":"5760d866-c45d-4c76-baab-d46409a42ff3","Type":"ContainerDied","Data":"87686c6a76cf5e71d960f994824a68bb6f3a5a866765903fcc40911c7601bca0"} Mar 14 09:52:34 crc kubenswrapper[4886]: I0314 09:52:34.269220 4886 scope.go:117] "RemoveContainer" containerID="87686c6a76cf5e71d960f994824a68bb6f3a5a866765903fcc40911c7601bca0" Mar 14 09:52:34 crc kubenswrapper[4886]: I0314 09:52:34.282221 4886 generic.go:334] "Generic (PLEG): container finished" podID="5760d866-c45d-4c76-baab-d46409a42ff3" containerID="87686c6a76cf5e71d960f994824a68bb6f3a5a866765903fcc40911c7601bca0" exitCode=0 Mar 14 09:52:34 crc kubenswrapper[4886]: I0314 09:52:34.282273 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-578pr" event={"ID":"5760d866-c45d-4c76-baab-d46409a42ff3","Type":"ContainerDied","Data":"59286c104891e20349f19b6d4c0e01dcf6060055b689a0455f6c623e9eda2dba"} Mar 14 09:52:34 crc kubenswrapper[4886]: I0314 09:52:34.307964 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-578pr"] Mar 14 09:52:34 crc kubenswrapper[4886]: I0314 09:52:34.319873 4886 scope.go:117] "RemoveContainer" containerID="e2783825ea584c47cf81590d5d94748dd007458e66f47e8a3a82736c0272ee3a" Mar 14 09:52:34 crc kubenswrapper[4886]: I0314 09:52:34.320161 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-578pr"] Mar 14 09:52:34 crc kubenswrapper[4886]: I0314 09:52:34.344245 4886 scope.go:117] "RemoveContainer" containerID="274aaaeb35b2c349392c76f9b1057bf0d7792cfc28bd7dd909f6a777e6616118" Mar 14 09:52:34 crc kubenswrapper[4886]: I0314 09:52:34.382938 4886 scope.go:117] "RemoveContainer" 
containerID="87686c6a76cf5e71d960f994824a68bb6f3a5a866765903fcc40911c7601bca0" Mar 14 09:52:34 crc kubenswrapper[4886]: E0314 09:52:34.383755 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87686c6a76cf5e71d960f994824a68bb6f3a5a866765903fcc40911c7601bca0\": container with ID starting with 87686c6a76cf5e71d960f994824a68bb6f3a5a866765903fcc40911c7601bca0 not found: ID does not exist" containerID="87686c6a76cf5e71d960f994824a68bb6f3a5a866765903fcc40911c7601bca0" Mar 14 09:52:34 crc kubenswrapper[4886]: I0314 09:52:34.383807 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87686c6a76cf5e71d960f994824a68bb6f3a5a866765903fcc40911c7601bca0"} err="failed to get container status \"87686c6a76cf5e71d960f994824a68bb6f3a5a866765903fcc40911c7601bca0\": rpc error: code = NotFound desc = could not find container \"87686c6a76cf5e71d960f994824a68bb6f3a5a866765903fcc40911c7601bca0\": container with ID starting with 87686c6a76cf5e71d960f994824a68bb6f3a5a866765903fcc40911c7601bca0 not found: ID does not exist" Mar 14 09:52:34 crc kubenswrapper[4886]: I0314 09:52:34.383832 4886 scope.go:117] "RemoveContainer" containerID="e2783825ea584c47cf81590d5d94748dd007458e66f47e8a3a82736c0272ee3a" Mar 14 09:52:34 crc kubenswrapper[4886]: E0314 09:52:34.384090 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2783825ea584c47cf81590d5d94748dd007458e66f47e8a3a82736c0272ee3a\": container with ID starting with e2783825ea584c47cf81590d5d94748dd007458e66f47e8a3a82736c0272ee3a not found: ID does not exist" containerID="e2783825ea584c47cf81590d5d94748dd007458e66f47e8a3a82736c0272ee3a" Mar 14 09:52:34 crc kubenswrapper[4886]: I0314 09:52:34.384143 4886 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e2783825ea584c47cf81590d5d94748dd007458e66f47e8a3a82736c0272ee3a"} err="failed to get container status \"e2783825ea584c47cf81590d5d94748dd007458e66f47e8a3a82736c0272ee3a\": rpc error: code = NotFound desc = could not find container \"e2783825ea584c47cf81590d5d94748dd007458e66f47e8a3a82736c0272ee3a\": container with ID starting with e2783825ea584c47cf81590d5d94748dd007458e66f47e8a3a82736c0272ee3a not found: ID does not exist" Mar 14 09:52:34 crc kubenswrapper[4886]: I0314 09:52:34.384156 4886 scope.go:117] "RemoveContainer" containerID="274aaaeb35b2c349392c76f9b1057bf0d7792cfc28bd7dd909f6a777e6616118" Mar 14 09:52:34 crc kubenswrapper[4886]: E0314 09:52:34.384574 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"274aaaeb35b2c349392c76f9b1057bf0d7792cfc28bd7dd909f6a777e6616118\": container with ID starting with 274aaaeb35b2c349392c76f9b1057bf0d7792cfc28bd7dd909f6a777e6616118 not found: ID does not exist" containerID="274aaaeb35b2c349392c76f9b1057bf0d7792cfc28bd7dd909f6a777e6616118" Mar 14 09:52:34 crc kubenswrapper[4886]: I0314 09:52:34.384625 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"274aaaeb35b2c349392c76f9b1057bf0d7792cfc28bd7dd909f6a777e6616118"} err="failed to get container status \"274aaaeb35b2c349392c76f9b1057bf0d7792cfc28bd7dd909f6a777e6616118\": rpc error: code = NotFound desc = could not find container \"274aaaeb35b2c349392c76f9b1057bf0d7792cfc28bd7dd909f6a777e6616118\": container with ID starting with 274aaaeb35b2c349392c76f9b1057bf0d7792cfc28bd7dd909f6a777e6616118 not found: ID does not exist" Mar 14 09:52:35 crc kubenswrapper[4886]: I0314 09:52:35.442704 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5760d866-c45d-4c76-baab-d46409a42ff3" path="/var/lib/kubelet/pods/5760d866-c45d-4c76-baab-d46409a42ff3/volumes" Mar 14 09:52:40 crc kubenswrapper[4886]: I0314 
09:52:40.421975 4886 scope.go:117] "RemoveContainer" containerID="a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f" Mar 14 09:52:40 crc kubenswrapper[4886]: E0314 09:52:40.424453 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:52:40 crc kubenswrapper[4886]: I0314 09:52:40.890578 4886 scope.go:117] "RemoveContainer" containerID="d31c5bf949c0cf07947cf473a65c5ebb6e510151260cf0439cbec88718175683" Mar 14 09:52:46 crc kubenswrapper[4886]: I0314 09:52:46.451926 4886 generic.go:334] "Generic (PLEG): container finished" podID="676ef65a-33c2-4cdf-b13c-65e723828011" containerID="783fd0a1eeac25cf22018a8b750437d3a9371e7233b50896f12ea81e1baf1acd" exitCode=0 Mar 14 09:52:46 crc kubenswrapper[4886]: I0314 09:52:46.452032 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tffgj/crc-debug-lrzng" event={"ID":"676ef65a-33c2-4cdf-b13c-65e723828011","Type":"ContainerDied","Data":"783fd0a1eeac25cf22018a8b750437d3a9371e7233b50896f12ea81e1baf1acd"} Mar 14 09:52:47 crc kubenswrapper[4886]: I0314 09:52:47.576341 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tffgj/crc-debug-lrzng" Mar 14 09:52:47 crc kubenswrapper[4886]: I0314 09:52:47.627413 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-tffgj/crc-debug-lrzng"] Mar 14 09:52:47 crc kubenswrapper[4886]: I0314 09:52:47.639637 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-tffgj/crc-debug-lrzng"] Mar 14 09:52:47 crc kubenswrapper[4886]: I0314 09:52:47.718228 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j599n\" (UniqueName: \"kubernetes.io/projected/676ef65a-33c2-4cdf-b13c-65e723828011-kube-api-access-j599n\") pod \"676ef65a-33c2-4cdf-b13c-65e723828011\" (UID: \"676ef65a-33c2-4cdf-b13c-65e723828011\") " Mar 14 09:52:47 crc kubenswrapper[4886]: I0314 09:52:47.718344 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/676ef65a-33c2-4cdf-b13c-65e723828011-host\") pod \"676ef65a-33c2-4cdf-b13c-65e723828011\" (UID: \"676ef65a-33c2-4cdf-b13c-65e723828011\") " Mar 14 09:52:47 crc kubenswrapper[4886]: I0314 09:52:47.718474 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/676ef65a-33c2-4cdf-b13c-65e723828011-host" (OuterVolumeSpecName: "host") pod "676ef65a-33c2-4cdf-b13c-65e723828011" (UID: "676ef65a-33c2-4cdf-b13c-65e723828011"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:52:47 crc kubenswrapper[4886]: I0314 09:52:47.718794 4886 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/676ef65a-33c2-4cdf-b13c-65e723828011-host\") on node \"crc\" DevicePath \"\"" Mar 14 09:52:47 crc kubenswrapper[4886]: I0314 09:52:47.725261 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/676ef65a-33c2-4cdf-b13c-65e723828011-kube-api-access-j599n" (OuterVolumeSpecName: "kube-api-access-j599n") pod "676ef65a-33c2-4cdf-b13c-65e723828011" (UID: "676ef65a-33c2-4cdf-b13c-65e723828011"). InnerVolumeSpecName "kube-api-access-j599n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:52:47 crc kubenswrapper[4886]: I0314 09:52:47.821912 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j599n\" (UniqueName: \"kubernetes.io/projected/676ef65a-33c2-4cdf-b13c-65e723828011-kube-api-access-j599n\") on node \"crc\" DevicePath \"\"" Mar 14 09:52:48 crc kubenswrapper[4886]: I0314 09:52:48.474076 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a93f9f0aaede32b80a3932a619c0fffb1d4647761ab64218df95ab1606c2de2" Mar 14 09:52:48 crc kubenswrapper[4886]: I0314 09:52:48.474213 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tffgj/crc-debug-lrzng" Mar 14 09:52:48 crc kubenswrapper[4886]: E0314 09:52:48.681966 4886 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod676ef65a_33c2_4cdf_b13c_65e723828011.slice/crio-6a93f9f0aaede32b80a3932a619c0fffb1d4647761ab64218df95ab1606c2de2\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod676ef65a_33c2_4cdf_b13c_65e723828011.slice\": RecentStats: unable to find data in memory cache]" Mar 14 09:52:48 crc kubenswrapper[4886]: I0314 09:52:48.785576 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tffgj/crc-debug-gfg8k"] Mar 14 09:52:48 crc kubenswrapper[4886]: E0314 09:52:48.786579 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5760d866-c45d-4c76-baab-d46409a42ff3" containerName="registry-server" Mar 14 09:52:48 crc kubenswrapper[4886]: I0314 09:52:48.786600 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="5760d866-c45d-4c76-baab-d46409a42ff3" containerName="registry-server" Mar 14 09:52:48 crc kubenswrapper[4886]: E0314 09:52:48.786614 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="676ef65a-33c2-4cdf-b13c-65e723828011" containerName="container-00" Mar 14 09:52:48 crc kubenswrapper[4886]: I0314 09:52:48.786629 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="676ef65a-33c2-4cdf-b13c-65e723828011" containerName="container-00" Mar 14 09:52:48 crc kubenswrapper[4886]: E0314 09:52:48.786649 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5760d866-c45d-4c76-baab-d46409a42ff3" containerName="extract-content" Mar 14 09:52:48 crc kubenswrapper[4886]: I0314 09:52:48.786657 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="5760d866-c45d-4c76-baab-d46409a42ff3" containerName="extract-content" 
Mar 14 09:52:48 crc kubenswrapper[4886]: E0314 09:52:48.786668 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5760d866-c45d-4c76-baab-d46409a42ff3" containerName="extract-utilities" Mar 14 09:52:48 crc kubenswrapper[4886]: I0314 09:52:48.786697 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="5760d866-c45d-4c76-baab-d46409a42ff3" containerName="extract-utilities" Mar 14 09:52:48 crc kubenswrapper[4886]: I0314 09:52:48.786910 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="676ef65a-33c2-4cdf-b13c-65e723828011" containerName="container-00" Mar 14 09:52:48 crc kubenswrapper[4886]: I0314 09:52:48.786940 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="5760d866-c45d-4c76-baab-d46409a42ff3" containerName="registry-server" Mar 14 09:52:48 crc kubenswrapper[4886]: I0314 09:52:48.787593 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tffgj/crc-debug-gfg8k" Mar 14 09:52:48 crc kubenswrapper[4886]: I0314 09:52:48.947484 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fklzm\" (UniqueName: \"kubernetes.io/projected/4ed0966f-6fb1-4849-bbf8-9e2f6894e332-kube-api-access-fklzm\") pod \"crc-debug-gfg8k\" (UID: \"4ed0966f-6fb1-4849-bbf8-9e2f6894e332\") " pod="openshift-must-gather-tffgj/crc-debug-gfg8k" Mar 14 09:52:48 crc kubenswrapper[4886]: I0314 09:52:48.947535 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ed0966f-6fb1-4849-bbf8-9e2f6894e332-host\") pod \"crc-debug-gfg8k\" (UID: \"4ed0966f-6fb1-4849-bbf8-9e2f6894e332\") " pod="openshift-must-gather-tffgj/crc-debug-gfg8k" Mar 14 09:52:49 crc kubenswrapper[4886]: I0314 09:52:49.049949 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fklzm\" (UniqueName: 
\"kubernetes.io/projected/4ed0966f-6fb1-4849-bbf8-9e2f6894e332-kube-api-access-fklzm\") pod \"crc-debug-gfg8k\" (UID: \"4ed0966f-6fb1-4849-bbf8-9e2f6894e332\") " pod="openshift-must-gather-tffgj/crc-debug-gfg8k" Mar 14 09:52:49 crc kubenswrapper[4886]: I0314 09:52:49.050063 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ed0966f-6fb1-4849-bbf8-9e2f6894e332-host\") pod \"crc-debug-gfg8k\" (UID: \"4ed0966f-6fb1-4849-bbf8-9e2f6894e332\") " pod="openshift-must-gather-tffgj/crc-debug-gfg8k" Mar 14 09:52:49 crc kubenswrapper[4886]: I0314 09:52:49.050257 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ed0966f-6fb1-4849-bbf8-9e2f6894e332-host\") pod \"crc-debug-gfg8k\" (UID: \"4ed0966f-6fb1-4849-bbf8-9e2f6894e332\") " pod="openshift-must-gather-tffgj/crc-debug-gfg8k" Mar 14 09:52:49 crc kubenswrapper[4886]: I0314 09:52:49.071778 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fklzm\" (UniqueName: \"kubernetes.io/projected/4ed0966f-6fb1-4849-bbf8-9e2f6894e332-kube-api-access-fklzm\") pod \"crc-debug-gfg8k\" (UID: \"4ed0966f-6fb1-4849-bbf8-9e2f6894e332\") " pod="openshift-must-gather-tffgj/crc-debug-gfg8k" Mar 14 09:52:49 crc kubenswrapper[4886]: I0314 09:52:49.111735 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tffgj/crc-debug-gfg8k" Mar 14 09:52:49 crc kubenswrapper[4886]: W0314 09:52:49.143369 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ed0966f_6fb1_4849_bbf8_9e2f6894e332.slice/crio-2d5911ffba38d28c610093b66b21656494c142fc82ab9ed00d1fffa12cc095c8 WatchSource:0}: Error finding container 2d5911ffba38d28c610093b66b21656494c142fc82ab9ed00d1fffa12cc095c8: Status 404 returned error can't find the container with id 2d5911ffba38d28c610093b66b21656494c142fc82ab9ed00d1fffa12cc095c8 Mar 14 09:52:49 crc kubenswrapper[4886]: I0314 09:52:49.431209 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="676ef65a-33c2-4cdf-b13c-65e723828011" path="/var/lib/kubelet/pods/676ef65a-33c2-4cdf-b13c-65e723828011/volumes" Mar 14 09:52:49 crc kubenswrapper[4886]: I0314 09:52:49.483587 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tffgj/crc-debug-gfg8k" event={"ID":"4ed0966f-6fb1-4849-bbf8-9e2f6894e332","Type":"ContainerStarted","Data":"2dd1691d68a262e871ca81073c2279c75e3b164cb7a5e1a3d2131dc093a63a58"} Mar 14 09:52:49 crc kubenswrapper[4886]: I0314 09:52:49.483631 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tffgj/crc-debug-gfg8k" event={"ID":"4ed0966f-6fb1-4849-bbf8-9e2f6894e332","Type":"ContainerStarted","Data":"2d5911ffba38d28c610093b66b21656494c142fc82ab9ed00d1fffa12cc095c8"} Mar 14 09:52:49 crc kubenswrapper[4886]: I0314 09:52:49.508031 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tffgj/crc-debug-gfg8k" podStartSLOduration=1.5080106770000001 podStartE2EDuration="1.508010677s" podCreationTimestamp="2026-03-14 09:52:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 09:52:49.498403185 +0000 UTC 
m=+5104.746854822" watchObservedRunningTime="2026-03-14 09:52:49.508010677 +0000 UTC m=+5104.756462314" Mar 14 09:52:50 crc kubenswrapper[4886]: I0314 09:52:50.494829 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tffgj/crc-debug-gfg8k" event={"ID":"4ed0966f-6fb1-4849-bbf8-9e2f6894e332","Type":"ContainerDied","Data":"2dd1691d68a262e871ca81073c2279c75e3b164cb7a5e1a3d2131dc093a63a58"} Mar 14 09:52:50 crc kubenswrapper[4886]: I0314 09:52:50.495389 4886 generic.go:334] "Generic (PLEG): container finished" podID="4ed0966f-6fb1-4849-bbf8-9e2f6894e332" containerID="2dd1691d68a262e871ca81073c2279c75e3b164cb7a5e1a3d2131dc093a63a58" exitCode=0 Mar 14 09:52:51 crc kubenswrapper[4886]: I0314 09:52:51.621311 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tffgj/crc-debug-gfg8k" Mar 14 09:52:51 crc kubenswrapper[4886]: I0314 09:52:51.696201 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ed0966f-6fb1-4849-bbf8-9e2f6894e332-host\") pod \"4ed0966f-6fb1-4849-bbf8-9e2f6894e332\" (UID: \"4ed0966f-6fb1-4849-bbf8-9e2f6894e332\") " Mar 14 09:52:51 crc kubenswrapper[4886]: I0314 09:52:51.696410 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fklzm\" (UniqueName: \"kubernetes.io/projected/4ed0966f-6fb1-4849-bbf8-9e2f6894e332-kube-api-access-fklzm\") pod \"4ed0966f-6fb1-4849-bbf8-9e2f6894e332\" (UID: \"4ed0966f-6fb1-4849-bbf8-9e2f6894e332\") " Mar 14 09:52:51 crc kubenswrapper[4886]: I0314 09:52:51.698234 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ed0966f-6fb1-4849-bbf8-9e2f6894e332-host" (OuterVolumeSpecName: "host") pod "4ed0966f-6fb1-4849-bbf8-9e2f6894e332" (UID: "4ed0966f-6fb1-4849-bbf8-9e2f6894e332"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:52:51 crc kubenswrapper[4886]: I0314 09:52:51.704422 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ed0966f-6fb1-4849-bbf8-9e2f6894e332-kube-api-access-fklzm" (OuterVolumeSpecName: "kube-api-access-fklzm") pod "4ed0966f-6fb1-4849-bbf8-9e2f6894e332" (UID: "4ed0966f-6fb1-4849-bbf8-9e2f6894e332"). InnerVolumeSpecName "kube-api-access-fklzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:52:51 crc kubenswrapper[4886]: I0314 09:52:51.799301 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fklzm\" (UniqueName: \"kubernetes.io/projected/4ed0966f-6fb1-4849-bbf8-9e2f6894e332-kube-api-access-fklzm\") on node \"crc\" DevicePath \"\"" Mar 14 09:52:51 crc kubenswrapper[4886]: I0314 09:52:51.799553 4886 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ed0966f-6fb1-4849-bbf8-9e2f6894e332-host\") on node \"crc\" DevicePath \"\"" Mar 14 09:52:51 crc kubenswrapper[4886]: I0314 09:52:51.927909 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-tffgj/crc-debug-gfg8k"] Mar 14 09:52:51 crc kubenswrapper[4886]: I0314 09:52:51.935013 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-tffgj/crc-debug-gfg8k"] Mar 14 09:52:52 crc kubenswrapper[4886]: I0314 09:52:52.420550 4886 scope.go:117] "RemoveContainer" containerID="a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f" Mar 14 09:52:52 crc kubenswrapper[4886]: E0314 09:52:52.420805 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:52:52 crc kubenswrapper[4886]: I0314 09:52:52.523385 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d5911ffba38d28c610093b66b21656494c142fc82ab9ed00d1fffa12cc095c8" Mar 14 09:52:52 crc kubenswrapper[4886]: I0314 09:52:52.523450 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tffgj/crc-debug-gfg8k" Mar 14 09:52:53 crc kubenswrapper[4886]: I0314 09:52:53.096528 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tffgj/crc-debug-6lm4g"] Mar 14 09:52:53 crc kubenswrapper[4886]: E0314 09:52:53.097654 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ed0966f-6fb1-4849-bbf8-9e2f6894e332" containerName="container-00" Mar 14 09:52:53 crc kubenswrapper[4886]: I0314 09:52:53.097725 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ed0966f-6fb1-4849-bbf8-9e2f6894e332" containerName="container-00" Mar 14 09:52:53 crc kubenswrapper[4886]: I0314 09:52:53.097994 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ed0966f-6fb1-4849-bbf8-9e2f6894e332" containerName="container-00" Mar 14 09:52:53 crc kubenswrapper[4886]: I0314 09:52:53.098752 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tffgj/crc-debug-6lm4g" Mar 14 09:52:53 crc kubenswrapper[4886]: I0314 09:52:53.226159 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/462975bd-91d3-4454-b458-5f3c62842683-host\") pod \"crc-debug-6lm4g\" (UID: \"462975bd-91d3-4454-b458-5f3c62842683\") " pod="openshift-must-gather-tffgj/crc-debug-6lm4g" Mar 14 09:52:53 crc kubenswrapper[4886]: I0314 09:52:53.226374 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlngx\" (UniqueName: \"kubernetes.io/projected/462975bd-91d3-4454-b458-5f3c62842683-kube-api-access-vlngx\") pod \"crc-debug-6lm4g\" (UID: \"462975bd-91d3-4454-b458-5f3c62842683\") " pod="openshift-must-gather-tffgj/crc-debug-6lm4g" Mar 14 09:52:53 crc kubenswrapper[4886]: I0314 09:52:53.328142 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/462975bd-91d3-4454-b458-5f3c62842683-host\") pod \"crc-debug-6lm4g\" (UID: \"462975bd-91d3-4454-b458-5f3c62842683\") " pod="openshift-must-gather-tffgj/crc-debug-6lm4g" Mar 14 09:52:53 crc kubenswrapper[4886]: I0314 09:52:53.328287 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/462975bd-91d3-4454-b458-5f3c62842683-host\") pod \"crc-debug-6lm4g\" (UID: \"462975bd-91d3-4454-b458-5f3c62842683\") " pod="openshift-must-gather-tffgj/crc-debug-6lm4g" Mar 14 09:52:53 crc kubenswrapper[4886]: I0314 09:52:53.328561 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlngx\" (UniqueName: \"kubernetes.io/projected/462975bd-91d3-4454-b458-5f3c62842683-kube-api-access-vlngx\") pod \"crc-debug-6lm4g\" (UID: \"462975bd-91d3-4454-b458-5f3c62842683\") " pod="openshift-must-gather-tffgj/crc-debug-6lm4g" Mar 14 09:52:53 crc 
kubenswrapper[4886]: I0314 09:52:53.349196 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlngx\" (UniqueName: \"kubernetes.io/projected/462975bd-91d3-4454-b458-5f3c62842683-kube-api-access-vlngx\") pod \"crc-debug-6lm4g\" (UID: \"462975bd-91d3-4454-b458-5f3c62842683\") " pod="openshift-must-gather-tffgj/crc-debug-6lm4g" Mar 14 09:52:53 crc kubenswrapper[4886]: I0314 09:52:53.423780 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tffgj/crc-debug-6lm4g" Mar 14 09:52:53 crc kubenswrapper[4886]: I0314 09:52:53.433159 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ed0966f-6fb1-4849-bbf8-9e2f6894e332" path="/var/lib/kubelet/pods/4ed0966f-6fb1-4849-bbf8-9e2f6894e332/volumes" Mar 14 09:52:53 crc kubenswrapper[4886]: W0314 09:52:53.453139 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod462975bd_91d3_4454_b458_5f3c62842683.slice/crio-d735451d87629ee61aba3f19fb87def8e68b1f7bbae5680373f7769da8f3a50f WatchSource:0}: Error finding container d735451d87629ee61aba3f19fb87def8e68b1f7bbae5680373f7769da8f3a50f: Status 404 returned error can't find the container with id d735451d87629ee61aba3f19fb87def8e68b1f7bbae5680373f7769da8f3a50f Mar 14 09:52:53 crc kubenswrapper[4886]: I0314 09:52:53.539394 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tffgj/crc-debug-6lm4g" event={"ID":"462975bd-91d3-4454-b458-5f3c62842683","Type":"ContainerStarted","Data":"d735451d87629ee61aba3f19fb87def8e68b1f7bbae5680373f7769da8f3a50f"} Mar 14 09:52:54 crc kubenswrapper[4886]: I0314 09:52:54.556413 4886 generic.go:334] "Generic (PLEG): container finished" podID="462975bd-91d3-4454-b458-5f3c62842683" containerID="74b1f017983c079fa3a12503a435dc33c05a811c30eae4cd1902cb4eb9c994d0" exitCode=0 Mar 14 09:52:54 crc kubenswrapper[4886]: I0314 09:52:54.556478 4886 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tffgj/crc-debug-6lm4g" event={"ID":"462975bd-91d3-4454-b458-5f3c62842683","Type":"ContainerDied","Data":"74b1f017983c079fa3a12503a435dc33c05a811c30eae4cd1902cb4eb9c994d0"} Mar 14 09:52:54 crc kubenswrapper[4886]: I0314 09:52:54.614609 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-tffgj/crc-debug-6lm4g"] Mar 14 09:52:54 crc kubenswrapper[4886]: I0314 09:52:54.624895 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-tffgj/crc-debug-6lm4g"] Mar 14 09:52:55 crc kubenswrapper[4886]: I0314 09:52:55.685816 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tffgj/crc-debug-6lm4g" Mar 14 09:52:55 crc kubenswrapper[4886]: I0314 09:52:55.784502 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/462975bd-91d3-4454-b458-5f3c62842683-host\") pod \"462975bd-91d3-4454-b458-5f3c62842683\" (UID: \"462975bd-91d3-4454-b458-5f3c62842683\") " Mar 14 09:52:55 crc kubenswrapper[4886]: I0314 09:52:55.784587 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlngx\" (UniqueName: \"kubernetes.io/projected/462975bd-91d3-4454-b458-5f3c62842683-kube-api-access-vlngx\") pod \"462975bd-91d3-4454-b458-5f3c62842683\" (UID: \"462975bd-91d3-4454-b458-5f3c62842683\") " Mar 14 09:52:55 crc kubenswrapper[4886]: I0314 09:52:55.784885 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/462975bd-91d3-4454-b458-5f3c62842683-host" (OuterVolumeSpecName: "host") pod "462975bd-91d3-4454-b458-5f3c62842683" (UID: "462975bd-91d3-4454-b458-5f3c62842683"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 14 09:52:55 crc kubenswrapper[4886]: I0314 09:52:55.785286 4886 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/462975bd-91d3-4454-b458-5f3c62842683-host\") on node \"crc\" DevicePath \"\"" Mar 14 09:52:55 crc kubenswrapper[4886]: I0314 09:52:55.790400 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/462975bd-91d3-4454-b458-5f3c62842683-kube-api-access-vlngx" (OuterVolumeSpecName: "kube-api-access-vlngx") pod "462975bd-91d3-4454-b458-5f3c62842683" (UID: "462975bd-91d3-4454-b458-5f3c62842683"). InnerVolumeSpecName "kube-api-access-vlngx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:52:55 crc kubenswrapper[4886]: I0314 09:52:55.887144 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlngx\" (UniqueName: \"kubernetes.io/projected/462975bd-91d3-4454-b458-5f3c62842683-kube-api-access-vlngx\") on node \"crc\" DevicePath \"\"" Mar 14 09:52:56 crc kubenswrapper[4886]: I0314 09:52:56.582252 4886 scope.go:117] "RemoveContainer" containerID="74b1f017983c079fa3a12503a435dc33c05a811c30eae4cd1902cb4eb9c994d0" Mar 14 09:52:56 crc kubenswrapper[4886]: I0314 09:52:56.582312 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tffgj/crc-debug-6lm4g" Mar 14 09:52:57 crc kubenswrapper[4886]: I0314 09:52:57.433361 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="462975bd-91d3-4454-b458-5f3c62842683" path="/var/lib/kubelet/pods/462975bd-91d3-4454-b458-5f3c62842683/volumes" Mar 14 09:53:07 crc kubenswrapper[4886]: I0314 09:53:07.422564 4886 scope.go:117] "RemoveContainer" containerID="a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f" Mar 14 09:53:07 crc kubenswrapper[4886]: E0314 09:53:07.423144 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:53:19 crc kubenswrapper[4886]: I0314 09:53:19.422271 4886 scope.go:117] "RemoveContainer" containerID="a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f" Mar 14 09:53:19 crc kubenswrapper[4886]: E0314 09:53:19.424268 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:53:26 crc kubenswrapper[4886]: I0314 09:53:26.049217 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-dc48c5bd6-xmnxc_309604ae-1d2f-4f0a-9fa3-1960efc340b6/barbican-api/0.log" Mar 14 09:53:26 crc kubenswrapper[4886]: I0314 09:53:26.282572 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-dc48c5bd6-xmnxc_309604ae-1d2f-4f0a-9fa3-1960efc340b6/barbican-api-log/0.log" Mar 14 09:53:26 crc kubenswrapper[4886]: I0314 09:53:26.313288 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-756657585d-2x84b_46e268aa-326d-42d7-936d-3e4d120dfeb6/barbican-keystone-listener/0.log" Mar 14 09:53:26 crc kubenswrapper[4886]: I0314 09:53:26.354924 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-756657585d-2x84b_46e268aa-326d-42d7-936d-3e4d120dfeb6/barbican-keystone-listener-log/0.log" Mar 14 09:53:26 crc kubenswrapper[4886]: I0314 09:53:26.505785 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-8557ccd47-58ztp_42ce024b-4e1d-4f45-9faa-5f637e5a8466/barbican-worker-log/0.log" Mar 14 09:53:26 crc kubenswrapper[4886]: I0314 09:53:26.542099 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-8557ccd47-58ztp_42ce024b-4e1d-4f45-9faa-5f637e5a8466/barbican-worker/0.log" Mar 14 09:53:26 crc kubenswrapper[4886]: I0314 09:53:26.699773 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-fhrj9_bbd0a941-8eab-4742-9002-b42381f0d326/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 14 09:53:26 crc kubenswrapper[4886]: I0314 09:53:26.768019 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2e173618-1706-4fef-937c-a6e2a1d5eb30/ceilometer-central-agent/0.log" Mar 14 09:53:26 crc kubenswrapper[4886]: I0314 09:53:26.874076 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2e173618-1706-4fef-937c-a6e2a1d5eb30/ceilometer-notification-agent/0.log" Mar 14 09:53:26 crc kubenswrapper[4886]: I0314 09:53:26.967030 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_2e173618-1706-4fef-937c-a6e2a1d5eb30/proxy-httpd/0.log" Mar 14 09:53:26 crc kubenswrapper[4886]: I0314 09:53:26.984961 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2e173618-1706-4fef-937c-a6e2a1d5eb30/sg-core/0.log" Mar 14 09:53:27 crc kubenswrapper[4886]: I0314 09:53:27.174142 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_0496ade0-0884-4a88-a226-5145b6396213/cinder-api/0.log" Mar 14 09:53:27 crc kubenswrapper[4886]: I0314 09:53:27.192420 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_0496ade0-0884-4a88-a226-5145b6396213/cinder-api-log/0.log" Mar 14 09:53:27 crc kubenswrapper[4886]: I0314 09:53:27.335163 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b1d37af9-6a33-4358-9c0c-258cb011d3e4/cinder-scheduler/0.log" Mar 14 09:53:27 crc kubenswrapper[4886]: I0314 09:53:27.416567 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b1d37af9-6a33-4358-9c0c-258cb011d3e4/probe/0.log" Mar 14 09:53:27 crc kubenswrapper[4886]: I0314 09:53:27.510046 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-7bmm4_e26f4f17-b548-45cf-8781-058e7b1787d0/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 14 09:53:27 crc kubenswrapper[4886]: I0314 09:53:27.611980 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-5bggv_b138089b-be7e-480e-8c8f-37104053a419/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 14 09:53:27 crc kubenswrapper[4886]: I0314 09:53:27.722106 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-56f7ccd8f7-hk99n_dc84b7fd-15aa-4477-9145-f97680b55a4b/init/0.log" Mar 14 09:53:27 crc kubenswrapper[4886]: I0314 09:53:27.885061 
4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-56f7ccd8f7-hk99n_dc84b7fd-15aa-4477-9145-f97680b55a4b/init/0.log" Mar 14 09:53:27 crc kubenswrapper[4886]: I0314 09:53:27.955618 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-hwcqv_d6cbe588-9aee-4554-b985-c809186e86d9/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 14 09:53:28 crc kubenswrapper[4886]: I0314 09:53:28.042089 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-56f7ccd8f7-hk99n_dc84b7fd-15aa-4477-9145-f97680b55a4b/dnsmasq-dns/0.log" Mar 14 09:53:28 crc kubenswrapper[4886]: I0314 09:53:28.171363 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_90c0e7b8-3991-44c6-b013-f55286cc08ff/glance-httpd/0.log" Mar 14 09:53:28 crc kubenswrapper[4886]: I0314 09:53:28.231909 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_90c0e7b8-3991-44c6-b013-f55286cc08ff/glance-log/0.log" Mar 14 09:53:28 crc kubenswrapper[4886]: I0314 09:53:28.332219 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d9e64d41-fe1b-453e-807e-b3e94a62a804/glance-httpd/0.log" Mar 14 09:53:28 crc kubenswrapper[4886]: I0314 09:53:28.386683 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d9e64d41-fe1b-453e-807e-b3e94a62a804/glance-log/0.log" Mar 14 09:53:28 crc kubenswrapper[4886]: I0314 09:53:28.587234 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7769c88f5b-8gr9x_46272ed5-a9f5-45eb-b9ba-58289ed822a7/horizon/0.log" Mar 14 09:53:28 crc kubenswrapper[4886]: I0314 09:53:28.792704 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-xv4zm_49b11e76-2a25-43aa-a8ff-a383088da9b5/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 14 09:53:28 crc kubenswrapper[4886]: I0314 09:53:28.956200 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-rv55x_58ff1f53-8347-4a4f-9892-a8ba1d8822af/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 14 09:53:29 crc kubenswrapper[4886]: I0314 09:53:29.242364 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7769c88f5b-8gr9x_46272ed5-a9f5-45eb-b9ba-58289ed822a7/horizon-log/0.log" Mar 14 09:53:29 crc kubenswrapper[4886]: I0314 09:53:29.364789 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29557981-w64rr_942975ae-1af3-4e59-b73f-7cd6246a5f7e/keystone-cron/0.log" Mar 14 09:53:29 crc kubenswrapper[4886]: I0314 09:53:29.545605 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6d5d4f7d47-v8h75_18c3fb7c-c71e-4c67-96a2-6e9455e67182/keystone-api/0.log" Mar 14 09:53:30 crc kubenswrapper[4886]: I0314 09:53:30.134793 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_9edceed1-9561-4562-bdf8-1c2ff655a920/kube-state-metrics/0.log" Mar 14 09:53:30 crc kubenswrapper[4886]: I0314 09:53:30.210635 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-cn426_abe350ea-5335-4b70-8de1-b33c2c17c876/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 14 09:53:30 crc kubenswrapper[4886]: I0314 09:53:30.678176 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5694fd5cb9-r8nzz_5055b057-f745-43b5-8bd2-937ed8b29743/neutron-httpd/0.log" Mar 14 09:53:30 crc kubenswrapper[4886]: I0314 09:53:30.700371 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-g5mlx_ca30d3c2-97e8-4ade-b4c8-b737a405c62f/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 14 09:53:30 crc kubenswrapper[4886]: I0314 09:53:30.749582 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5694fd5cb9-r8nzz_5055b057-f745-43b5-8bd2-937ed8b29743/neutron-api/0.log" Mar 14 09:53:31 crc kubenswrapper[4886]: I0314 09:53:31.305402 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_c8d09abe-fc85-42fe-ac82-95af478b8985/nova-cell0-conductor-conductor/0.log" Mar 14 09:53:32 crc kubenswrapper[4886]: I0314 09:53:32.120484 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4d9f64aa-3e9d-422f-a81e-a22d00914728/nova-api-log/0.log" Mar 14 09:53:32 crc kubenswrapper[4886]: I0314 09:53:32.233187 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_28887cc9-3fde-46ad-b166-92d820ad7689/nova-cell1-conductor-conductor/0.log" Mar 14 09:53:32 crc kubenswrapper[4886]: I0314 09:53:32.420030 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_e3ede667-4a5f-4d40-9ad4-ab5e4678bf78/nova-cell1-novncproxy-novncproxy/0.log" Mar 14 09:53:32 crc kubenswrapper[4886]: I0314 09:53:32.765265 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4d9f64aa-3e9d-422f-a81e-a22d00914728/nova-api-api/0.log" Mar 14 09:53:32 crc kubenswrapper[4886]: I0314 09:53:32.783653 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-dl7z2_a2fc60c8-e486-4550-8b18-92a57ff62194/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 14 09:53:32 crc kubenswrapper[4886]: I0314 09:53:32.941759 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_d907a09a-1258-4ed6-99a1-095050c8c378/nova-metadata-log/0.log" Mar 14 09:53:33 crc kubenswrapper[4886]: I0314 09:53:33.329535 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_211eef94-8537-4eaa-aae0-58b9697c7fac/mysql-bootstrap/0.log" Mar 14 09:53:33 crc kubenswrapper[4886]: I0314 09:53:33.403521 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_488dc047-1241-4d80-bd37-245c32e9bdd2/nova-scheduler-scheduler/0.log" Mar 14 09:53:33 crc kubenswrapper[4886]: I0314 09:53:33.420692 4886 scope.go:117] "RemoveContainer" containerID="a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f" Mar 14 09:53:33 crc kubenswrapper[4886]: E0314 09:53:33.420947 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:53:33 crc kubenswrapper[4886]: I0314 09:53:33.562981 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_211eef94-8537-4eaa-aae0-58b9697c7fac/mysql-bootstrap/0.log" Mar 14 09:53:33 crc kubenswrapper[4886]: I0314 09:53:33.600539 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_d907a09a-1258-4ed6-99a1-095050c8c378/nova-metadata-metadata/0.log" Mar 14 09:53:33 crc kubenswrapper[4886]: I0314 09:53:33.629859 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_211eef94-8537-4eaa-aae0-58b9697c7fac/galera/0.log" Mar 14 09:53:33 crc kubenswrapper[4886]: I0314 09:53:33.810288 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b/mysql-bootstrap/0.log" Mar 14 09:53:33 crc kubenswrapper[4886]: I0314 09:53:33.975948 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b/galera/0.log" Mar 14 09:53:34 crc kubenswrapper[4886]: I0314 09:53:34.099836 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0e9aeaad-4a28-4e40-ba63-4ea9fdeaea2b/mysql-bootstrap/0.log" Mar 14 09:53:34 crc kubenswrapper[4886]: I0314 09:53:34.112531 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_65b2ef9b-5c95-4936-8c8c-2abec26e2595/openstackclient/0.log" Mar 14 09:53:34 crc kubenswrapper[4886]: I0314 09:53:34.312837 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-xt4bm_91b6fc57-0463-4654-8595-09cc5f7f0088/openstack-network-exporter/0.log" Mar 14 09:53:34 crc kubenswrapper[4886]: I0314 09:53:34.346483 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-9k99l_dee9c638-1703-4b56-b366-13c6746d035c/ovn-controller/0.log" Mar 14 09:53:34 crc kubenswrapper[4886]: I0314 09:53:34.546607 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-slbpm_fa0129a2-aafc-4df4-9376-217bc5b6ee9c/ovsdb-server-init/0.log" Mar 14 09:53:34 crc kubenswrapper[4886]: I0314 09:53:34.682301 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-slbpm_fa0129a2-aafc-4df4-9376-217bc5b6ee9c/ovsdb-server-init/0.log" Mar 14 09:53:34 crc kubenswrapper[4886]: I0314 09:53:34.743084 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-slbpm_fa0129a2-aafc-4df4-9376-217bc5b6ee9c/ovsdb-server/0.log" Mar 14 09:53:34 crc kubenswrapper[4886]: I0314 09:53:34.743596 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-slbpm_fa0129a2-aafc-4df4-9376-217bc5b6ee9c/ovs-vswitchd/0.log" Mar 14 09:53:34 crc kubenswrapper[4886]: I0314 09:53:34.911250 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-wklpz_f5062e4f-08e6-4fb3-b5f5-9938dd8633e8/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 14 09:53:34 crc kubenswrapper[4886]: I0314 09:53:34.977350 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_56f90ac1-34d1-4101-b5cc-37e76200d22a/openstack-network-exporter/0.log" Mar 14 09:53:35 crc kubenswrapper[4886]: I0314 09:53:35.116690 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_56f90ac1-34d1-4101-b5cc-37e76200d22a/ovn-northd/0.log" Mar 14 09:53:35 crc kubenswrapper[4886]: I0314 09:53:35.231688 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_014d60c5-6fb9-4259-8f9c-3ff44ff6781c/openstack-network-exporter/0.log" Mar 14 09:53:35 crc kubenswrapper[4886]: I0314 09:53:35.235876 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_014d60c5-6fb9-4259-8f9c-3ff44ff6781c/ovsdbserver-nb/0.log" Mar 14 09:53:35 crc kubenswrapper[4886]: I0314 09:53:35.413741 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f12a35c8-acb9-4410-9ebe-112b7c51885e/openstack-network-exporter/0.log" Mar 14 09:53:35 crc kubenswrapper[4886]: I0314 09:53:35.448961 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f12a35c8-acb9-4410-9ebe-112b7c51885e/ovsdbserver-sb/0.log" Mar 14 09:53:35 crc kubenswrapper[4886]: I0314 09:53:35.806914 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-57dfd898bd-kzdvs_c7681da8-2f5d-4ac8-ae5e-6549a3d3f764/placement-api/0.log" Mar 14 09:53:35 crc kubenswrapper[4886]: I0314 09:53:35.822597 4886 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6d632d59-7754-43d9-9a6a-1e818a26a715/init-config-reloader/0.log" Mar 14 09:53:35 crc kubenswrapper[4886]: I0314 09:53:35.941612 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-57dfd898bd-kzdvs_c7681da8-2f5d-4ac8-ae5e-6549a3d3f764/placement-log/0.log" Mar 14 09:53:36 crc kubenswrapper[4886]: I0314 09:53:36.009671 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6d632d59-7754-43d9-9a6a-1e818a26a715/init-config-reloader/0.log" Mar 14 09:53:36 crc kubenswrapper[4886]: I0314 09:53:36.021586 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6d632d59-7754-43d9-9a6a-1e818a26a715/config-reloader/0.log" Mar 14 09:53:36 crc kubenswrapper[4886]: I0314 09:53:36.073037 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6d632d59-7754-43d9-9a6a-1e818a26a715/prometheus/0.log" Mar 14 09:53:36 crc kubenswrapper[4886]: I0314 09:53:36.251177 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f866eae5-fb12-4734-8906-aa868da61dd5/setup-container/0.log" Mar 14 09:53:36 crc kubenswrapper[4886]: I0314 09:53:36.253074 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6d632d59-7754-43d9-9a6a-1e818a26a715/thanos-sidecar/0.log" Mar 14 09:53:36 crc kubenswrapper[4886]: I0314 09:53:36.514332 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f866eae5-fb12-4734-8906-aa868da61dd5/setup-container/0.log" Mar 14 09:53:36 crc kubenswrapper[4886]: I0314 09:53:36.541278 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f866eae5-fb12-4734-8906-aa868da61dd5/rabbitmq/0.log" Mar 14 09:53:36 crc kubenswrapper[4886]: I0314 09:53:36.596467 4886 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4aa00f0b-8e91-4a74-88de-56f7ecf55ee5/setup-container/0.log" Mar 14 09:53:36 crc kubenswrapper[4886]: I0314 09:53:36.836357 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4aa00f0b-8e91-4a74-88de-56f7ecf55ee5/setup-container/0.log" Mar 14 09:53:36 crc kubenswrapper[4886]: I0314 09:53:36.870503 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4aa00f0b-8e91-4a74-88de-56f7ecf55ee5/rabbitmq/0.log" Mar 14 09:53:36 crc kubenswrapper[4886]: I0314 09:53:36.954670 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-cf6ck_b0b3b087-69b5-4953-a239-aade9af83aaa/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 14 09:53:37 crc kubenswrapper[4886]: I0314 09:53:37.095997 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-rwpq4_baedaad9-0945-4c50-9ca1-aa71c90e3298/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 14 09:53:37 crc kubenswrapper[4886]: I0314 09:53:37.201959 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-v89nz_a9632d51-2405-4118-a547-fc6a0e6e5c42/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 14 09:53:37 crc kubenswrapper[4886]: I0314 09:53:37.321310 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-sp629_b1f2ef66-1863-484e-ba0e-d2ed4663453d/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 14 09:53:37 crc kubenswrapper[4886]: I0314 09:53:37.432746 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-qgkmx_02fe85b9-cfc9-455d-ac92-ef1a3c16f729/ssh-known-hosts-edpm-deployment/0.log" Mar 14 09:53:37 crc kubenswrapper[4886]: I0314 09:53:37.681720 4886 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-proxy-768658d555-ttc2f_69b57a2a-532a-4728-89d6-090f17edc7a7/proxy-server/0.log" Mar 14 09:53:37 crc kubenswrapper[4886]: I0314 09:53:37.796703 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-f4mxt_8658a67d-fe04-40af-a495-bb3d50c9a9db/swift-ring-rebalance/0.log" Mar 14 09:53:37 crc kubenswrapper[4886]: I0314 09:53:37.815539 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-768658d555-ttc2f_69b57a2a-532a-4728-89d6-090f17edc7a7/proxy-httpd/0.log" Mar 14 09:53:37 crc kubenswrapper[4886]: I0314 09:53:37.921706 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_17dec532-a4ac-466d-9b81-29a3e27c33bb/memcached/0.log" Mar 14 09:53:37 crc kubenswrapper[4886]: I0314 09:53:37.958084 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b13c5527-179a-440c-bca1-379cab773854/account-auditor/0.log" Mar 14 09:53:38 crc kubenswrapper[4886]: I0314 09:53:38.039530 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b13c5527-179a-440c-bca1-379cab773854/account-reaper/0.log" Mar 14 09:53:38 crc kubenswrapper[4886]: I0314 09:53:38.109004 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b13c5527-179a-440c-bca1-379cab773854/account-replicator/0.log" Mar 14 09:53:38 crc kubenswrapper[4886]: I0314 09:53:38.129369 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b13c5527-179a-440c-bca1-379cab773854/account-server/0.log" Mar 14 09:53:38 crc kubenswrapper[4886]: I0314 09:53:38.170466 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b13c5527-179a-440c-bca1-379cab773854/container-auditor/0.log" Mar 14 09:53:38 crc kubenswrapper[4886]: I0314 09:53:38.210588 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_b13c5527-179a-440c-bca1-379cab773854/container-replicator/0.log" Mar 14 09:53:38 crc kubenswrapper[4886]: I0314 09:53:38.258438 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b13c5527-179a-440c-bca1-379cab773854/container-server/0.log" Mar 14 09:53:38 crc kubenswrapper[4886]: I0314 09:53:38.321884 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b13c5527-179a-440c-bca1-379cab773854/container-updater/0.log" Mar 14 09:53:38 crc kubenswrapper[4886]: I0314 09:53:38.351426 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b13c5527-179a-440c-bca1-379cab773854/object-auditor/0.log" Mar 14 09:53:38 crc kubenswrapper[4886]: I0314 09:53:38.419621 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b13c5527-179a-440c-bca1-379cab773854/object-expirer/0.log" Mar 14 09:53:38 crc kubenswrapper[4886]: I0314 09:53:38.461551 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b13c5527-179a-440c-bca1-379cab773854/object-replicator/0.log" Mar 14 09:53:38 crc kubenswrapper[4886]: I0314 09:53:38.487904 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b13c5527-179a-440c-bca1-379cab773854/object-server/0.log" Mar 14 09:53:38 crc kubenswrapper[4886]: I0314 09:53:38.555435 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b13c5527-179a-440c-bca1-379cab773854/object-updater/0.log" Mar 14 09:53:38 crc kubenswrapper[4886]: I0314 09:53:38.572805 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b13c5527-179a-440c-bca1-379cab773854/rsync/0.log" Mar 14 09:53:38 crc kubenswrapper[4886]: I0314 09:53:38.660643 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_b13c5527-179a-440c-bca1-379cab773854/swift-recon-cron/0.log" Mar 14 09:53:38 crc kubenswrapper[4886]: I0314 09:53:38.801627 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-8bd42_e115c62e-eb7f-41a9-a613-7523bcfc2e90/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 14 09:53:38 crc kubenswrapper[4886]: I0314 09:53:38.944846 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_c4cedac0-b804-4ea3-b548-f2871b24d70a/tempest-tests-tempest-tests-runner/0.log" Mar 14 09:53:39 crc kubenswrapper[4886]: I0314 09:53:39.001573 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_0395a385-992c-4531-b626-3f7ad78db060/test-operator-logs-container/0.log" Mar 14 09:53:39 crc kubenswrapper[4886]: I0314 09:53:39.106221 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-gh6pc_813f04db-4c33-4db3-a81a-a5617d8d460f/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 14 09:53:39 crc kubenswrapper[4886]: I0314 09:53:39.795996 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_2b369685-07a0-4802-a5ed-d6288ed9b1c3/watcher-applier/0.log" Mar 14 09:53:40 crc kubenswrapper[4886]: I0314 09:53:40.355996 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_cf474a1d-5a57-4669-8ced-7d0c8decbd70/watcher-api-log/0.log" Mar 14 09:53:40 crc kubenswrapper[4886]: I0314 09:53:40.825072 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_f77ddcfc-3cbd-49a1-8f1b-d9de60483fc5/watcher-decision-engine/0.log" Mar 14 09:53:42 crc kubenswrapper[4886]: I0314 09:53:42.590085 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_watcher-api-0_cf474a1d-5a57-4669-8ced-7d0c8decbd70/watcher-api/0.log" Mar 14 09:53:45 crc kubenswrapper[4886]: I0314 09:53:45.432545 4886 scope.go:117] "RemoveContainer" containerID="a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f" Mar 14 09:53:45 crc kubenswrapper[4886]: E0314 09:53:45.433603 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:53:57 crc kubenswrapper[4886]: I0314 09:53:57.420698 4886 scope.go:117] "RemoveContainer" containerID="a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f" Mar 14 09:53:57 crc kubenswrapper[4886]: E0314 09:53:57.421622 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:54:00 crc kubenswrapper[4886]: I0314 09:54:00.151855 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558034-gmhtq"] Mar 14 09:54:00 crc kubenswrapper[4886]: E0314 09:54:00.152687 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="462975bd-91d3-4454-b458-5f3c62842683" containerName="container-00" Mar 14 09:54:00 crc kubenswrapper[4886]: I0314 09:54:00.152701 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="462975bd-91d3-4454-b458-5f3c62842683" 
containerName="container-00" Mar 14 09:54:00 crc kubenswrapper[4886]: I0314 09:54:00.152920 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="462975bd-91d3-4454-b458-5f3c62842683" containerName="container-00" Mar 14 09:54:00 crc kubenswrapper[4886]: I0314 09:54:00.153680 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558034-gmhtq" Mar 14 09:54:00 crc kubenswrapper[4886]: I0314 09:54:00.156724 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 09:54:00 crc kubenswrapper[4886]: I0314 09:54:00.157008 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:54:00 crc kubenswrapper[4886]: I0314 09:54:00.157007 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:54:00 crc kubenswrapper[4886]: I0314 09:54:00.172804 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558034-gmhtq"] Mar 14 09:54:00 crc kubenswrapper[4886]: I0314 09:54:00.254690 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg8q2\" (UniqueName: \"kubernetes.io/projected/ede0fdd6-7212-41aa-9033-ac646f81c2de-kube-api-access-hg8q2\") pod \"auto-csr-approver-29558034-gmhtq\" (UID: \"ede0fdd6-7212-41aa-9033-ac646f81c2de\") " pod="openshift-infra/auto-csr-approver-29558034-gmhtq" Mar 14 09:54:00 crc kubenswrapper[4886]: I0314 09:54:00.358068 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg8q2\" (UniqueName: \"kubernetes.io/projected/ede0fdd6-7212-41aa-9033-ac646f81c2de-kube-api-access-hg8q2\") pod \"auto-csr-approver-29558034-gmhtq\" (UID: \"ede0fdd6-7212-41aa-9033-ac646f81c2de\") " pod="openshift-infra/auto-csr-approver-29558034-gmhtq" Mar 14 09:54:00 crc 
kubenswrapper[4886]: I0314 09:54:00.382952 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg8q2\" (UniqueName: \"kubernetes.io/projected/ede0fdd6-7212-41aa-9033-ac646f81c2de-kube-api-access-hg8q2\") pod \"auto-csr-approver-29558034-gmhtq\" (UID: \"ede0fdd6-7212-41aa-9033-ac646f81c2de\") " pod="openshift-infra/auto-csr-approver-29558034-gmhtq" Mar 14 09:54:00 crc kubenswrapper[4886]: I0314 09:54:00.486281 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558034-gmhtq" Mar 14 09:54:01 crc kubenswrapper[4886]: I0314 09:54:01.011484 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558034-gmhtq"] Mar 14 09:54:01 crc kubenswrapper[4886]: W0314 09:54:01.024310 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podede0fdd6_7212_41aa_9033_ac646f81c2de.slice/crio-716db14c65d7a3aedc3ac3d52ebe05387bd05235c2cb3bd5ccbd371851efeef5 WatchSource:0}: Error finding container 716db14c65d7a3aedc3ac3d52ebe05387bd05235c2cb3bd5ccbd371851efeef5: Status 404 returned error can't find the container with id 716db14c65d7a3aedc3ac3d52ebe05387bd05235c2cb3bd5ccbd371851efeef5 Mar 14 09:54:01 crc kubenswrapper[4886]: I0314 09:54:01.636425 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558034-gmhtq" event={"ID":"ede0fdd6-7212-41aa-9033-ac646f81c2de","Type":"ContainerStarted","Data":"716db14c65d7a3aedc3ac3d52ebe05387bd05235c2cb3bd5ccbd371851efeef5"} Mar 14 09:54:02 crc kubenswrapper[4886]: I0314 09:54:02.652164 4886 generic.go:334] "Generic (PLEG): container finished" podID="ede0fdd6-7212-41aa-9033-ac646f81c2de" containerID="e6c77489b03932a5aaf5b1fde18ad56dc49777b4c7f2293d9e90dc84b85c0155" exitCode=0 Mar 14 09:54:02 crc kubenswrapper[4886]: I0314 09:54:02.652780 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29558034-gmhtq" event={"ID":"ede0fdd6-7212-41aa-9033-ac646f81c2de","Type":"ContainerDied","Data":"e6c77489b03932a5aaf5b1fde18ad56dc49777b4c7f2293d9e90dc84b85c0155"} Mar 14 09:54:04 crc kubenswrapper[4886]: I0314 09:54:04.026885 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558034-gmhtq" Mar 14 09:54:04 crc kubenswrapper[4886]: I0314 09:54:04.042925 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hg8q2\" (UniqueName: \"kubernetes.io/projected/ede0fdd6-7212-41aa-9033-ac646f81c2de-kube-api-access-hg8q2\") pod \"ede0fdd6-7212-41aa-9033-ac646f81c2de\" (UID: \"ede0fdd6-7212-41aa-9033-ac646f81c2de\") " Mar 14 09:54:04 crc kubenswrapper[4886]: I0314 09:54:04.051457 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ede0fdd6-7212-41aa-9033-ac646f81c2de-kube-api-access-hg8q2" (OuterVolumeSpecName: "kube-api-access-hg8q2") pod "ede0fdd6-7212-41aa-9033-ac646f81c2de" (UID: "ede0fdd6-7212-41aa-9033-ac646f81c2de"). InnerVolumeSpecName "kube-api-access-hg8q2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:54:04 crc kubenswrapper[4886]: I0314 09:54:04.146598 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hg8q2\" (UniqueName: \"kubernetes.io/projected/ede0fdd6-7212-41aa-9033-ac646f81c2de-kube-api-access-hg8q2\") on node \"crc\" DevicePath \"\"" Mar 14 09:54:04 crc kubenswrapper[4886]: I0314 09:54:04.673705 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558034-gmhtq" event={"ID":"ede0fdd6-7212-41aa-9033-ac646f81c2de","Type":"ContainerDied","Data":"716db14c65d7a3aedc3ac3d52ebe05387bd05235c2cb3bd5ccbd371851efeef5"} Mar 14 09:54:04 crc kubenswrapper[4886]: I0314 09:54:04.673818 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="716db14c65d7a3aedc3ac3d52ebe05387bd05235c2cb3bd5ccbd371851efeef5" Mar 14 09:54:04 crc kubenswrapper[4886]: I0314 09:54:04.673770 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558034-gmhtq" Mar 14 09:54:05 crc kubenswrapper[4886]: I0314 09:54:05.144142 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558028-v7hqx"] Mar 14 09:54:05 crc kubenswrapper[4886]: I0314 09:54:05.157337 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558028-v7hqx"] Mar 14 09:54:05 crc kubenswrapper[4886]: I0314 09:54:05.434289 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bcf77a7-3bd8-492a-bed5-e65b2f8311e1" path="/var/lib/kubelet/pods/5bcf77a7-3bd8-492a-bed5-e65b2f8311e1/volumes" Mar 14 09:54:10 crc kubenswrapper[4886]: I0314 09:54:10.421398 4886 scope.go:117] "RemoveContainer" containerID="a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f" Mar 14 09:54:10 crc kubenswrapper[4886]: E0314 09:54:10.422905 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:54:13 crc kubenswrapper[4886]: I0314 09:54:13.042622 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g_4489283e-7c08-4b30-9efe-0d700167a4de/util/0.log" Mar 14 09:54:13 crc kubenswrapper[4886]: I0314 09:54:13.287175 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g_4489283e-7c08-4b30-9efe-0d700167a4de/util/0.log" Mar 14 09:54:13 crc kubenswrapper[4886]: I0314 09:54:13.315876 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g_4489283e-7c08-4b30-9efe-0d700167a4de/pull/0.log" Mar 14 09:54:13 crc kubenswrapper[4886]: I0314 09:54:13.400221 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g_4489283e-7c08-4b30-9efe-0d700167a4de/pull/0.log" Mar 14 09:54:13 crc kubenswrapper[4886]: I0314 09:54:13.632471 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g_4489283e-7c08-4b30-9efe-0d700167a4de/extract/0.log" Mar 14 09:54:13 crc kubenswrapper[4886]: I0314 09:54:13.708604 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g_4489283e-7c08-4b30-9efe-0d700167a4de/util/0.log" Mar 14 09:54:13 crc kubenswrapper[4886]: I0314 09:54:13.836602 4886 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0c34899c31d136eb5244838925eb376d053be9a7b0071e4a37a771132874w5g_4489283e-7c08-4b30-9efe-0d700167a4de/pull/0.log" Mar 14 09:54:14 crc kubenswrapper[4886]: I0314 09:54:14.020644 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-d47688694-ch5j6_f220ca81-44c2-4dd1-8fff-616ed5060946/manager/0.log" Mar 14 09:54:14 crc kubenswrapper[4886]: I0314 09:54:14.784297 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-n42dq_5e8f3028-4d1b-4b90-910f-84c2f9e72f45/manager/0.log" Mar 14 09:54:14 crc kubenswrapper[4886]: I0314 09:54:14.922834 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-2knnh_4ac77887-a632-454d-9460-1150a439045a/manager/0.log" Mar 14 09:54:15 crc kubenswrapper[4886]: I0314 09:54:15.073324 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-8b92g_43749f37-8afe-4259-bf55-3e7842b14a14/manager/0.log" Mar 14 09:54:15 crc kubenswrapper[4886]: I0314 09:54:15.434816 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-mpz5b_b360b11e-b7a7-4b56-969f-3bef111a22b7/manager/0.log" Mar 14 09:54:15 crc kubenswrapper[4886]: I0314 09:54:15.563339 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-t477h_f9db1eb1-8050-466c-aa04-87ee4dc1479c/manager/0.log" Mar 14 09:54:15 crc kubenswrapper[4886]: I0314 09:54:15.884040 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bc894d9b-4v86k_e4a52d95-3fd2-48ad-8d53-cb790dbf34f6/manager/0.log" Mar 14 09:54:15 crc kubenswrapper[4886]: I0314 
09:54:15.967607 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-54dc5b8f8d-nh5lk_84978a63-3814-485e-9902-7d041f79179d/manager/0.log" Mar 14 09:54:16 crc kubenswrapper[4886]: I0314 09:54:16.585554 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-57b484b4df-lclvd_ed888cc0-96e8-4507-9659-1a710d2fcb41/manager/0.log" Mar 14 09:54:16 crc kubenswrapper[4886]: I0314 09:54:16.593966 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-kc4s8_243e7e8f-27ed-4052-b11f-51887ee5d8d7/manager/0.log" Mar 14 09:54:16 crc kubenswrapper[4886]: I0314 09:54:16.721490 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5b6b6b4c9f-q4jxl_61eb1055-3710-4b6b-81d2-40206feec055/manager/0.log" Mar 14 09:54:16 crc kubenswrapper[4886]: I0314 09:54:16.825263 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-kgkzn_4ad13bb8-51ec-43a6-b061-8485881110b0/manager/0.log" Mar 14 09:54:17 crc kubenswrapper[4886]: I0314 09:54:17.046749 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7f84474648-v2dwg_8d2bbc98-8080-4e83-b31a-049a347cccb6/manager/0.log" Mar 14 09:54:17 crc kubenswrapper[4886]: I0314 09:54:17.120477 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-q8d2w_b74c6843-f5c8-465e-aa1e-350d6329567d/manager/0.log" Mar 14 09:54:17 crc kubenswrapper[4886]: I0314 09:54:17.292649 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-557ccf57b7ml6pn_7c968620-eafc-42fe-b2dc-a86b4fa845d5/manager/0.log" Mar 14 09:54:17 crc 
kubenswrapper[4886]: I0314 09:54:17.437604 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-657bc7dd6f-zd6k9_6fe04296-377c-43a6-af79-eed85f760bf9/operator/0.log" Mar 14 09:54:17 crc kubenswrapper[4886]: I0314 09:54:17.589165 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-tnc57_0941dd67-4b9c-4337-b9b1-fe329f6c22fc/registry-server/0.log" Mar 14 09:54:17 crc kubenswrapper[4886]: I0314 09:54:17.725526 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-vp756_4c49fd84-e358-4ec4-a05c-8c3728dc1824/manager/0.log" Mar 14 09:54:17 crc kubenswrapper[4886]: I0314 09:54:17.928038 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-sx68d_28c956a0-8e35-4f54-a453-f837ada794c7/manager/0.log" Mar 14 09:54:18 crc kubenswrapper[4886]: I0314 09:54:18.019310 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-2ghpz_e579f2af-b976-45fe-ac83-a23c0676eaf2/operator/0.log" Mar 14 09:54:18 crc kubenswrapper[4886]: I0314 09:54:18.306542 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-7f9cc5dd44-kh5qj_d029e96c-9b5c-4095-86bc-bcac7b633fe5/manager/0.log" Mar 14 09:54:18 crc kubenswrapper[4886]: I0314 09:54:18.477348 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-f4tbn_f24830d3-8cff-4071-952d-9065c1c39e4a/manager/0.log" Mar 14 09:54:18 crc kubenswrapper[4886]: I0314 09:54:18.595560 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6854b8b9d9-ssmck_28443df6-3421-46cb-9011-d8b47769fbfa/manager/0.log" Mar 14 09:54:18 crc 
kubenswrapper[4886]: I0314 09:54:18.759039 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7f57d95748-862jt_79e45051-6db9-4014-98f4-58bbddbb2edc/manager/0.log" Mar 14 09:54:18 crc kubenswrapper[4886]: I0314 09:54:18.941961 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-65755f6b77-wdwbk_c521acb7-75ce-466f-90b7-caf5265ed209/manager/0.log" Mar 14 09:54:25 crc kubenswrapper[4886]: I0314 09:54:25.428741 4886 scope.go:117] "RemoveContainer" containerID="a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f" Mar 14 09:54:25 crc kubenswrapper[4886]: E0314 09:54:25.429604 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:54:36 crc kubenswrapper[4886]: I0314 09:54:36.422156 4886 scope.go:117] "RemoveContainer" containerID="a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f" Mar 14 09:54:36 crc kubenswrapper[4886]: E0314 09:54:36.423456 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:54:41 crc kubenswrapper[4886]: I0314 09:54:41.069704 4886 scope.go:117] "RemoveContainer" 
containerID="7fc74959576499b1935c562a3461ec76a62eae9c5ee822da85b97a77ae866a33" Mar 14 09:54:42 crc kubenswrapper[4886]: I0314 09:54:42.722059 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-lxn5q_3c3e4726-bb4a-45be-9c3a-a791c4a42380/control-plane-machine-set-operator/0.log" Mar 14 09:54:42 crc kubenswrapper[4886]: I0314 09:54:42.876527 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-dq2b2_8107785d-acc9-4fdf-8f93-21f2b4a62c61/kube-rbac-proxy/0.log" Mar 14 09:54:42 crc kubenswrapper[4886]: I0314 09:54:42.884559 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-dq2b2_8107785d-acc9-4fdf-8f93-21f2b4a62c61/machine-api-operator/0.log" Mar 14 09:54:51 crc kubenswrapper[4886]: I0314 09:54:51.421279 4886 scope.go:117] "RemoveContainer" containerID="a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f" Mar 14 09:54:51 crc kubenswrapper[4886]: E0314 09:54:51.422144 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:54:55 crc kubenswrapper[4886]: I0314 09:54:55.111918 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-jhphs_a94da143-3ce9-4783-9a37-51dc56105745/cert-manager-controller/0.log" Mar 14 09:54:55 crc kubenswrapper[4886]: I0314 09:54:55.213743 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-4gn2q_f2da9034-3cb3-453f-acba-8e2c65138035/cert-manager-cainjector/0.log" Mar 14 09:54:55 crc kubenswrapper[4886]: I0314 09:54:55.280834 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-wzgzv_dd81d1ee-7b0d-49e7-955b-3b48b78ed81d/cert-manager-webhook/0.log" Mar 14 09:55:05 crc kubenswrapper[4886]: I0314 09:55:05.427166 4886 scope.go:117] "RemoveContainer" containerID="a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f" Mar 14 09:55:05 crc kubenswrapper[4886]: E0314 09:55:05.428583 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:55:09 crc kubenswrapper[4886]: I0314 09:55:09.548050 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-stbth_c42d4bb8-a937-4aaa-a074-378bc2f47190/nmstate-console-plugin/0.log" Mar 14 09:55:09 crc kubenswrapper[4886]: I0314 09:55:09.729000 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-zxg99_ac12ecfd-89d4-41da-a48d-b4b8758afe14/nmstate-handler/0.log" Mar 14 09:55:09 crc kubenswrapper[4886]: I0314 09:55:09.783161 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-78jgt_6ee1a88c-f687-4369-ac5a-271fccaa1374/kube-rbac-proxy/0.log" Mar 14 09:55:09 crc kubenswrapper[4886]: I0314 09:55:09.873239 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-gjn86_d45c2fd4-e4cb-4e0e-ab34-4593293d6829/nmstate-operator/0.log" Mar 14 09:55:09 crc kubenswrapper[4886]: I0314 09:55:09.897182 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-78jgt_6ee1a88c-f687-4369-ac5a-271fccaa1374/nmstate-metrics/0.log" Mar 14 09:55:10 crc kubenswrapper[4886]: I0314 09:55:10.039914 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-hjgxb_8f191943-ca71-4273-8a6d-153c6871ab56/nmstate-webhook/0.log" Mar 14 09:55:18 crc kubenswrapper[4886]: I0314 09:55:18.421383 4886 scope.go:117] "RemoveContainer" containerID="a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f" Mar 14 09:55:18 crc kubenswrapper[4886]: E0314 09:55:18.422100 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:55:23 crc kubenswrapper[4886]: I0314 09:55:23.202503 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-xv587_c5e9d625-22c2-434a-ba28-8c7d774dc4fb/prometheus-operator/0.log" Mar 14 09:55:23 crc kubenswrapper[4886]: I0314 09:55:23.322418 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-74f4f68674-5lkjj_bb0232bb-45bd-4f84-8b22-1f51604204f7/prometheus-operator-admission-webhook/0.log" Mar 14 09:55:23 crc kubenswrapper[4886]: I0314 09:55:23.394747 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-74f4f68674-mr59p_db69019c-bb09-4732-98ee-c5bb11ab7827/prometheus-operator-admission-webhook/0.log" Mar 14 09:55:23 crc kubenswrapper[4886]: I0314 09:55:23.575240 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-bxpv7_a5156b1d-96a6-46a7-8142-adc240ccd902/operator/0.log" Mar 14 09:55:23 crc kubenswrapper[4886]: I0314 09:55:23.599156 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-2k7dn_ee03958c-108f-48e8-b3ca-c3bd13bfda4a/perses-operator/0.log" Mar 14 09:55:32 crc kubenswrapper[4886]: I0314 09:55:32.421273 4886 scope.go:117] "RemoveContainer" containerID="a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f" Mar 14 09:55:32 crc kubenswrapper[4886]: E0314 09:55:32.423499 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:55:39 crc kubenswrapper[4886]: I0314 09:55:39.621351 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-qw82c_02c262ba-c8d2-4321-858a-eeb91709b8fc/kube-rbac-proxy/0.log" Mar 14 09:55:39 crc kubenswrapper[4886]: I0314 09:55:39.778862 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-qw82c_02c262ba-c8d2-4321-858a-eeb91709b8fc/controller/0.log" Mar 14 09:55:39 crc kubenswrapper[4886]: I0314 09:55:39.824091 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54lvd_a1dc9df2-9dd0-40af-9508-b65d1047b045/cp-frr-files/0.log" 
Mar 14 09:55:40 crc kubenswrapper[4886]: I0314 09:55:40.097391 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54lvd_a1dc9df2-9dd0-40af-9508-b65d1047b045/cp-frr-files/0.log" Mar 14 09:55:40 crc kubenswrapper[4886]: I0314 09:55:40.124627 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54lvd_a1dc9df2-9dd0-40af-9508-b65d1047b045/cp-metrics/0.log" Mar 14 09:55:40 crc kubenswrapper[4886]: I0314 09:55:40.148587 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54lvd_a1dc9df2-9dd0-40af-9508-b65d1047b045/cp-reloader/0.log" Mar 14 09:55:40 crc kubenswrapper[4886]: I0314 09:55:40.166577 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54lvd_a1dc9df2-9dd0-40af-9508-b65d1047b045/cp-reloader/0.log" Mar 14 09:55:40 crc kubenswrapper[4886]: I0314 09:55:40.340667 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54lvd_a1dc9df2-9dd0-40af-9508-b65d1047b045/cp-metrics/0.log" Mar 14 09:55:40 crc kubenswrapper[4886]: I0314 09:55:40.368758 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54lvd_a1dc9df2-9dd0-40af-9508-b65d1047b045/cp-frr-files/0.log" Mar 14 09:55:40 crc kubenswrapper[4886]: I0314 09:55:40.380315 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54lvd_a1dc9df2-9dd0-40af-9508-b65d1047b045/cp-reloader/0.log" Mar 14 09:55:40 crc kubenswrapper[4886]: I0314 09:55:40.588244 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54lvd_a1dc9df2-9dd0-40af-9508-b65d1047b045/cp-metrics/0.log" Mar 14 09:55:40 crc kubenswrapper[4886]: I0314 09:55:40.737263 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54lvd_a1dc9df2-9dd0-40af-9508-b65d1047b045/cp-metrics/0.log" Mar 14 09:55:40 crc kubenswrapper[4886]: I0314 09:55:40.743773 4886 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54lvd_a1dc9df2-9dd0-40af-9508-b65d1047b045/cp-reloader/0.log" Mar 14 09:55:40 crc kubenswrapper[4886]: I0314 09:55:40.752061 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54lvd_a1dc9df2-9dd0-40af-9508-b65d1047b045/cp-frr-files/0.log" Mar 14 09:55:40 crc kubenswrapper[4886]: I0314 09:55:40.809222 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54lvd_a1dc9df2-9dd0-40af-9508-b65d1047b045/controller/0.log" Mar 14 09:55:40 crc kubenswrapper[4886]: I0314 09:55:40.994834 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54lvd_a1dc9df2-9dd0-40af-9508-b65d1047b045/kube-rbac-proxy/0.log" Mar 14 09:55:41 crc kubenswrapper[4886]: I0314 09:55:41.011937 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54lvd_a1dc9df2-9dd0-40af-9508-b65d1047b045/frr-metrics/0.log" Mar 14 09:55:41 crc kubenswrapper[4886]: I0314 09:55:41.012315 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54lvd_a1dc9df2-9dd0-40af-9508-b65d1047b045/kube-rbac-proxy-frr/0.log" Mar 14 09:55:41 crc kubenswrapper[4886]: I0314 09:55:41.217027 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54lvd_a1dc9df2-9dd0-40af-9508-b65d1047b045/reloader/0.log" Mar 14 09:55:41 crc kubenswrapper[4886]: I0314 09:55:41.318575 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-9qwls_c90b844b-a223-4b46-93ea-ae5826e1d282/frr-k8s-webhook-server/0.log" Mar 14 09:55:41 crc kubenswrapper[4886]: I0314 09:55:41.531380 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-68fdc74fdb-z8p2f_69949dc3-8b82-427f-a2f9-35ca7fa9edef/manager/0.log" Mar 14 09:55:41 crc kubenswrapper[4886]: I0314 09:55:41.730274 4886 log.go:25] "Finished parsing log 
file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5d77c85fb9-g9wsd_066d6d6d-3ace-4ed9-9a47-e7accd7645c8/webhook-server/0.log" Mar 14 09:55:41 crc kubenswrapper[4886]: I0314 09:55:41.804975 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8qrlt_d03e38bf-73ce-44a1-8b84-8e23d2e33a86/kube-rbac-proxy/0.log" Mar 14 09:55:42 crc kubenswrapper[4886]: I0314 09:55:42.428795 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8qrlt_d03e38bf-73ce-44a1-8b84-8e23d2e33a86/speaker/0.log" Mar 14 09:55:42 crc kubenswrapper[4886]: I0314 09:55:42.818467 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-54lvd_a1dc9df2-9dd0-40af-9508-b65d1047b045/frr/0.log" Mar 14 09:55:44 crc kubenswrapper[4886]: I0314 09:55:44.425164 4886 scope.go:117] "RemoveContainer" containerID="a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f" Mar 14 09:55:44 crc kubenswrapper[4886]: E0314 09:55:44.426172 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:55:55 crc kubenswrapper[4886]: I0314 09:55:55.427413 4886 scope.go:117] "RemoveContainer" containerID="a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f" Mar 14 09:55:55 crc kubenswrapper[4886]: E0314 09:55:55.428776 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:55:59 crc kubenswrapper[4886]: I0314 09:55:59.148578 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm_db8bb544-5cf5-442c-adda-a2bf39bb77ee/util/0.log" Mar 14 09:55:59 crc kubenswrapper[4886]: I0314 09:55:59.349357 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm_db8bb544-5cf5-442c-adda-a2bf39bb77ee/util/0.log" Mar 14 09:55:59 crc kubenswrapper[4886]: I0314 09:55:59.352940 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm_db8bb544-5cf5-442c-adda-a2bf39bb77ee/pull/0.log" Mar 14 09:55:59 crc kubenswrapper[4886]: I0314 09:55:59.375829 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm_db8bb544-5cf5-442c-adda-a2bf39bb77ee/pull/0.log" Mar 14 09:55:59 crc kubenswrapper[4886]: I0314 09:55:59.560560 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm_db8bb544-5cf5-442c-adda-a2bf39bb77ee/pull/0.log" Mar 14 09:55:59 crc kubenswrapper[4886]: I0314 09:55:59.598637 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm_db8bb544-5cf5-442c-adda-a2bf39bb77ee/extract/0.log" Mar 14 09:55:59 crc kubenswrapper[4886]: I0314 09:55:59.606558 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874gblzm_db8bb544-5cf5-442c-adda-a2bf39bb77ee/util/0.log" Mar 14 09:55:59 crc kubenswrapper[4886]: I0314 09:55:59.732541 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl_2346929a-43ff-4471-a6cd-ff439f1e69f0/util/0.log" Mar 14 09:55:59 crc kubenswrapper[4886]: I0314 09:55:59.892342 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl_2346929a-43ff-4471-a6cd-ff439f1e69f0/util/0.log" Mar 14 09:55:59 crc kubenswrapper[4886]: I0314 09:55:59.893676 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl_2346929a-43ff-4471-a6cd-ff439f1e69f0/pull/0.log" Mar 14 09:55:59 crc kubenswrapper[4886]: I0314 09:55:59.895445 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl_2346929a-43ff-4471-a6cd-ff439f1e69f0/pull/0.log" Mar 14 09:56:00 crc kubenswrapper[4886]: I0314 09:56:00.040952 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl_2346929a-43ff-4471-a6cd-ff439f1e69f0/util/0.log" Mar 14 09:56:00 crc kubenswrapper[4886]: I0314 09:56:00.057499 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl_2346929a-43ff-4471-a6cd-ff439f1e69f0/pull/0.log" Mar 14 09:56:00 crc kubenswrapper[4886]: I0314 09:56:00.084402 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1n6krl_2346929a-43ff-4471-a6cd-ff439f1e69f0/extract/0.log" Mar 14 
09:56:00 crc kubenswrapper[4886]: I0314 09:56:00.143620 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558036-ds65w"] Mar 14 09:56:00 crc kubenswrapper[4886]: E0314 09:56:00.144411 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ede0fdd6-7212-41aa-9033-ac646f81c2de" containerName="oc" Mar 14 09:56:00 crc kubenswrapper[4886]: I0314 09:56:00.144436 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="ede0fdd6-7212-41aa-9033-ac646f81c2de" containerName="oc" Mar 14 09:56:00 crc kubenswrapper[4886]: I0314 09:56:00.144681 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="ede0fdd6-7212-41aa-9033-ac646f81c2de" containerName="oc" Mar 14 09:56:00 crc kubenswrapper[4886]: I0314 09:56:00.145381 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558036-ds65w" Mar 14 09:56:00 crc kubenswrapper[4886]: I0314 09:56:00.147846 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:56:00 crc kubenswrapper[4886]: I0314 09:56:00.148163 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:56:00 crc kubenswrapper[4886]: I0314 09:56:00.151238 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 09:56:00 crc kubenswrapper[4886]: I0314 09:56:00.155898 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558036-ds65w"] Mar 14 09:56:00 crc kubenswrapper[4886]: I0314 09:56:00.217785 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdxpc\" (UniqueName: \"kubernetes.io/projected/41f5d35e-101a-4e0d-81fa-3011d175b80a-kube-api-access-hdxpc\") pod \"auto-csr-approver-29558036-ds65w\" (UID: 
\"41f5d35e-101a-4e0d-81fa-3011d175b80a\") " pod="openshift-infra/auto-csr-approver-29558036-ds65w" Mar 14 09:56:00 crc kubenswrapper[4886]: I0314 09:56:00.260014 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm_00bb3cfb-439a-4613-958d-c528ed85df78/util/0.log" Mar 14 09:56:00 crc kubenswrapper[4886]: I0314 09:56:00.319676 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdxpc\" (UniqueName: \"kubernetes.io/projected/41f5d35e-101a-4e0d-81fa-3011d175b80a-kube-api-access-hdxpc\") pod \"auto-csr-approver-29558036-ds65w\" (UID: \"41f5d35e-101a-4e0d-81fa-3011d175b80a\") " pod="openshift-infra/auto-csr-approver-29558036-ds65w" Mar 14 09:56:00 crc kubenswrapper[4886]: I0314 09:56:00.350448 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdxpc\" (UniqueName: \"kubernetes.io/projected/41f5d35e-101a-4e0d-81fa-3011d175b80a-kube-api-access-hdxpc\") pod \"auto-csr-approver-29558036-ds65w\" (UID: \"41f5d35e-101a-4e0d-81fa-3011d175b80a\") " pod="openshift-infra/auto-csr-approver-29558036-ds65w" Mar 14 09:56:00 crc kubenswrapper[4886]: I0314 09:56:00.484596 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm_00bb3cfb-439a-4613-958d-c528ed85df78/pull/0.log" Mar 14 09:56:00 crc kubenswrapper[4886]: I0314 09:56:00.489008 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm_00bb3cfb-439a-4613-958d-c528ed85df78/pull/0.log" Mar 14 09:56:00 crc kubenswrapper[4886]: I0314 09:56:00.501092 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm_00bb3cfb-439a-4613-958d-c528ed85df78/util/0.log" Mar 14 09:56:00 crc 
kubenswrapper[4886]: I0314 09:56:00.514290 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558036-ds65w" Mar 14 09:56:00 crc kubenswrapper[4886]: I0314 09:56:00.676716 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm_00bb3cfb-439a-4613-958d-c528ed85df78/util/0.log" Mar 14 09:56:00 crc kubenswrapper[4886]: I0314 09:56:00.700113 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm_00bb3cfb-439a-4613-958d-c528ed85df78/extract/0.log" Mar 14 09:56:00 crc kubenswrapper[4886]: I0314 09:56:00.755310 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gndxm_00bb3cfb-439a-4613-958d-c528ed85df78/pull/0.log" Mar 14 09:56:00 crc kubenswrapper[4886]: I0314 09:56:00.914561 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g5kmf_54007e16-87f1-4fed-a62b-d4936c61998c/extract-utilities/0.log" Mar 14 09:56:01 crc kubenswrapper[4886]: I0314 09:56:01.013670 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558036-ds65w"] Mar 14 09:56:01 crc kubenswrapper[4886]: I0314 09:56:01.102590 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g5kmf_54007e16-87f1-4fed-a62b-d4936c61998c/extract-utilities/0.log" Mar 14 09:56:01 crc kubenswrapper[4886]: I0314 09:56:01.150880 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g5kmf_54007e16-87f1-4fed-a62b-d4936c61998c/extract-content/0.log" Mar 14 09:56:01 crc kubenswrapper[4886]: I0314 09:56:01.164718 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-g5kmf_54007e16-87f1-4fed-a62b-d4936c61998c/extract-content/0.log" Mar 14 09:56:01 crc kubenswrapper[4886]: I0314 09:56:01.323154 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g5kmf_54007e16-87f1-4fed-a62b-d4936c61998c/extract-content/0.log" Mar 14 09:56:01 crc kubenswrapper[4886]: I0314 09:56:01.399486 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g5kmf_54007e16-87f1-4fed-a62b-d4936c61998c/extract-utilities/0.log" Mar 14 09:56:01 crc kubenswrapper[4886]: I0314 09:56:01.523840 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ld96m_c3268114-3be9-4c4c-b225-1c9024a5b341/extract-utilities/0.log" Mar 14 09:56:01 crc kubenswrapper[4886]: I0314 09:56:01.772687 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ld96m_c3268114-3be9-4c4c-b225-1c9024a5b341/extract-utilities/0.log" Mar 14 09:56:01 crc kubenswrapper[4886]: I0314 09:56:01.788605 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ld96m_c3268114-3be9-4c4c-b225-1c9024a5b341/extract-content/0.log" Mar 14 09:56:01 crc kubenswrapper[4886]: I0314 09:56:01.845539 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ld96m_c3268114-3be9-4c4c-b225-1c9024a5b341/extract-content/0.log" Mar 14 09:56:01 crc kubenswrapper[4886]: I0314 09:56:01.921233 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558036-ds65w" event={"ID":"41f5d35e-101a-4e0d-81fa-3011d175b80a","Type":"ContainerStarted","Data":"93d40eb256e474c8ec684650fbe3b65bc387fefcea39918773f6052ada597c3b"} Mar 14 09:56:02 crc kubenswrapper[4886]: I0314 09:56:02.103338 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-ld96m_c3268114-3be9-4c4c-b225-1c9024a5b341/extract-utilities/0.log" Mar 14 09:56:02 crc kubenswrapper[4886]: I0314 09:56:02.103451 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ld96m_c3268114-3be9-4c4c-b225-1c9024a5b341/extract-content/0.log" Mar 14 09:56:02 crc kubenswrapper[4886]: I0314 09:56:02.237461 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g5kmf_54007e16-87f1-4fed-a62b-d4936c61998c/registry-server/0.log" Mar 14 09:56:02 crc kubenswrapper[4886]: I0314 09:56:02.344718 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-v6gf2_f0bdc4aa-1cef-4951-9c86-47f00e9bc18b/marketplace-operator/0.log" Mar 14 09:56:02 crc kubenswrapper[4886]: I0314 09:56:02.842874 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k6gs4_ed6613ed-adc1-4752-8bbd-e01372ddae6d/extract-utilities/0.log" Mar 14 09:56:02 crc kubenswrapper[4886]: I0314 09:56:02.949543 4886 generic.go:334] "Generic (PLEG): container finished" podID="41f5d35e-101a-4e0d-81fa-3011d175b80a" containerID="6425af967c55694fe45056b5cc133baaa02adb6b2634349ba1b2c80e5d76ebe5" exitCode=0 Mar 14 09:56:02 crc kubenswrapper[4886]: I0314 09:56:02.949600 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558036-ds65w" event={"ID":"41f5d35e-101a-4e0d-81fa-3011d175b80a","Type":"ContainerDied","Data":"6425af967c55694fe45056b5cc133baaa02adb6b2634349ba1b2c80e5d76ebe5"} Mar 14 09:56:02 crc kubenswrapper[4886]: I0314 09:56:02.956581 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ld96m_c3268114-3be9-4c4c-b225-1c9024a5b341/registry-server/0.log" Mar 14 09:56:03 crc kubenswrapper[4886]: I0314 09:56:03.036010 4886 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k6gs4_ed6613ed-adc1-4752-8bbd-e01372ddae6d/extract-utilities/0.log" Mar 14 09:56:03 crc kubenswrapper[4886]: I0314 09:56:03.088968 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k6gs4_ed6613ed-adc1-4752-8bbd-e01372ddae6d/extract-content/0.log" Mar 14 09:56:03 crc kubenswrapper[4886]: I0314 09:56:03.107651 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k6gs4_ed6613ed-adc1-4752-8bbd-e01372ddae6d/extract-content/0.log" Mar 14 09:56:03 crc kubenswrapper[4886]: I0314 09:56:03.482704 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k6gs4_ed6613ed-adc1-4752-8bbd-e01372ddae6d/extract-utilities/0.log" Mar 14 09:56:03 crc kubenswrapper[4886]: I0314 09:56:03.486436 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k6gs4_ed6613ed-adc1-4752-8bbd-e01372ddae6d/extract-content/0.log" Mar 14 09:56:03 crc kubenswrapper[4886]: I0314 09:56:03.541968 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mwvfs_1586ae9d-89e0-4d92-8fbc-99a6d4ec3111/extract-utilities/0.log" Mar 14 09:56:03 crc kubenswrapper[4886]: I0314 09:56:03.650536 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k6gs4_ed6613ed-adc1-4752-8bbd-e01372ddae6d/registry-server/0.log" Mar 14 09:56:03 crc kubenswrapper[4886]: I0314 09:56:03.724884 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mwvfs_1586ae9d-89e0-4d92-8fbc-99a6d4ec3111/extract-content/0.log" Mar 14 09:56:03 crc kubenswrapper[4886]: I0314 09:56:03.726901 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-mwvfs_1586ae9d-89e0-4d92-8fbc-99a6d4ec3111/extract-utilities/0.log" Mar 14 09:56:03 crc kubenswrapper[4886]: I0314 09:56:03.762285 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mwvfs_1586ae9d-89e0-4d92-8fbc-99a6d4ec3111/extract-content/0.log" Mar 14 09:56:03 crc kubenswrapper[4886]: I0314 09:56:03.941186 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mwvfs_1586ae9d-89e0-4d92-8fbc-99a6d4ec3111/extract-utilities/0.log" Mar 14 09:56:03 crc kubenswrapper[4886]: I0314 09:56:03.941351 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mwvfs_1586ae9d-89e0-4d92-8fbc-99a6d4ec3111/extract-content/0.log" Mar 14 09:56:04 crc kubenswrapper[4886]: I0314 09:56:04.588032 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558036-ds65w" Mar 14 09:56:04 crc kubenswrapper[4886]: I0314 09:56:04.604321 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdxpc\" (UniqueName: \"kubernetes.io/projected/41f5d35e-101a-4e0d-81fa-3011d175b80a-kube-api-access-hdxpc\") pod \"41f5d35e-101a-4e0d-81fa-3011d175b80a\" (UID: \"41f5d35e-101a-4e0d-81fa-3011d175b80a\") " Mar 14 09:56:04 crc kubenswrapper[4886]: I0314 09:56:04.621363 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41f5d35e-101a-4e0d-81fa-3011d175b80a-kube-api-access-hdxpc" (OuterVolumeSpecName: "kube-api-access-hdxpc") pod "41f5d35e-101a-4e0d-81fa-3011d175b80a" (UID: "41f5d35e-101a-4e0d-81fa-3011d175b80a"). InnerVolumeSpecName "kube-api-access-hdxpc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:56:04 crc kubenswrapper[4886]: I0314 09:56:04.685745 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mwvfs_1586ae9d-89e0-4d92-8fbc-99a6d4ec3111/registry-server/0.log" Mar 14 09:56:04 crc kubenswrapper[4886]: I0314 09:56:04.706690 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdxpc\" (UniqueName: \"kubernetes.io/projected/41f5d35e-101a-4e0d-81fa-3011d175b80a-kube-api-access-hdxpc\") on node \"crc\" DevicePath \"\"" Mar 14 09:56:04 crc kubenswrapper[4886]: I0314 09:56:04.967674 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558036-ds65w" event={"ID":"41f5d35e-101a-4e0d-81fa-3011d175b80a","Type":"ContainerDied","Data":"93d40eb256e474c8ec684650fbe3b65bc387fefcea39918773f6052ada597c3b"} Mar 14 09:56:04 crc kubenswrapper[4886]: I0314 09:56:04.967710 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93d40eb256e474c8ec684650fbe3b65bc387fefcea39918773f6052ada597c3b" Mar 14 09:56:04 crc kubenswrapper[4886]: I0314 09:56:04.967712 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558036-ds65w" Mar 14 09:56:05 crc kubenswrapper[4886]: I0314 09:56:05.674176 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558030-5fj7d"] Mar 14 09:56:05 crc kubenswrapper[4886]: I0314 09:56:05.691470 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558030-5fj7d"] Mar 14 09:56:07 crc kubenswrapper[4886]: I0314 09:56:07.431508 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93d25cee-7775-463f-b63f-3bf9e5b91e9b" path="/var/lib/kubelet/pods/93d25cee-7775-463f-b63f-3bf9e5b91e9b/volumes" Mar 14 09:56:09 crc kubenswrapper[4886]: I0314 09:56:09.421243 4886 scope.go:117] "RemoveContainer" containerID="a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f" Mar 14 09:56:09 crc kubenswrapper[4886]: E0314 09:56:09.422674 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:56:17 crc kubenswrapper[4886]: I0314 09:56:17.413203 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-xv587_c5e9d625-22c2-434a-ba28-8c7d774dc4fb/prometheus-operator/0.log" Mar 14 09:56:17 crc kubenswrapper[4886]: I0314 09:56:17.464890 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-74f4f68674-5lkjj_bb0232bb-45bd-4f84-8b22-1f51604204f7/prometheus-operator-admission-webhook/0.log" Mar 14 09:56:17 crc kubenswrapper[4886]: I0314 09:56:17.473320 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-74f4f68674-mr59p_db69019c-bb09-4732-98ee-c5bb11ab7827/prometheus-operator-admission-webhook/0.log" Mar 14 09:56:17 crc kubenswrapper[4886]: I0314 09:56:17.604002 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-bxpv7_a5156b1d-96a6-46a7-8142-adc240ccd902/operator/0.log" Mar 14 09:56:17 crc kubenswrapper[4886]: I0314 09:56:17.683644 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-2k7dn_ee03958c-108f-48e8-b3ca-c3bd13bfda4a/perses-operator/0.log" Mar 14 09:56:21 crc kubenswrapper[4886]: E0314 09:56:21.174947 4886 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.38:48600->38.102.83.38:33711: write tcp 38.102.83.38:48600->38.102.83.38:33711: write: connection reset by peer Mar 14 09:56:22 crc kubenswrapper[4886]: I0314 09:56:22.420645 4886 scope.go:117] "RemoveContainer" containerID="a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f" Mar 14 09:56:22 crc kubenswrapper[4886]: E0314 09:56:22.421113 4886 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ddctv_openshift-machine-config-operator(64517238-bfef-43e1-b543-1eea5b7f9c79)\"" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" Mar 14 09:56:34 crc kubenswrapper[4886]: I0314 09:56:34.421482 4886 scope.go:117] "RemoveContainer" containerID="a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f" Mar 14 09:56:35 crc kubenswrapper[4886]: I0314 09:56:35.234602 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" 
event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerStarted","Data":"a30b0ee975fdbbbf675d125dd6dbe0cfa4a66bc19ea06b9da11724ca219da3c6"} Mar 14 09:56:41 crc kubenswrapper[4886]: I0314 09:56:41.198352 4886 scope.go:117] "RemoveContainer" containerID="d116af96eb5cceca38eee84a0eac7e8ea5641140a3c915c2304eb79d1806d979" Mar 14 09:57:16 crc kubenswrapper[4886]: I0314 09:57:16.318726 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n6h4r"] Mar 14 09:57:16 crc kubenswrapper[4886]: E0314 09:57:16.319969 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41f5d35e-101a-4e0d-81fa-3011d175b80a" containerName="oc" Mar 14 09:57:16 crc kubenswrapper[4886]: I0314 09:57:16.319988 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f5d35e-101a-4e0d-81fa-3011d175b80a" containerName="oc" Mar 14 09:57:16 crc kubenswrapper[4886]: I0314 09:57:16.320297 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="41f5d35e-101a-4e0d-81fa-3011d175b80a" containerName="oc" Mar 14 09:57:16 crc kubenswrapper[4886]: I0314 09:57:16.322619 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n6h4r" Mar 14 09:57:16 crc kubenswrapper[4886]: I0314 09:57:16.340784 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n6h4r"] Mar 14 09:57:16 crc kubenswrapper[4886]: I0314 09:57:16.378077 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9hz9\" (UniqueName: \"kubernetes.io/projected/58aa2406-819e-4b03-9039-441af73335cb-kube-api-access-m9hz9\") pod \"certified-operators-n6h4r\" (UID: \"58aa2406-819e-4b03-9039-441af73335cb\") " pod="openshift-marketplace/certified-operators-n6h4r" Mar 14 09:57:16 crc kubenswrapper[4886]: I0314 09:57:16.378169 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58aa2406-819e-4b03-9039-441af73335cb-catalog-content\") pod \"certified-operators-n6h4r\" (UID: \"58aa2406-819e-4b03-9039-441af73335cb\") " pod="openshift-marketplace/certified-operators-n6h4r" Mar 14 09:57:16 crc kubenswrapper[4886]: I0314 09:57:16.378202 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58aa2406-819e-4b03-9039-441af73335cb-utilities\") pod \"certified-operators-n6h4r\" (UID: \"58aa2406-819e-4b03-9039-441af73335cb\") " pod="openshift-marketplace/certified-operators-n6h4r" Mar 14 09:57:16 crc kubenswrapper[4886]: I0314 09:57:16.480375 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9hz9\" (UniqueName: \"kubernetes.io/projected/58aa2406-819e-4b03-9039-441af73335cb-kube-api-access-m9hz9\") pod \"certified-operators-n6h4r\" (UID: \"58aa2406-819e-4b03-9039-441af73335cb\") " pod="openshift-marketplace/certified-operators-n6h4r" Mar 14 09:57:16 crc kubenswrapper[4886]: I0314 09:57:16.480447 4886 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58aa2406-819e-4b03-9039-441af73335cb-catalog-content\") pod \"certified-operators-n6h4r\" (UID: \"58aa2406-819e-4b03-9039-441af73335cb\") " pod="openshift-marketplace/certified-operators-n6h4r" Mar 14 09:57:16 crc kubenswrapper[4886]: I0314 09:57:16.480525 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58aa2406-819e-4b03-9039-441af73335cb-utilities\") pod \"certified-operators-n6h4r\" (UID: \"58aa2406-819e-4b03-9039-441af73335cb\") " pod="openshift-marketplace/certified-operators-n6h4r" Mar 14 09:57:16 crc kubenswrapper[4886]: I0314 09:57:16.481043 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58aa2406-819e-4b03-9039-441af73335cb-utilities\") pod \"certified-operators-n6h4r\" (UID: \"58aa2406-819e-4b03-9039-441af73335cb\") " pod="openshift-marketplace/certified-operators-n6h4r" Mar 14 09:57:16 crc kubenswrapper[4886]: I0314 09:57:16.481772 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58aa2406-819e-4b03-9039-441af73335cb-catalog-content\") pod \"certified-operators-n6h4r\" (UID: \"58aa2406-819e-4b03-9039-441af73335cb\") " pod="openshift-marketplace/certified-operators-n6h4r" Mar 14 09:57:16 crc kubenswrapper[4886]: I0314 09:57:16.520441 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9hz9\" (UniqueName: \"kubernetes.io/projected/58aa2406-819e-4b03-9039-441af73335cb-kube-api-access-m9hz9\") pod \"certified-operators-n6h4r\" (UID: \"58aa2406-819e-4b03-9039-441af73335cb\") " pod="openshift-marketplace/certified-operators-n6h4r" Mar 14 09:57:16 crc kubenswrapper[4886]: I0314 09:57:16.667408 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n6h4r" Mar 14 09:57:17 crc kubenswrapper[4886]: I0314 09:57:17.215238 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n6h4r"] Mar 14 09:57:17 crc kubenswrapper[4886]: I0314 09:57:17.704175 4886 generic.go:334] "Generic (PLEG): container finished" podID="58aa2406-819e-4b03-9039-441af73335cb" containerID="2fab19d3a9a1279a1b6b5af5a4c3766f0564c30ad0386891181135046116e33b" exitCode=0 Mar 14 09:57:17 crc kubenswrapper[4886]: I0314 09:57:17.704263 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6h4r" event={"ID":"58aa2406-819e-4b03-9039-441af73335cb","Type":"ContainerDied","Data":"2fab19d3a9a1279a1b6b5af5a4c3766f0564c30ad0386891181135046116e33b"} Mar 14 09:57:17 crc kubenswrapper[4886]: I0314 09:57:17.704542 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6h4r" event={"ID":"58aa2406-819e-4b03-9039-441af73335cb","Type":"ContainerStarted","Data":"71090b920e61e200fb518bd9e5015645e5dcf2ac9f528677cd531cb989aaf35b"} Mar 14 09:57:17 crc kubenswrapper[4886]: I0314 09:57:17.710652 4886 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 14 09:57:18 crc kubenswrapper[4886]: I0314 09:57:18.719686 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6h4r" event={"ID":"58aa2406-819e-4b03-9039-441af73335cb","Type":"ContainerStarted","Data":"565d76a3857f95f276694a53f9ba86b7dce422b524c002273848f67c0079e649"} Mar 14 09:57:19 crc kubenswrapper[4886]: I0314 09:57:19.730133 4886 generic.go:334] "Generic (PLEG): container finished" podID="58aa2406-819e-4b03-9039-441af73335cb" containerID="565d76a3857f95f276694a53f9ba86b7dce422b524c002273848f67c0079e649" exitCode=0 Mar 14 09:57:19 crc kubenswrapper[4886]: I0314 09:57:19.730209 4886 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-n6h4r" event={"ID":"58aa2406-819e-4b03-9039-441af73335cb","Type":"ContainerDied","Data":"565d76a3857f95f276694a53f9ba86b7dce422b524c002273848f67c0079e649"} Mar 14 09:57:20 crc kubenswrapper[4886]: I0314 09:57:20.757823 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6h4r" event={"ID":"58aa2406-819e-4b03-9039-441af73335cb","Type":"ContainerStarted","Data":"03a4c9a68b269cb3cc64d9d093ed1edc1e9e3babc4cee8070e34d53b414d4c57"} Mar 14 09:57:20 crc kubenswrapper[4886]: I0314 09:57:20.795010 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n6h4r" podStartSLOduration=2.324406937 podStartE2EDuration="4.794973207s" podCreationTimestamp="2026-03-14 09:57:16 +0000 UTC" firstStartedPulling="2026-03-14 09:57:17.710054113 +0000 UTC m=+5372.958505800" lastFinishedPulling="2026-03-14 09:57:20.180620433 +0000 UTC m=+5375.429072070" observedRunningTime="2026-03-14 09:57:20.788032931 +0000 UTC m=+5376.036484608" watchObservedRunningTime="2026-03-14 09:57:20.794973207 +0000 UTC m=+5376.043424874" Mar 14 09:57:26 crc kubenswrapper[4886]: I0314 09:57:26.668347 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-n6h4r" Mar 14 09:57:26 crc kubenswrapper[4886]: I0314 09:57:26.668952 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n6h4r" Mar 14 09:57:26 crc kubenswrapper[4886]: I0314 09:57:26.713252 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-n6h4r" Mar 14 09:57:26 crc kubenswrapper[4886]: I0314 09:57:26.867802 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n6h4r" Mar 14 09:57:26 crc kubenswrapper[4886]: I0314 
09:57:26.961999 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n6h4r"] Mar 14 09:57:28 crc kubenswrapper[4886]: I0314 09:57:28.848704 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-n6h4r" podUID="58aa2406-819e-4b03-9039-441af73335cb" containerName="registry-server" containerID="cri-o://03a4c9a68b269cb3cc64d9d093ed1edc1e9e3babc4cee8070e34d53b414d4c57" gracePeriod=2 Mar 14 09:57:29 crc kubenswrapper[4886]: I0314 09:57:29.388396 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n6h4r" Mar 14 09:57:29 crc kubenswrapper[4886]: I0314 09:57:29.466925 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9hz9\" (UniqueName: \"kubernetes.io/projected/58aa2406-819e-4b03-9039-441af73335cb-kube-api-access-m9hz9\") pod \"58aa2406-819e-4b03-9039-441af73335cb\" (UID: \"58aa2406-819e-4b03-9039-441af73335cb\") " Mar 14 09:57:29 crc kubenswrapper[4886]: I0314 09:57:29.467108 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58aa2406-819e-4b03-9039-441af73335cb-utilities\") pod \"58aa2406-819e-4b03-9039-441af73335cb\" (UID: \"58aa2406-819e-4b03-9039-441af73335cb\") " Mar 14 09:57:29 crc kubenswrapper[4886]: I0314 09:57:29.467273 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58aa2406-819e-4b03-9039-441af73335cb-catalog-content\") pod \"58aa2406-819e-4b03-9039-441af73335cb\" (UID: \"58aa2406-819e-4b03-9039-441af73335cb\") " Mar 14 09:57:29 crc kubenswrapper[4886]: I0314 09:57:29.467903 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58aa2406-819e-4b03-9039-441af73335cb-utilities" (OuterVolumeSpecName: 
"utilities") pod "58aa2406-819e-4b03-9039-441af73335cb" (UID: "58aa2406-819e-4b03-9039-441af73335cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:57:29 crc kubenswrapper[4886]: I0314 09:57:29.485671 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58aa2406-819e-4b03-9039-441af73335cb-kube-api-access-m9hz9" (OuterVolumeSpecName: "kube-api-access-m9hz9") pod "58aa2406-819e-4b03-9039-441af73335cb" (UID: "58aa2406-819e-4b03-9039-441af73335cb"). InnerVolumeSpecName "kube-api-access-m9hz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:57:29 crc kubenswrapper[4886]: I0314 09:57:29.525189 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58aa2406-819e-4b03-9039-441af73335cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58aa2406-819e-4b03-9039-441af73335cb" (UID: "58aa2406-819e-4b03-9039-441af73335cb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:57:29 crc kubenswrapper[4886]: I0314 09:57:29.569650 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9hz9\" (UniqueName: \"kubernetes.io/projected/58aa2406-819e-4b03-9039-441af73335cb-kube-api-access-m9hz9\") on node \"crc\" DevicePath \"\"" Mar 14 09:57:29 crc kubenswrapper[4886]: I0314 09:57:29.569683 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58aa2406-819e-4b03-9039-441af73335cb-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:57:29 crc kubenswrapper[4886]: I0314 09:57:29.569695 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58aa2406-819e-4b03-9039-441af73335cb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:57:29 crc kubenswrapper[4886]: I0314 09:57:29.860560 4886 generic.go:334] "Generic (PLEG): container finished" podID="58aa2406-819e-4b03-9039-441af73335cb" containerID="03a4c9a68b269cb3cc64d9d093ed1edc1e9e3babc4cee8070e34d53b414d4c57" exitCode=0 Mar 14 09:57:29 crc kubenswrapper[4886]: I0314 09:57:29.860715 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6h4r" event={"ID":"58aa2406-819e-4b03-9039-441af73335cb","Type":"ContainerDied","Data":"03a4c9a68b269cb3cc64d9d093ed1edc1e9e3babc4cee8070e34d53b414d4c57"} Mar 14 09:57:29 crc kubenswrapper[4886]: I0314 09:57:29.860821 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n6h4r" Mar 14 09:57:29 crc kubenswrapper[4886]: I0314 09:57:29.861112 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n6h4r" event={"ID":"58aa2406-819e-4b03-9039-441af73335cb","Type":"ContainerDied","Data":"71090b920e61e200fb518bd9e5015645e5dcf2ac9f528677cd531cb989aaf35b"} Mar 14 09:57:29 crc kubenswrapper[4886]: I0314 09:57:29.861259 4886 scope.go:117] "RemoveContainer" containerID="03a4c9a68b269cb3cc64d9d093ed1edc1e9e3babc4cee8070e34d53b414d4c57" Mar 14 09:57:29 crc kubenswrapper[4886]: I0314 09:57:29.897731 4886 scope.go:117] "RemoveContainer" containerID="565d76a3857f95f276694a53f9ba86b7dce422b524c002273848f67c0079e649" Mar 14 09:57:29 crc kubenswrapper[4886]: I0314 09:57:29.898987 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n6h4r"] Mar 14 09:57:29 crc kubenswrapper[4886]: I0314 09:57:29.909902 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-n6h4r"] Mar 14 09:57:29 crc kubenswrapper[4886]: I0314 09:57:29.927676 4886 scope.go:117] "RemoveContainer" containerID="2fab19d3a9a1279a1b6b5af5a4c3766f0564c30ad0386891181135046116e33b" Mar 14 09:57:29 crc kubenswrapper[4886]: I0314 09:57:29.974745 4886 scope.go:117] "RemoveContainer" containerID="03a4c9a68b269cb3cc64d9d093ed1edc1e9e3babc4cee8070e34d53b414d4c57" Mar 14 09:57:29 crc kubenswrapper[4886]: E0314 09:57:29.975371 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03a4c9a68b269cb3cc64d9d093ed1edc1e9e3babc4cee8070e34d53b414d4c57\": container with ID starting with 03a4c9a68b269cb3cc64d9d093ed1edc1e9e3babc4cee8070e34d53b414d4c57 not found: ID does not exist" containerID="03a4c9a68b269cb3cc64d9d093ed1edc1e9e3babc4cee8070e34d53b414d4c57" Mar 14 09:57:29 crc kubenswrapper[4886]: I0314 09:57:29.975420 4886 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03a4c9a68b269cb3cc64d9d093ed1edc1e9e3babc4cee8070e34d53b414d4c57"} err="failed to get container status \"03a4c9a68b269cb3cc64d9d093ed1edc1e9e3babc4cee8070e34d53b414d4c57\": rpc error: code = NotFound desc = could not find container \"03a4c9a68b269cb3cc64d9d093ed1edc1e9e3babc4cee8070e34d53b414d4c57\": container with ID starting with 03a4c9a68b269cb3cc64d9d093ed1edc1e9e3babc4cee8070e34d53b414d4c57 not found: ID does not exist" Mar 14 09:57:29 crc kubenswrapper[4886]: I0314 09:57:29.975448 4886 scope.go:117] "RemoveContainer" containerID="565d76a3857f95f276694a53f9ba86b7dce422b524c002273848f67c0079e649" Mar 14 09:57:29 crc kubenswrapper[4886]: E0314 09:57:29.975796 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"565d76a3857f95f276694a53f9ba86b7dce422b524c002273848f67c0079e649\": container with ID starting with 565d76a3857f95f276694a53f9ba86b7dce422b524c002273848f67c0079e649 not found: ID does not exist" containerID="565d76a3857f95f276694a53f9ba86b7dce422b524c002273848f67c0079e649" Mar 14 09:57:29 crc kubenswrapper[4886]: I0314 09:57:29.975832 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"565d76a3857f95f276694a53f9ba86b7dce422b524c002273848f67c0079e649"} err="failed to get container status \"565d76a3857f95f276694a53f9ba86b7dce422b524c002273848f67c0079e649\": rpc error: code = NotFound desc = could not find container \"565d76a3857f95f276694a53f9ba86b7dce422b524c002273848f67c0079e649\": container with ID starting with 565d76a3857f95f276694a53f9ba86b7dce422b524c002273848f67c0079e649 not found: ID does not exist" Mar 14 09:57:29 crc kubenswrapper[4886]: I0314 09:57:29.975876 4886 scope.go:117] "RemoveContainer" containerID="2fab19d3a9a1279a1b6b5af5a4c3766f0564c30ad0386891181135046116e33b" Mar 14 09:57:29 crc kubenswrapper[4886]: E0314 
09:57:29.976534 4886 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fab19d3a9a1279a1b6b5af5a4c3766f0564c30ad0386891181135046116e33b\": container with ID starting with 2fab19d3a9a1279a1b6b5af5a4c3766f0564c30ad0386891181135046116e33b not found: ID does not exist" containerID="2fab19d3a9a1279a1b6b5af5a4c3766f0564c30ad0386891181135046116e33b" Mar 14 09:57:29 crc kubenswrapper[4886]: I0314 09:57:29.976586 4886 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fab19d3a9a1279a1b6b5af5a4c3766f0564c30ad0386891181135046116e33b"} err="failed to get container status \"2fab19d3a9a1279a1b6b5af5a4c3766f0564c30ad0386891181135046116e33b\": rpc error: code = NotFound desc = could not find container \"2fab19d3a9a1279a1b6b5af5a4c3766f0564c30ad0386891181135046116e33b\": container with ID starting with 2fab19d3a9a1279a1b6b5af5a4c3766f0564c30ad0386891181135046116e33b not found: ID does not exist" Mar 14 09:57:31 crc kubenswrapper[4886]: I0314 09:57:31.433976 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58aa2406-819e-4b03-9039-441af73335cb" path="/var/lib/kubelet/pods/58aa2406-819e-4b03-9039-441af73335cb/volumes" Mar 14 09:58:00 crc kubenswrapper[4886]: I0314 09:58:00.165212 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558038-ld7z6"] Mar 14 09:58:00 crc kubenswrapper[4886]: E0314 09:58:00.166374 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58aa2406-819e-4b03-9039-441af73335cb" containerName="extract-utilities" Mar 14 09:58:00 crc kubenswrapper[4886]: I0314 09:58:00.166390 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="58aa2406-819e-4b03-9039-441af73335cb" containerName="extract-utilities" Mar 14 09:58:00 crc kubenswrapper[4886]: E0314 09:58:00.166425 4886 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="58aa2406-819e-4b03-9039-441af73335cb" containerName="extract-content" Mar 14 09:58:00 crc kubenswrapper[4886]: I0314 09:58:00.166432 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="58aa2406-819e-4b03-9039-441af73335cb" containerName="extract-content" Mar 14 09:58:00 crc kubenswrapper[4886]: E0314 09:58:00.166466 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58aa2406-819e-4b03-9039-441af73335cb" containerName="registry-server" Mar 14 09:58:00 crc kubenswrapper[4886]: I0314 09:58:00.166473 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="58aa2406-819e-4b03-9039-441af73335cb" containerName="registry-server" Mar 14 09:58:00 crc kubenswrapper[4886]: I0314 09:58:00.166680 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="58aa2406-819e-4b03-9039-441af73335cb" containerName="registry-server" Mar 14 09:58:00 crc kubenswrapper[4886]: I0314 09:58:00.167452 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558038-ld7z6" Mar 14 09:58:00 crc kubenswrapper[4886]: I0314 09:58:00.170542 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 09:58:00 crc kubenswrapper[4886]: I0314 09:58:00.170761 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 09:58:00 crc kubenswrapper[4886]: I0314 09:58:00.171679 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 09:58:00 crc kubenswrapper[4886]: I0314 09:58:00.186864 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558038-ld7z6"] Mar 14 09:58:00 crc kubenswrapper[4886]: I0314 09:58:00.279387 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz9w9\" (UniqueName: 
\"kubernetes.io/projected/27f37eaf-7350-41a1-9e08-3dd75728ec79-kube-api-access-xz9w9\") pod \"auto-csr-approver-29558038-ld7z6\" (UID: \"27f37eaf-7350-41a1-9e08-3dd75728ec79\") " pod="openshift-infra/auto-csr-approver-29558038-ld7z6" Mar 14 09:58:00 crc kubenswrapper[4886]: I0314 09:58:00.382516 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz9w9\" (UniqueName: \"kubernetes.io/projected/27f37eaf-7350-41a1-9e08-3dd75728ec79-kube-api-access-xz9w9\") pod \"auto-csr-approver-29558038-ld7z6\" (UID: \"27f37eaf-7350-41a1-9e08-3dd75728ec79\") " pod="openshift-infra/auto-csr-approver-29558038-ld7z6" Mar 14 09:58:00 crc kubenswrapper[4886]: I0314 09:58:00.419386 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz9w9\" (UniqueName: \"kubernetes.io/projected/27f37eaf-7350-41a1-9e08-3dd75728ec79-kube-api-access-xz9w9\") pod \"auto-csr-approver-29558038-ld7z6\" (UID: \"27f37eaf-7350-41a1-9e08-3dd75728ec79\") " pod="openshift-infra/auto-csr-approver-29558038-ld7z6" Mar 14 09:58:00 crc kubenswrapper[4886]: I0314 09:58:00.501648 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558038-ld7z6" Mar 14 09:58:00 crc kubenswrapper[4886]: I0314 09:58:00.981873 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558038-ld7z6"] Mar 14 09:58:01 crc kubenswrapper[4886]: I0314 09:58:01.222310 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558038-ld7z6" event={"ID":"27f37eaf-7350-41a1-9e08-3dd75728ec79","Type":"ContainerStarted","Data":"2ad18e7625e07ed24a9481665155d65b54906cd576e6d065be1d7a1ebc4bb525"} Mar 14 09:58:02 crc kubenswrapper[4886]: I0314 09:58:02.232595 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558038-ld7z6" event={"ID":"27f37eaf-7350-41a1-9e08-3dd75728ec79","Type":"ContainerStarted","Data":"5d37e14604d5d4701e00482d18c5e73e7a2325722943bb1285d5ca61ad26b79e"} Mar 14 09:58:02 crc kubenswrapper[4886]: I0314 09:58:02.258421 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29558038-ld7z6" podStartSLOduration=1.527527689 podStartE2EDuration="2.258403918s" podCreationTimestamp="2026-03-14 09:58:00 +0000 UTC" firstStartedPulling="2026-03-14 09:58:00.989802501 +0000 UTC m=+5416.238254138" lastFinishedPulling="2026-03-14 09:58:01.72067873 +0000 UTC m=+5416.969130367" observedRunningTime="2026-03-14 09:58:02.24540846 +0000 UTC m=+5417.493860107" watchObservedRunningTime="2026-03-14 09:58:02.258403918 +0000 UTC m=+5417.506855555" Mar 14 09:58:03 crc kubenswrapper[4886]: I0314 09:58:03.248220 4886 generic.go:334] "Generic (PLEG): container finished" podID="27f37eaf-7350-41a1-9e08-3dd75728ec79" containerID="5d37e14604d5d4701e00482d18c5e73e7a2325722943bb1285d5ca61ad26b79e" exitCode=0 Mar 14 09:58:03 crc kubenswrapper[4886]: I0314 09:58:03.248343 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558038-ld7z6" 
event={"ID":"27f37eaf-7350-41a1-9e08-3dd75728ec79","Type":"ContainerDied","Data":"5d37e14604d5d4701e00482d18c5e73e7a2325722943bb1285d5ca61ad26b79e"} Mar 14 09:58:04 crc kubenswrapper[4886]: I0314 09:58:04.799185 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558038-ld7z6" Mar 14 09:58:04 crc kubenswrapper[4886]: I0314 09:58:04.931759 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz9w9\" (UniqueName: \"kubernetes.io/projected/27f37eaf-7350-41a1-9e08-3dd75728ec79-kube-api-access-xz9w9\") pod \"27f37eaf-7350-41a1-9e08-3dd75728ec79\" (UID: \"27f37eaf-7350-41a1-9e08-3dd75728ec79\") " Mar 14 09:58:04 crc kubenswrapper[4886]: I0314 09:58:04.943511 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27f37eaf-7350-41a1-9e08-3dd75728ec79-kube-api-access-xz9w9" (OuterVolumeSpecName: "kube-api-access-xz9w9") pod "27f37eaf-7350-41a1-9e08-3dd75728ec79" (UID: "27f37eaf-7350-41a1-9e08-3dd75728ec79"). InnerVolumeSpecName "kube-api-access-xz9w9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:58:05 crc kubenswrapper[4886]: I0314 09:58:05.033894 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz9w9\" (UniqueName: \"kubernetes.io/projected/27f37eaf-7350-41a1-9e08-3dd75728ec79-kube-api-access-xz9w9\") on node \"crc\" DevicePath \"\"" Mar 14 09:58:05 crc kubenswrapper[4886]: I0314 09:58:05.274653 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558038-ld7z6" event={"ID":"27f37eaf-7350-41a1-9e08-3dd75728ec79","Type":"ContainerDied","Data":"2ad18e7625e07ed24a9481665155d65b54906cd576e6d065be1d7a1ebc4bb525"} Mar 14 09:58:05 crc kubenswrapper[4886]: I0314 09:58:05.274717 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ad18e7625e07ed24a9481665155d65b54906cd576e6d065be1d7a1ebc4bb525" Mar 14 09:58:05 crc kubenswrapper[4886]: I0314 09:58:05.274719 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558038-ld7z6" Mar 14 09:58:05 crc kubenswrapper[4886]: I0314 09:58:05.360516 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558032-mjzmw"] Mar 14 09:58:05 crc kubenswrapper[4886]: I0314 09:58:05.372488 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558032-mjzmw"] Mar 14 09:58:05 crc kubenswrapper[4886]: I0314 09:58:05.481962 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a71a4e8-e737-4e94-aef8-477b1570338a" path="/var/lib/kubelet/pods/6a71a4e8-e737-4e94-aef8-477b1570338a/volumes" Mar 14 09:58:15 crc kubenswrapper[4886]: I0314 09:58:15.387452 4886 generic.go:334] "Generic (PLEG): container finished" podID="6265b1bc-308a-49bf-9077-8d14626dc31a" containerID="dd2ceb5a1285b2c3176d96a3ab1ab3a68369c87a8d3b736449d70140e379e0d2" exitCode=0 Mar 14 09:58:15 crc kubenswrapper[4886]: I0314 09:58:15.387593 4886 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tffgj/must-gather-znzwx" event={"ID":"6265b1bc-308a-49bf-9077-8d14626dc31a","Type":"ContainerDied","Data":"dd2ceb5a1285b2c3176d96a3ab1ab3a68369c87a8d3b736449d70140e379e0d2"} Mar 14 09:58:15 crc kubenswrapper[4886]: I0314 09:58:15.389150 4886 scope.go:117] "RemoveContainer" containerID="dd2ceb5a1285b2c3176d96a3ab1ab3a68369c87a8d3b736449d70140e379e0d2" Mar 14 09:58:16 crc kubenswrapper[4886]: I0314 09:58:16.361044 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-tffgj_must-gather-znzwx_6265b1bc-308a-49bf-9077-8d14626dc31a/gather/0.log" Mar 14 09:58:24 crc kubenswrapper[4886]: I0314 09:58:24.853165 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-tffgj/must-gather-znzwx"] Mar 14 09:58:24 crc kubenswrapper[4886]: I0314 09:58:24.854056 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-tffgj/must-gather-znzwx" podUID="6265b1bc-308a-49bf-9077-8d14626dc31a" containerName="copy" containerID="cri-o://5d08614f38f90ffef832cb2af337dc92add88e064291d5c574d3699fa4351c47" gracePeriod=2 Mar 14 09:58:24 crc kubenswrapper[4886]: I0314 09:58:24.863882 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-tffgj/must-gather-znzwx"] Mar 14 09:58:25 crc kubenswrapper[4886]: I0314 09:58:25.505893 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-tffgj_must-gather-znzwx_6265b1bc-308a-49bf-9077-8d14626dc31a/copy/0.log" Mar 14 09:58:25 crc kubenswrapper[4886]: I0314 09:58:25.507386 4886 generic.go:334] "Generic (PLEG): container finished" podID="6265b1bc-308a-49bf-9077-8d14626dc31a" containerID="5d08614f38f90ffef832cb2af337dc92add88e064291d5c574d3699fa4351c47" exitCode=143 Mar 14 09:58:26 crc kubenswrapper[4886]: I0314 09:58:26.212410 4886 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-tffgj_must-gather-znzwx_6265b1bc-308a-49bf-9077-8d14626dc31a/copy/0.log" Mar 14 09:58:26 crc kubenswrapper[4886]: I0314 09:58:26.213595 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tffgj/must-gather-znzwx" Mar 14 09:58:26 crc kubenswrapper[4886]: I0314 09:58:26.319211 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6265b1bc-308a-49bf-9077-8d14626dc31a-must-gather-output\") pod \"6265b1bc-308a-49bf-9077-8d14626dc31a\" (UID: \"6265b1bc-308a-49bf-9077-8d14626dc31a\") " Mar 14 09:58:26 crc kubenswrapper[4886]: I0314 09:58:26.319352 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5c8h\" (UniqueName: \"kubernetes.io/projected/6265b1bc-308a-49bf-9077-8d14626dc31a-kube-api-access-d5c8h\") pod \"6265b1bc-308a-49bf-9077-8d14626dc31a\" (UID: \"6265b1bc-308a-49bf-9077-8d14626dc31a\") " Mar 14 09:58:26 crc kubenswrapper[4886]: I0314 09:58:26.326282 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6265b1bc-308a-49bf-9077-8d14626dc31a-kube-api-access-d5c8h" (OuterVolumeSpecName: "kube-api-access-d5c8h") pod "6265b1bc-308a-49bf-9077-8d14626dc31a" (UID: "6265b1bc-308a-49bf-9077-8d14626dc31a"). InnerVolumeSpecName "kube-api-access-d5c8h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:58:26 crc kubenswrapper[4886]: I0314 09:58:26.421776 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5c8h\" (UniqueName: \"kubernetes.io/projected/6265b1bc-308a-49bf-9077-8d14626dc31a-kube-api-access-d5c8h\") on node \"crc\" DevicePath \"\"" Mar 14 09:58:26 crc kubenswrapper[4886]: I0314 09:58:26.518737 4886 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-tffgj_must-gather-znzwx_6265b1bc-308a-49bf-9077-8d14626dc31a/copy/0.log" Mar 14 09:58:26 crc kubenswrapper[4886]: I0314 09:58:26.519317 4886 scope.go:117] "RemoveContainer" containerID="5d08614f38f90ffef832cb2af337dc92add88e064291d5c574d3699fa4351c47" Mar 14 09:58:26 crc kubenswrapper[4886]: I0314 09:58:26.519481 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tffgj/must-gather-znzwx" Mar 14 09:58:26 crc kubenswrapper[4886]: I0314 09:58:26.530273 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6265b1bc-308a-49bf-9077-8d14626dc31a-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "6265b1bc-308a-49bf-9077-8d14626dc31a" (UID: "6265b1bc-308a-49bf-9077-8d14626dc31a"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:58:26 crc kubenswrapper[4886]: I0314 09:58:26.539059 4886 scope.go:117] "RemoveContainer" containerID="dd2ceb5a1285b2c3176d96a3ab1ab3a68369c87a8d3b736449d70140e379e0d2" Mar 14 09:58:26 crc kubenswrapper[4886]: I0314 09:58:26.627487 4886 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6265b1bc-308a-49bf-9077-8d14626dc31a-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 14 09:58:27 crc kubenswrapper[4886]: I0314 09:58:27.431302 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6265b1bc-308a-49bf-9077-8d14626dc31a" path="/var/lib/kubelet/pods/6265b1bc-308a-49bf-9077-8d14626dc31a/volumes" Mar 14 09:58:41 crc kubenswrapper[4886]: I0314 09:58:41.307703 4886 scope.go:117] "RemoveContainer" containerID="783fd0a1eeac25cf22018a8b750437d3a9371e7233b50896f12ea81e1baf1acd" Mar 14 09:58:41 crc kubenswrapper[4886]: I0314 09:58:41.342545 4886 scope.go:117] "RemoveContainer" containerID="1eb06df1507e9613b384ddf0af995416914ce4ba87044c1bb15066f8eebb46ce" Mar 14 09:58:56 crc kubenswrapper[4886]: I0314 09:58:56.066634 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:58:56 crc kubenswrapper[4886]: I0314 09:58:56.067303 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:59:26 crc kubenswrapper[4886]: I0314 09:59:26.065992 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:59:26 crc kubenswrapper[4886]: I0314 09:59:26.066749 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:59:39 crc kubenswrapper[4886]: I0314 09:59:39.795620 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-btzj6"] Mar 14 09:59:39 crc kubenswrapper[4886]: E0314 09:59:39.796854 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27f37eaf-7350-41a1-9e08-3dd75728ec79" containerName="oc" Mar 14 09:59:39 crc kubenswrapper[4886]: I0314 09:59:39.796871 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f37eaf-7350-41a1-9e08-3dd75728ec79" containerName="oc" Mar 14 09:59:39 crc kubenswrapper[4886]: E0314 09:59:39.796888 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6265b1bc-308a-49bf-9077-8d14626dc31a" containerName="gather" Mar 14 09:59:39 crc kubenswrapper[4886]: I0314 09:59:39.796895 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="6265b1bc-308a-49bf-9077-8d14626dc31a" containerName="gather" Mar 14 09:59:39 crc kubenswrapper[4886]: E0314 09:59:39.796934 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6265b1bc-308a-49bf-9077-8d14626dc31a" containerName="copy" Mar 14 09:59:39 crc kubenswrapper[4886]: I0314 09:59:39.796940 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="6265b1bc-308a-49bf-9077-8d14626dc31a" containerName="copy" Mar 14 09:59:39 crc kubenswrapper[4886]: I0314 09:59:39.797111 4886 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="27f37eaf-7350-41a1-9e08-3dd75728ec79" containerName="oc" Mar 14 09:59:39 crc kubenswrapper[4886]: I0314 09:59:39.797151 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="6265b1bc-308a-49bf-9077-8d14626dc31a" containerName="gather" Mar 14 09:59:39 crc kubenswrapper[4886]: I0314 09:59:39.797170 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="6265b1bc-308a-49bf-9077-8d14626dc31a" containerName="copy" Mar 14 09:59:39 crc kubenswrapper[4886]: I0314 09:59:39.798632 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-btzj6" Mar 14 09:59:39 crc kubenswrapper[4886]: I0314 09:59:39.811492 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-btzj6"] Mar 14 09:59:39 crc kubenswrapper[4886]: I0314 09:59:39.920544 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48f1bfda-9906-4edd-9200-7e5eec0cdffd-catalog-content\") pod \"redhat-marketplace-btzj6\" (UID: \"48f1bfda-9906-4edd-9200-7e5eec0cdffd\") " pod="openshift-marketplace/redhat-marketplace-btzj6" Mar 14 09:59:39 crc kubenswrapper[4886]: I0314 09:59:39.920652 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48f1bfda-9906-4edd-9200-7e5eec0cdffd-utilities\") pod \"redhat-marketplace-btzj6\" (UID: \"48f1bfda-9906-4edd-9200-7e5eec0cdffd\") " pod="openshift-marketplace/redhat-marketplace-btzj6" Mar 14 09:59:39 crc kubenswrapper[4886]: I0314 09:59:39.920826 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhxss\" (UniqueName: \"kubernetes.io/projected/48f1bfda-9906-4edd-9200-7e5eec0cdffd-kube-api-access-zhxss\") pod \"redhat-marketplace-btzj6\" (UID: 
\"48f1bfda-9906-4edd-9200-7e5eec0cdffd\") " pod="openshift-marketplace/redhat-marketplace-btzj6" Mar 14 09:59:40 crc kubenswrapper[4886]: I0314 09:59:40.023594 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48f1bfda-9906-4edd-9200-7e5eec0cdffd-catalog-content\") pod \"redhat-marketplace-btzj6\" (UID: \"48f1bfda-9906-4edd-9200-7e5eec0cdffd\") " pod="openshift-marketplace/redhat-marketplace-btzj6" Mar 14 09:59:40 crc kubenswrapper[4886]: I0314 09:59:40.023703 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48f1bfda-9906-4edd-9200-7e5eec0cdffd-utilities\") pod \"redhat-marketplace-btzj6\" (UID: \"48f1bfda-9906-4edd-9200-7e5eec0cdffd\") " pod="openshift-marketplace/redhat-marketplace-btzj6" Mar 14 09:59:40 crc kubenswrapper[4886]: I0314 09:59:40.023791 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhxss\" (UniqueName: \"kubernetes.io/projected/48f1bfda-9906-4edd-9200-7e5eec0cdffd-kube-api-access-zhxss\") pod \"redhat-marketplace-btzj6\" (UID: \"48f1bfda-9906-4edd-9200-7e5eec0cdffd\") " pod="openshift-marketplace/redhat-marketplace-btzj6" Mar 14 09:59:40 crc kubenswrapper[4886]: I0314 09:59:40.024246 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48f1bfda-9906-4edd-9200-7e5eec0cdffd-catalog-content\") pod \"redhat-marketplace-btzj6\" (UID: \"48f1bfda-9906-4edd-9200-7e5eec0cdffd\") " pod="openshift-marketplace/redhat-marketplace-btzj6" Mar 14 09:59:40 crc kubenswrapper[4886]: I0314 09:59:40.024465 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48f1bfda-9906-4edd-9200-7e5eec0cdffd-utilities\") pod \"redhat-marketplace-btzj6\" (UID: \"48f1bfda-9906-4edd-9200-7e5eec0cdffd\") " 
pod="openshift-marketplace/redhat-marketplace-btzj6" Mar 14 09:59:40 crc kubenswrapper[4886]: I0314 09:59:40.064164 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhxss\" (UniqueName: \"kubernetes.io/projected/48f1bfda-9906-4edd-9200-7e5eec0cdffd-kube-api-access-zhxss\") pod \"redhat-marketplace-btzj6\" (UID: \"48f1bfda-9906-4edd-9200-7e5eec0cdffd\") " pod="openshift-marketplace/redhat-marketplace-btzj6" Mar 14 09:59:40 crc kubenswrapper[4886]: I0314 09:59:40.155850 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-btzj6" Mar 14 09:59:40 crc kubenswrapper[4886]: I0314 09:59:40.652056 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-btzj6"] Mar 14 09:59:41 crc kubenswrapper[4886]: I0314 09:59:41.325463 4886 generic.go:334] "Generic (PLEG): container finished" podID="48f1bfda-9906-4edd-9200-7e5eec0cdffd" containerID="c5b283f42028751119a049571db921a6bb27d1dab92c3b5f9687c95487e48a98" exitCode=0 Mar 14 09:59:41 crc kubenswrapper[4886]: I0314 09:59:41.325542 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btzj6" event={"ID":"48f1bfda-9906-4edd-9200-7e5eec0cdffd","Type":"ContainerDied","Data":"c5b283f42028751119a049571db921a6bb27d1dab92c3b5f9687c95487e48a98"} Mar 14 09:59:41 crc kubenswrapper[4886]: I0314 09:59:41.325620 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btzj6" event={"ID":"48f1bfda-9906-4edd-9200-7e5eec0cdffd","Type":"ContainerStarted","Data":"70204d88efe40f29a98bc479ffe3fa6f6f671c6c3b4a2f2df5924e2464b15a85"} Mar 14 09:59:41 crc kubenswrapper[4886]: I0314 09:59:41.499966 4886 scope.go:117] "RemoveContainer" containerID="2dd1691d68a262e871ca81073c2279c75e3b164cb7a5e1a3d2131dc093a63a58" Mar 14 09:59:42 crc kubenswrapper[4886]: I0314 09:59:42.339816 4886 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-btzj6" event={"ID":"48f1bfda-9906-4edd-9200-7e5eec0cdffd","Type":"ContainerStarted","Data":"dd5744c705bb374b450921c0446b489d0b6ee7253bbbd6ace829989f4a72b075"} Mar 14 09:59:43 crc kubenswrapper[4886]: I0314 09:59:43.358217 4886 generic.go:334] "Generic (PLEG): container finished" podID="48f1bfda-9906-4edd-9200-7e5eec0cdffd" containerID="dd5744c705bb374b450921c0446b489d0b6ee7253bbbd6ace829989f4a72b075" exitCode=0 Mar 14 09:59:43 crc kubenswrapper[4886]: I0314 09:59:43.358587 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btzj6" event={"ID":"48f1bfda-9906-4edd-9200-7e5eec0cdffd","Type":"ContainerDied","Data":"dd5744c705bb374b450921c0446b489d0b6ee7253bbbd6ace829989f4a72b075"} Mar 14 09:59:44 crc kubenswrapper[4886]: I0314 09:59:44.371344 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btzj6" event={"ID":"48f1bfda-9906-4edd-9200-7e5eec0cdffd","Type":"ContainerStarted","Data":"32016a726af94a6ad57e30b5d6029b2539c7cfc9e483c4a74850d0eab69765ef"} Mar 14 09:59:44 crc kubenswrapper[4886]: I0314 09:59:44.398341 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-btzj6" podStartSLOduration=2.9718144669999997 podStartE2EDuration="5.398317482s" podCreationTimestamp="2026-03-14 09:59:39 +0000 UTC" firstStartedPulling="2026-03-14 09:59:41.32876222 +0000 UTC m=+5516.577213867" lastFinishedPulling="2026-03-14 09:59:43.755265235 +0000 UTC m=+5519.003716882" observedRunningTime="2026-03-14 09:59:44.396980064 +0000 UTC m=+5519.645431711" watchObservedRunningTime="2026-03-14 09:59:44.398317482 +0000 UTC m=+5519.646769129" Mar 14 09:59:50 crc kubenswrapper[4886]: I0314 09:59:50.156018 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-btzj6" Mar 14 09:59:50 crc kubenswrapper[4886]: I0314 
09:59:50.157284 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-btzj6" Mar 14 09:59:50 crc kubenswrapper[4886]: I0314 09:59:50.218415 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-btzj6" Mar 14 09:59:50 crc kubenswrapper[4886]: I0314 09:59:50.512145 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-btzj6" Mar 14 09:59:50 crc kubenswrapper[4886]: I0314 09:59:50.569709 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-btzj6"] Mar 14 09:59:52 crc kubenswrapper[4886]: I0314 09:59:52.458674 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-btzj6" podUID="48f1bfda-9906-4edd-9200-7e5eec0cdffd" containerName="registry-server" containerID="cri-o://32016a726af94a6ad57e30b5d6029b2539c7cfc9e483c4a74850d0eab69765ef" gracePeriod=2 Mar 14 09:59:53 crc kubenswrapper[4886]: I0314 09:59:53.473340 4886 generic.go:334] "Generic (PLEG): container finished" podID="48f1bfda-9906-4edd-9200-7e5eec0cdffd" containerID="32016a726af94a6ad57e30b5d6029b2539c7cfc9e483c4a74850d0eab69765ef" exitCode=0 Mar 14 09:59:53 crc kubenswrapper[4886]: I0314 09:59:53.473516 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btzj6" event={"ID":"48f1bfda-9906-4edd-9200-7e5eec0cdffd","Type":"ContainerDied","Data":"32016a726af94a6ad57e30b5d6029b2539c7cfc9e483c4a74850d0eab69765ef"} Mar 14 09:59:53 crc kubenswrapper[4886]: I0314 09:59:53.473744 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btzj6" event={"ID":"48f1bfda-9906-4edd-9200-7e5eec0cdffd","Type":"ContainerDied","Data":"70204d88efe40f29a98bc479ffe3fa6f6f671c6c3b4a2f2df5924e2464b15a85"} Mar 14 09:59:53 crc 
kubenswrapper[4886]: I0314 09:59:53.473762 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70204d88efe40f29a98bc479ffe3fa6f6f671c6c3b4a2f2df5924e2464b15a85" Mar 14 09:59:53 crc kubenswrapper[4886]: I0314 09:59:53.520952 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-btzj6" Mar 14 09:59:53 crc kubenswrapper[4886]: I0314 09:59:53.671519 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhxss\" (UniqueName: \"kubernetes.io/projected/48f1bfda-9906-4edd-9200-7e5eec0cdffd-kube-api-access-zhxss\") pod \"48f1bfda-9906-4edd-9200-7e5eec0cdffd\" (UID: \"48f1bfda-9906-4edd-9200-7e5eec0cdffd\") " Mar 14 09:59:53 crc kubenswrapper[4886]: I0314 09:59:53.671822 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48f1bfda-9906-4edd-9200-7e5eec0cdffd-utilities\") pod \"48f1bfda-9906-4edd-9200-7e5eec0cdffd\" (UID: \"48f1bfda-9906-4edd-9200-7e5eec0cdffd\") " Mar 14 09:59:53 crc kubenswrapper[4886]: I0314 09:59:53.672009 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48f1bfda-9906-4edd-9200-7e5eec0cdffd-catalog-content\") pod \"48f1bfda-9906-4edd-9200-7e5eec0cdffd\" (UID: \"48f1bfda-9906-4edd-9200-7e5eec0cdffd\") " Mar 14 09:59:53 crc kubenswrapper[4886]: I0314 09:59:53.674288 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48f1bfda-9906-4edd-9200-7e5eec0cdffd-utilities" (OuterVolumeSpecName: "utilities") pod "48f1bfda-9906-4edd-9200-7e5eec0cdffd" (UID: "48f1bfda-9906-4edd-9200-7e5eec0cdffd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:59:53 crc kubenswrapper[4886]: I0314 09:59:53.678494 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48f1bfda-9906-4edd-9200-7e5eec0cdffd-kube-api-access-zhxss" (OuterVolumeSpecName: "kube-api-access-zhxss") pod "48f1bfda-9906-4edd-9200-7e5eec0cdffd" (UID: "48f1bfda-9906-4edd-9200-7e5eec0cdffd"). InnerVolumeSpecName "kube-api-access-zhxss". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 09:59:53 crc kubenswrapper[4886]: I0314 09:59:53.707981 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48f1bfda-9906-4edd-9200-7e5eec0cdffd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48f1bfda-9906-4edd-9200-7e5eec0cdffd" (UID: "48f1bfda-9906-4edd-9200-7e5eec0cdffd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 14 09:59:53 crc kubenswrapper[4886]: I0314 09:59:53.775076 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48f1bfda-9906-4edd-9200-7e5eec0cdffd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 14 09:59:53 crc kubenswrapper[4886]: I0314 09:59:53.775148 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhxss\" (UniqueName: \"kubernetes.io/projected/48f1bfda-9906-4edd-9200-7e5eec0cdffd-kube-api-access-zhxss\") on node \"crc\" DevicePath \"\"" Mar 14 09:59:53 crc kubenswrapper[4886]: I0314 09:59:53.775170 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48f1bfda-9906-4edd-9200-7e5eec0cdffd-utilities\") on node \"crc\" DevicePath \"\"" Mar 14 09:59:54 crc kubenswrapper[4886]: I0314 09:59:54.484378 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-btzj6" Mar 14 09:59:54 crc kubenswrapper[4886]: I0314 09:59:54.530549 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-btzj6"] Mar 14 09:59:54 crc kubenswrapper[4886]: I0314 09:59:54.541995 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-btzj6"] Mar 14 09:59:55 crc kubenswrapper[4886]: I0314 09:59:55.438874 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48f1bfda-9906-4edd-9200-7e5eec0cdffd" path="/var/lib/kubelet/pods/48f1bfda-9906-4edd-9200-7e5eec0cdffd/volumes" Mar 14 09:59:56 crc kubenswrapper[4886]: I0314 09:59:56.066030 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 09:59:56 crc kubenswrapper[4886]: I0314 09:59:56.066103 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 09:59:56 crc kubenswrapper[4886]: I0314 09:59:56.066190 4886 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" Mar 14 09:59:56 crc kubenswrapper[4886]: I0314 09:59:56.067011 4886 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a30b0ee975fdbbbf675d125dd6dbe0cfa4a66bc19ea06b9da11724ca219da3c6"} pod="openshift-machine-config-operator/machine-config-daemon-ddctv" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Mar 14 09:59:56 crc kubenswrapper[4886]: I0314 09:59:56.067072 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" containerID="cri-o://a30b0ee975fdbbbf675d125dd6dbe0cfa4a66bc19ea06b9da11724ca219da3c6" gracePeriod=600 Mar 14 09:59:56 crc kubenswrapper[4886]: I0314 09:59:56.507201 4886 generic.go:334] "Generic (PLEG): container finished" podID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerID="a30b0ee975fdbbbf675d125dd6dbe0cfa4a66bc19ea06b9da11724ca219da3c6" exitCode=0 Mar 14 09:59:56 crc kubenswrapper[4886]: I0314 09:59:56.507274 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerDied","Data":"a30b0ee975fdbbbf675d125dd6dbe0cfa4a66bc19ea06b9da11724ca219da3c6"} Mar 14 09:59:56 crc kubenswrapper[4886]: I0314 09:59:56.507604 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" event={"ID":"64517238-bfef-43e1-b543-1eea5b7f9c79","Type":"ContainerStarted","Data":"a74686f30bc0a6fa22736dc136184aaa3ac1969e522bc54eaedfbc0d93100ba3"} Mar 14 09:59:56 crc kubenswrapper[4886]: I0314 09:59:56.507624 4886 scope.go:117] "RemoveContainer" containerID="a2169cc0893022e3b5ca5582507d71df4098963bc0205c4b9f385767e1c7ef8f" Mar 14 10:00:00 crc kubenswrapper[4886]: I0314 10:00:00.155203 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558040-nk2nr"] Mar 14 10:00:00 crc kubenswrapper[4886]: E0314 10:00:00.157024 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f1bfda-9906-4edd-9200-7e5eec0cdffd" containerName="registry-server" Mar 14 10:00:00 crc 
kubenswrapper[4886]: I0314 10:00:00.157108 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f1bfda-9906-4edd-9200-7e5eec0cdffd" containerName="registry-server" Mar 14 10:00:00 crc kubenswrapper[4886]: E0314 10:00:00.157208 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f1bfda-9906-4edd-9200-7e5eec0cdffd" containerName="extract-utilities" Mar 14 10:00:00 crc kubenswrapper[4886]: I0314 10:00:00.157263 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f1bfda-9906-4edd-9200-7e5eec0cdffd" containerName="extract-utilities" Mar 14 10:00:00 crc kubenswrapper[4886]: E0314 10:00:00.157335 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f1bfda-9906-4edd-9200-7e5eec0cdffd" containerName="extract-content" Mar 14 10:00:00 crc kubenswrapper[4886]: I0314 10:00:00.157389 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f1bfda-9906-4edd-9200-7e5eec0cdffd" containerName="extract-content" Mar 14 10:00:00 crc kubenswrapper[4886]: I0314 10:00:00.157614 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="48f1bfda-9906-4edd-9200-7e5eec0cdffd" containerName="registry-server" Mar 14 10:00:00 crc kubenswrapper[4886]: I0314 10:00:00.158350 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-nk2nr" Mar 14 10:00:00 crc kubenswrapper[4886]: I0314 10:00:00.164318 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 14 10:00:00 crc kubenswrapper[4886]: I0314 10:00:00.164650 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 14 10:00:00 crc kubenswrapper[4886]: I0314 10:00:00.166205 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558040-n92xd"] Mar 14 10:00:00 crc kubenswrapper[4886]: I0314 10:00:00.168460 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558040-n92xd" Mar 14 10:00:00 crc kubenswrapper[4886]: I0314 10:00:00.173755 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 10:00:00 crc kubenswrapper[4886]: I0314 10:00:00.174038 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 10:00:00 crc kubenswrapper[4886]: I0314 10:00:00.173838 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 10:00:00 crc kubenswrapper[4886]: I0314 10:00:00.176942 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558040-nk2nr"] Mar 14 10:00:00 crc kubenswrapper[4886]: I0314 10:00:00.187915 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558040-n92xd"] Mar 14 10:00:00 crc kubenswrapper[4886]: I0314 10:00:00.321640 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/c2db3301-641d-4165-895b-4a30f71b69d1-secret-volume\") pod \"collect-profiles-29558040-nk2nr\" (UID: \"c2db3301-641d-4165-895b-4a30f71b69d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-nk2nr" Mar 14 10:00:00 crc kubenswrapper[4886]: I0314 10:00:00.321697 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2db3301-641d-4165-895b-4a30f71b69d1-config-volume\") pod \"collect-profiles-29558040-nk2nr\" (UID: \"c2db3301-641d-4165-895b-4a30f71b69d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-nk2nr" Mar 14 10:00:00 crc kubenswrapper[4886]: I0314 10:00:00.322023 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv7ps\" (UniqueName: \"kubernetes.io/projected/c2db3301-641d-4165-895b-4a30f71b69d1-kube-api-access-mv7ps\") pod \"collect-profiles-29558040-nk2nr\" (UID: \"c2db3301-641d-4165-895b-4a30f71b69d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-nk2nr" Mar 14 10:00:00 crc kubenswrapper[4886]: I0314 10:00:00.322235 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb9mq\" (UniqueName: \"kubernetes.io/projected/49390a78-e5fd-4501-b01e-d3c0031e7f52-kube-api-access-bb9mq\") pod \"auto-csr-approver-29558040-n92xd\" (UID: \"49390a78-e5fd-4501-b01e-d3c0031e7f52\") " pod="openshift-infra/auto-csr-approver-29558040-n92xd" Mar 14 10:00:00 crc kubenswrapper[4886]: I0314 10:00:00.424012 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb9mq\" (UniqueName: \"kubernetes.io/projected/49390a78-e5fd-4501-b01e-d3c0031e7f52-kube-api-access-bb9mq\") pod \"auto-csr-approver-29558040-n92xd\" (UID: \"49390a78-e5fd-4501-b01e-d3c0031e7f52\") " pod="openshift-infra/auto-csr-approver-29558040-n92xd" Mar 14 
10:00:00 crc kubenswrapper[4886]: I0314 10:00:00.424112 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2db3301-641d-4165-895b-4a30f71b69d1-secret-volume\") pod \"collect-profiles-29558040-nk2nr\" (UID: \"c2db3301-641d-4165-895b-4a30f71b69d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-nk2nr" Mar 14 10:00:00 crc kubenswrapper[4886]: I0314 10:00:00.424149 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2db3301-641d-4165-895b-4a30f71b69d1-config-volume\") pod \"collect-profiles-29558040-nk2nr\" (UID: \"c2db3301-641d-4165-895b-4a30f71b69d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-nk2nr" Mar 14 10:00:00 crc kubenswrapper[4886]: I0314 10:00:00.424217 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv7ps\" (UniqueName: \"kubernetes.io/projected/c2db3301-641d-4165-895b-4a30f71b69d1-kube-api-access-mv7ps\") pod \"collect-profiles-29558040-nk2nr\" (UID: \"c2db3301-641d-4165-895b-4a30f71b69d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-nk2nr" Mar 14 10:00:00 crc kubenswrapper[4886]: I0314 10:00:00.424980 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2db3301-641d-4165-895b-4a30f71b69d1-config-volume\") pod \"collect-profiles-29558040-nk2nr\" (UID: \"c2db3301-641d-4165-895b-4a30f71b69d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-nk2nr" Mar 14 10:00:00 crc kubenswrapper[4886]: I0314 10:00:00.777376 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2db3301-641d-4165-895b-4a30f71b69d1-secret-volume\") pod \"collect-profiles-29558040-nk2nr\" (UID: 
\"c2db3301-641d-4165-895b-4a30f71b69d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-nk2nr" Mar 14 10:00:00 crc kubenswrapper[4886]: I0314 10:00:00.778239 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb9mq\" (UniqueName: \"kubernetes.io/projected/49390a78-e5fd-4501-b01e-d3c0031e7f52-kube-api-access-bb9mq\") pod \"auto-csr-approver-29558040-n92xd\" (UID: \"49390a78-e5fd-4501-b01e-d3c0031e7f52\") " pod="openshift-infra/auto-csr-approver-29558040-n92xd" Mar 14 10:00:00 crc kubenswrapper[4886]: I0314 10:00:00.778860 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv7ps\" (UniqueName: \"kubernetes.io/projected/c2db3301-641d-4165-895b-4a30f71b69d1-kube-api-access-mv7ps\") pod \"collect-profiles-29558040-nk2nr\" (UID: \"c2db3301-641d-4165-895b-4a30f71b69d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-nk2nr" Mar 14 10:00:00 crc kubenswrapper[4886]: I0314 10:00:00.800612 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-nk2nr" Mar 14 10:00:00 crc kubenswrapper[4886]: I0314 10:00:00.806854 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558040-n92xd" Mar 14 10:00:01 crc kubenswrapper[4886]: I0314 10:00:01.293426 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29558040-nk2nr"] Mar 14 10:00:01 crc kubenswrapper[4886]: I0314 10:00:01.379671 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558040-n92xd"] Mar 14 10:00:01 crc kubenswrapper[4886]: W0314 10:00:01.383731 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49390a78_e5fd_4501_b01e_d3c0031e7f52.slice/crio-9dfd5f4859d8f7079400a1eb7292f50ea1bd5cc528fab9a7bf17a043b9ddc830 WatchSource:0}: Error finding container 9dfd5f4859d8f7079400a1eb7292f50ea1bd5cc528fab9a7bf17a043b9ddc830: Status 404 returned error can't find the container with id 9dfd5f4859d8f7079400a1eb7292f50ea1bd5cc528fab9a7bf17a043b9ddc830 Mar 14 10:00:01 crc kubenswrapper[4886]: I0314 10:00:01.555187 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-nk2nr" event={"ID":"c2db3301-641d-4165-895b-4a30f71b69d1","Type":"ContainerStarted","Data":"b0710fc14341d29a44667c20f62b7a60998b47c960666a173c148747152005ed"} Mar 14 10:00:01 crc kubenswrapper[4886]: I0314 10:00:01.555470 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-nk2nr" event={"ID":"c2db3301-641d-4165-895b-4a30f71b69d1","Type":"ContainerStarted","Data":"bfeecad386be53b56475e771b36aaa09b9ee58609c71b22915cecd80ceeabc1f"} Mar 14 10:00:01 crc kubenswrapper[4886]: I0314 10:00:01.556956 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558040-n92xd" 
event={"ID":"49390a78-e5fd-4501-b01e-d3c0031e7f52","Type":"ContainerStarted","Data":"9dfd5f4859d8f7079400a1eb7292f50ea1bd5cc528fab9a7bf17a043b9ddc830"} Mar 14 10:00:01 crc kubenswrapper[4886]: I0314 10:00:01.572162 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-nk2nr" podStartSLOduration=1.572141845 podStartE2EDuration="1.572141845s" podCreationTimestamp="2026-03-14 10:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 10:00:01.568940165 +0000 UTC m=+5536.817391802" watchObservedRunningTime="2026-03-14 10:00:01.572141845 +0000 UTC m=+5536.820593492" Mar 14 10:00:02 crc kubenswrapper[4886]: I0314 10:00:02.565985 4886 generic.go:334] "Generic (PLEG): container finished" podID="c2db3301-641d-4165-895b-4a30f71b69d1" containerID="b0710fc14341d29a44667c20f62b7a60998b47c960666a173c148747152005ed" exitCode=0 Mar 14 10:00:02 crc kubenswrapper[4886]: I0314 10:00:02.566184 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-nk2nr" event={"ID":"c2db3301-641d-4165-895b-4a30f71b69d1","Type":"ContainerDied","Data":"b0710fc14341d29a44667c20f62b7a60998b47c960666a173c148747152005ed"} Mar 14 10:00:03 crc kubenswrapper[4886]: I0314 10:00:03.958391 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-nk2nr" Mar 14 10:00:04 crc kubenswrapper[4886]: I0314 10:00:04.093372 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2db3301-641d-4165-895b-4a30f71b69d1-config-volume\") pod \"c2db3301-641d-4165-895b-4a30f71b69d1\" (UID: \"c2db3301-641d-4165-895b-4a30f71b69d1\") " Mar 14 10:00:04 crc kubenswrapper[4886]: I0314 10:00:04.093456 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv7ps\" (UniqueName: \"kubernetes.io/projected/c2db3301-641d-4165-895b-4a30f71b69d1-kube-api-access-mv7ps\") pod \"c2db3301-641d-4165-895b-4a30f71b69d1\" (UID: \"c2db3301-641d-4165-895b-4a30f71b69d1\") " Mar 14 10:00:04 crc kubenswrapper[4886]: I0314 10:00:04.093636 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2db3301-641d-4165-895b-4a30f71b69d1-secret-volume\") pod \"c2db3301-641d-4165-895b-4a30f71b69d1\" (UID: \"c2db3301-641d-4165-895b-4a30f71b69d1\") " Mar 14 10:00:04 crc kubenswrapper[4886]: I0314 10:00:04.094476 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2db3301-641d-4165-895b-4a30f71b69d1-config-volume" (OuterVolumeSpecName: "config-volume") pod "c2db3301-641d-4165-895b-4a30f71b69d1" (UID: "c2db3301-641d-4165-895b-4a30f71b69d1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 14 10:00:04 crc kubenswrapper[4886]: I0314 10:00:04.098815 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2db3301-641d-4165-895b-4a30f71b69d1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c2db3301-641d-4165-895b-4a30f71b69d1" (UID: "c2db3301-641d-4165-895b-4a30f71b69d1"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:00:04 crc kubenswrapper[4886]: I0314 10:00:04.099566 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2db3301-641d-4165-895b-4a30f71b69d1-kube-api-access-mv7ps" (OuterVolumeSpecName: "kube-api-access-mv7ps") pod "c2db3301-641d-4165-895b-4a30f71b69d1" (UID: "c2db3301-641d-4165-895b-4a30f71b69d1"). InnerVolumeSpecName "kube-api-access-mv7ps". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:00:04 crc kubenswrapper[4886]: I0314 10:00:04.196374 4886 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2db3301-641d-4165-895b-4a30f71b69d1-config-volume\") on node \"crc\" DevicePath \"\"" Mar 14 10:00:04 crc kubenswrapper[4886]: I0314 10:00:04.196404 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv7ps\" (UniqueName: \"kubernetes.io/projected/c2db3301-641d-4165-895b-4a30f71b69d1-kube-api-access-mv7ps\") on node \"crc\" DevicePath \"\"" Mar 14 10:00:04 crc kubenswrapper[4886]: I0314 10:00:04.196416 4886 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2db3301-641d-4165-895b-4a30f71b69d1-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 14 10:00:04 crc kubenswrapper[4886]: I0314 10:00:04.372280 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557995-9qdck"] Mar 14 10:00:04 crc kubenswrapper[4886]: I0314 10:00:04.387660 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557995-9qdck"] Mar 14 10:00:04 crc kubenswrapper[4886]: I0314 10:00:04.594342 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-nk2nr" 
event={"ID":"c2db3301-641d-4165-895b-4a30f71b69d1","Type":"ContainerDied","Data":"bfeecad386be53b56475e771b36aaa09b9ee58609c71b22915cecd80ceeabc1f"} Mar 14 10:00:04 crc kubenswrapper[4886]: I0314 10:00:04.594381 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfeecad386be53b56475e771b36aaa09b9ee58609c71b22915cecd80ceeabc1f" Mar 14 10:00:04 crc kubenswrapper[4886]: I0314 10:00:04.594385 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29558040-nk2nr" Mar 14 10:00:05 crc kubenswrapper[4886]: I0314 10:00:05.438444 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b93d286d-e146-4189-8377-0b64be26ca42" path="/var/lib/kubelet/pods/b93d286d-e146-4189-8377-0b64be26ca42/volumes" Mar 14 10:00:15 crc kubenswrapper[4886]: I0314 10:00:15.752715 4886 generic.go:334] "Generic (PLEG): container finished" podID="49390a78-e5fd-4501-b01e-d3c0031e7f52" containerID="f1662217cc24f55963b3d4c301906c3f57407c6bde5ce05f04f058eadffdcd86" exitCode=0 Mar 14 10:00:15 crc kubenswrapper[4886]: I0314 10:00:15.752928 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558040-n92xd" event={"ID":"49390a78-e5fd-4501-b01e-d3c0031e7f52","Type":"ContainerDied","Data":"f1662217cc24f55963b3d4c301906c3f57407c6bde5ce05f04f058eadffdcd86"} Mar 14 10:00:17 crc kubenswrapper[4886]: I0314 10:00:17.175986 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558040-n92xd" Mar 14 10:00:17 crc kubenswrapper[4886]: I0314 10:00:17.245885 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb9mq\" (UniqueName: \"kubernetes.io/projected/49390a78-e5fd-4501-b01e-d3c0031e7f52-kube-api-access-bb9mq\") pod \"49390a78-e5fd-4501-b01e-d3c0031e7f52\" (UID: \"49390a78-e5fd-4501-b01e-d3c0031e7f52\") " Mar 14 10:00:17 crc kubenswrapper[4886]: I0314 10:00:17.255472 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49390a78-e5fd-4501-b01e-d3c0031e7f52-kube-api-access-bb9mq" (OuterVolumeSpecName: "kube-api-access-bb9mq") pod "49390a78-e5fd-4501-b01e-d3c0031e7f52" (UID: "49390a78-e5fd-4501-b01e-d3c0031e7f52"). InnerVolumeSpecName "kube-api-access-bb9mq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:00:17 crc kubenswrapper[4886]: I0314 10:00:17.348977 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb9mq\" (UniqueName: \"kubernetes.io/projected/49390a78-e5fd-4501-b01e-d3c0031e7f52-kube-api-access-bb9mq\") on node \"crc\" DevicePath \"\"" Mar 14 10:00:17 crc kubenswrapper[4886]: I0314 10:00:17.779429 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558040-n92xd" event={"ID":"49390a78-e5fd-4501-b01e-d3c0031e7f52","Type":"ContainerDied","Data":"9dfd5f4859d8f7079400a1eb7292f50ea1bd5cc528fab9a7bf17a043b9ddc830"} Mar 14 10:00:17 crc kubenswrapper[4886]: I0314 10:00:17.779483 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9dfd5f4859d8f7079400a1eb7292f50ea1bd5cc528fab9a7bf17a043b9ddc830" Mar 14 10:00:17 crc kubenswrapper[4886]: I0314 10:00:17.779547 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558040-n92xd" Mar 14 10:00:18 crc kubenswrapper[4886]: I0314 10:00:18.260878 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558034-gmhtq"] Mar 14 10:00:18 crc kubenswrapper[4886]: I0314 10:00:18.272793 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558034-gmhtq"] Mar 14 10:00:19 crc kubenswrapper[4886]: I0314 10:00:19.431794 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ede0fdd6-7212-41aa-9033-ac646f81c2de" path="/var/lib/kubelet/pods/ede0fdd6-7212-41aa-9033-ac646f81c2de/volumes" Mar 14 10:00:41 crc kubenswrapper[4886]: I0314 10:00:41.562583 4886 scope.go:117] "RemoveContainer" containerID="45062aa1b54c19c086b403646a340495978d3aa9b8b4fc5552eba675d34a5210" Mar 14 10:00:41 crc kubenswrapper[4886]: I0314 10:00:41.597491 4886 scope.go:117] "RemoveContainer" containerID="e6c77489b03932a5aaf5b1fde18ad56dc49777b4c7f2293d9e90dc84b85c0155" Mar 14 10:01:00 crc kubenswrapper[4886]: I0314 10:01:00.162185 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29558041-98svt"] Mar 14 10:01:00 crc kubenswrapper[4886]: E0314 10:01:00.163227 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49390a78-e5fd-4501-b01e-d3c0031e7f52" containerName="oc" Mar 14 10:01:00 crc kubenswrapper[4886]: I0314 10:01:00.163249 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="49390a78-e5fd-4501-b01e-d3c0031e7f52" containerName="oc" Mar 14 10:01:00 crc kubenswrapper[4886]: E0314 10:01:00.163331 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2db3301-641d-4165-895b-4a30f71b69d1" containerName="collect-profiles" Mar 14 10:01:00 crc kubenswrapper[4886]: I0314 10:01:00.163343 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2db3301-641d-4165-895b-4a30f71b69d1" containerName="collect-profiles" Mar 14 10:01:00 crc 
kubenswrapper[4886]: I0314 10:01:00.163572 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="49390a78-e5fd-4501-b01e-d3c0031e7f52" containerName="oc" Mar 14 10:01:00 crc kubenswrapper[4886]: I0314 10:01:00.163592 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2db3301-641d-4165-895b-4a30f71b69d1" containerName="collect-profiles" Mar 14 10:01:00 crc kubenswrapper[4886]: I0314 10:01:00.164589 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29558041-98svt" Mar 14 10:01:00 crc kubenswrapper[4886]: I0314 10:01:00.179675 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29558041-98svt"] Mar 14 10:01:00 crc kubenswrapper[4886]: I0314 10:01:00.245037 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htd47\" (UniqueName: \"kubernetes.io/projected/bb6864bb-91b3-42e9-b09a-7f09482855b1-kube-api-access-htd47\") pod \"keystone-cron-29558041-98svt\" (UID: \"bb6864bb-91b3-42e9-b09a-7f09482855b1\") " pod="openstack/keystone-cron-29558041-98svt" Mar 14 10:01:00 crc kubenswrapper[4886]: I0314 10:01:00.245087 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6864bb-91b3-42e9-b09a-7f09482855b1-combined-ca-bundle\") pod \"keystone-cron-29558041-98svt\" (UID: \"bb6864bb-91b3-42e9-b09a-7f09482855b1\") " pod="openstack/keystone-cron-29558041-98svt" Mar 14 10:01:00 crc kubenswrapper[4886]: I0314 10:01:00.245168 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bb6864bb-91b3-42e9-b09a-7f09482855b1-fernet-keys\") pod \"keystone-cron-29558041-98svt\" (UID: \"bb6864bb-91b3-42e9-b09a-7f09482855b1\") " pod="openstack/keystone-cron-29558041-98svt" Mar 14 10:01:00 crc kubenswrapper[4886]: 
I0314 10:01:00.245206 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb6864bb-91b3-42e9-b09a-7f09482855b1-config-data\") pod \"keystone-cron-29558041-98svt\" (UID: \"bb6864bb-91b3-42e9-b09a-7f09482855b1\") " pod="openstack/keystone-cron-29558041-98svt" Mar 14 10:01:00 crc kubenswrapper[4886]: I0314 10:01:00.347242 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bb6864bb-91b3-42e9-b09a-7f09482855b1-fernet-keys\") pod \"keystone-cron-29558041-98svt\" (UID: \"bb6864bb-91b3-42e9-b09a-7f09482855b1\") " pod="openstack/keystone-cron-29558041-98svt" Mar 14 10:01:00 crc kubenswrapper[4886]: I0314 10:01:00.347355 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb6864bb-91b3-42e9-b09a-7f09482855b1-config-data\") pod \"keystone-cron-29558041-98svt\" (UID: \"bb6864bb-91b3-42e9-b09a-7f09482855b1\") " pod="openstack/keystone-cron-29558041-98svt" Mar 14 10:01:00 crc kubenswrapper[4886]: I0314 10:01:00.347556 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htd47\" (UniqueName: \"kubernetes.io/projected/bb6864bb-91b3-42e9-b09a-7f09482855b1-kube-api-access-htd47\") pod \"keystone-cron-29558041-98svt\" (UID: \"bb6864bb-91b3-42e9-b09a-7f09482855b1\") " pod="openstack/keystone-cron-29558041-98svt" Mar 14 10:01:00 crc kubenswrapper[4886]: I0314 10:01:00.347661 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6864bb-91b3-42e9-b09a-7f09482855b1-combined-ca-bundle\") pod \"keystone-cron-29558041-98svt\" (UID: \"bb6864bb-91b3-42e9-b09a-7f09482855b1\") " pod="openstack/keystone-cron-29558041-98svt" Mar 14 10:01:00 crc kubenswrapper[4886]: I0314 10:01:00.356266 4886 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bb6864bb-91b3-42e9-b09a-7f09482855b1-fernet-keys\") pod \"keystone-cron-29558041-98svt\" (UID: \"bb6864bb-91b3-42e9-b09a-7f09482855b1\") " pod="openstack/keystone-cron-29558041-98svt" Mar 14 10:01:00 crc kubenswrapper[4886]: I0314 10:01:00.357712 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6864bb-91b3-42e9-b09a-7f09482855b1-combined-ca-bundle\") pod \"keystone-cron-29558041-98svt\" (UID: \"bb6864bb-91b3-42e9-b09a-7f09482855b1\") " pod="openstack/keystone-cron-29558041-98svt" Mar 14 10:01:00 crc kubenswrapper[4886]: I0314 10:01:00.358766 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb6864bb-91b3-42e9-b09a-7f09482855b1-config-data\") pod \"keystone-cron-29558041-98svt\" (UID: \"bb6864bb-91b3-42e9-b09a-7f09482855b1\") " pod="openstack/keystone-cron-29558041-98svt" Mar 14 10:01:00 crc kubenswrapper[4886]: I0314 10:01:00.366743 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htd47\" (UniqueName: \"kubernetes.io/projected/bb6864bb-91b3-42e9-b09a-7f09482855b1-kube-api-access-htd47\") pod \"keystone-cron-29558041-98svt\" (UID: \"bb6864bb-91b3-42e9-b09a-7f09482855b1\") " pod="openstack/keystone-cron-29558041-98svt" Mar 14 10:01:00 crc kubenswrapper[4886]: I0314 10:01:00.506457 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29558041-98svt" Mar 14 10:01:00 crc kubenswrapper[4886]: I0314 10:01:00.980808 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29558041-98svt"] Mar 14 10:01:02 crc kubenswrapper[4886]: I0314 10:01:02.261839 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29558041-98svt" event={"ID":"bb6864bb-91b3-42e9-b09a-7f09482855b1","Type":"ContainerStarted","Data":"8adc367623b6c46e2fdfe4bc817638ec7168254d3938005eec202ae21ad1ef84"} Mar 14 10:01:03 crc kubenswrapper[4886]: I0314 10:01:03.272223 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29558041-98svt" event={"ID":"bb6864bb-91b3-42e9-b09a-7f09482855b1","Type":"ContainerStarted","Data":"c9236c11b015024408ce4d27a5ec2bd6f323160d1a539ff5a80b183117d7040e"} Mar 14 10:01:03 crc kubenswrapper[4886]: I0314 10:01:03.303720 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29558041-98svt" podStartSLOduration=3.303698475 podStartE2EDuration="3.303698475s" podCreationTimestamp="2026-03-14 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 10:01:03.292062146 +0000 UTC m=+5598.540513823" watchObservedRunningTime="2026-03-14 10:01:03.303698475 +0000 UTC m=+5598.552150122" Mar 14 10:01:05 crc kubenswrapper[4886]: I0314 10:01:05.295568 4886 generic.go:334] "Generic (PLEG): container finished" podID="bb6864bb-91b3-42e9-b09a-7f09482855b1" containerID="c9236c11b015024408ce4d27a5ec2bd6f323160d1a539ff5a80b183117d7040e" exitCode=0 Mar 14 10:01:05 crc kubenswrapper[4886]: I0314 10:01:05.295611 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29558041-98svt" 
event={"ID":"bb6864bb-91b3-42e9-b09a-7f09482855b1","Type":"ContainerDied","Data":"c9236c11b015024408ce4d27a5ec2bd6f323160d1a539ff5a80b183117d7040e"} Mar 14 10:01:06 crc kubenswrapper[4886]: I0314 10:01:06.635605 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29558041-98svt" Mar 14 10:01:06 crc kubenswrapper[4886]: I0314 10:01:06.672932 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6864bb-91b3-42e9-b09a-7f09482855b1-combined-ca-bundle\") pod \"bb6864bb-91b3-42e9-b09a-7f09482855b1\" (UID: \"bb6864bb-91b3-42e9-b09a-7f09482855b1\") " Mar 14 10:01:06 crc kubenswrapper[4886]: I0314 10:01:06.672987 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb6864bb-91b3-42e9-b09a-7f09482855b1-config-data\") pod \"bb6864bb-91b3-42e9-b09a-7f09482855b1\" (UID: \"bb6864bb-91b3-42e9-b09a-7f09482855b1\") " Mar 14 10:01:06 crc kubenswrapper[4886]: I0314 10:01:06.673041 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bb6864bb-91b3-42e9-b09a-7f09482855b1-fernet-keys\") pod \"bb6864bb-91b3-42e9-b09a-7f09482855b1\" (UID: \"bb6864bb-91b3-42e9-b09a-7f09482855b1\") " Mar 14 10:01:06 crc kubenswrapper[4886]: I0314 10:01:06.673246 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htd47\" (UniqueName: \"kubernetes.io/projected/bb6864bb-91b3-42e9-b09a-7f09482855b1-kube-api-access-htd47\") pod \"bb6864bb-91b3-42e9-b09a-7f09482855b1\" (UID: \"bb6864bb-91b3-42e9-b09a-7f09482855b1\") " Mar 14 10:01:06 crc kubenswrapper[4886]: I0314 10:01:06.680350 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb6864bb-91b3-42e9-b09a-7f09482855b1-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "bb6864bb-91b3-42e9-b09a-7f09482855b1" (UID: "bb6864bb-91b3-42e9-b09a-7f09482855b1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:01:06 crc kubenswrapper[4886]: I0314 10:01:06.680854 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb6864bb-91b3-42e9-b09a-7f09482855b1-kube-api-access-htd47" (OuterVolumeSpecName: "kube-api-access-htd47") pod "bb6864bb-91b3-42e9-b09a-7f09482855b1" (UID: "bb6864bb-91b3-42e9-b09a-7f09482855b1"). InnerVolumeSpecName "kube-api-access-htd47". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 14 10:01:06 crc kubenswrapper[4886]: I0314 10:01:06.703328 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb6864bb-91b3-42e9-b09a-7f09482855b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb6864bb-91b3-42e9-b09a-7f09482855b1" (UID: "bb6864bb-91b3-42e9-b09a-7f09482855b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:01:06 crc kubenswrapper[4886]: I0314 10:01:06.737736 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb6864bb-91b3-42e9-b09a-7f09482855b1-config-data" (OuterVolumeSpecName: "config-data") pod "bb6864bb-91b3-42e9-b09a-7f09482855b1" (UID: "bb6864bb-91b3-42e9-b09a-7f09482855b1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 14 10:01:06 crc kubenswrapper[4886]: I0314 10:01:06.774994 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htd47\" (UniqueName: \"kubernetes.io/projected/bb6864bb-91b3-42e9-b09a-7f09482855b1-kube-api-access-htd47\") on node \"crc\" DevicePath \"\"" Mar 14 10:01:06 crc kubenswrapper[4886]: I0314 10:01:06.775028 4886 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb6864bb-91b3-42e9-b09a-7f09482855b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 14 10:01:06 crc kubenswrapper[4886]: I0314 10:01:06.775044 4886 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb6864bb-91b3-42e9-b09a-7f09482855b1-config-data\") on node \"crc\" DevicePath \"\"" Mar 14 10:01:06 crc kubenswrapper[4886]: I0314 10:01:06.775054 4886 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bb6864bb-91b3-42e9-b09a-7f09482855b1-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 14 10:01:07 crc kubenswrapper[4886]: I0314 10:01:07.317237 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29558041-98svt" event={"ID":"bb6864bb-91b3-42e9-b09a-7f09482855b1","Type":"ContainerDied","Data":"8adc367623b6c46e2fdfe4bc817638ec7168254d3938005eec202ae21ad1ef84"} Mar 14 10:01:07 crc kubenswrapper[4886]: I0314 10:01:07.317278 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8adc367623b6c46e2fdfe4bc817638ec7168254d3938005eec202ae21ad1ef84" Mar 14 10:01:07 crc kubenswrapper[4886]: I0314 10:01:07.317376 4886 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29558041-98svt" Mar 14 10:01:52 crc kubenswrapper[4886]: I0314 10:01:52.713328 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6rhcl"] Mar 14 10:01:52 crc kubenswrapper[4886]: E0314 10:01:52.743654 4886 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb6864bb-91b3-42e9-b09a-7f09482855b1" containerName="keystone-cron" Mar 14 10:01:52 crc kubenswrapper[4886]: I0314 10:01:52.743725 4886 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb6864bb-91b3-42e9-b09a-7f09482855b1" containerName="keystone-cron" Mar 14 10:01:52 crc kubenswrapper[4886]: I0314 10:01:52.744999 4886 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb6864bb-91b3-42e9-b09a-7f09482855b1" containerName="keystone-cron" Mar 14 10:01:52 crc kubenswrapper[4886]: I0314 10:01:52.750920 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6rhcl"] Mar 14 10:01:52 crc kubenswrapper[4886]: I0314 10:01:52.751094 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6rhcl" Mar 14 10:01:52 crc kubenswrapper[4886]: I0314 10:01:52.805060 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ced766d8-a50f-4c02-b914-3ce40d39fab6-catalog-content\") pod \"redhat-operators-6rhcl\" (UID: \"ced766d8-a50f-4c02-b914-3ce40d39fab6\") " pod="openshift-marketplace/redhat-operators-6rhcl" Mar 14 10:01:52 crc kubenswrapper[4886]: I0314 10:01:52.805351 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ced766d8-a50f-4c02-b914-3ce40d39fab6-utilities\") pod \"redhat-operators-6rhcl\" (UID: \"ced766d8-a50f-4c02-b914-3ce40d39fab6\") " pod="openshift-marketplace/redhat-operators-6rhcl" Mar 14 10:01:52 crc kubenswrapper[4886]: I0314 10:01:52.806558 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv7sd\" (UniqueName: \"kubernetes.io/projected/ced766d8-a50f-4c02-b914-3ce40d39fab6-kube-api-access-rv7sd\") pod \"redhat-operators-6rhcl\" (UID: \"ced766d8-a50f-4c02-b914-3ce40d39fab6\") " pod="openshift-marketplace/redhat-operators-6rhcl" Mar 14 10:01:52 crc kubenswrapper[4886]: I0314 10:01:52.908235 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ced766d8-a50f-4c02-b914-3ce40d39fab6-catalog-content\") pod \"redhat-operators-6rhcl\" (UID: \"ced766d8-a50f-4c02-b914-3ce40d39fab6\") " pod="openshift-marketplace/redhat-operators-6rhcl" Mar 14 10:01:52 crc kubenswrapper[4886]: I0314 10:01:52.908317 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ced766d8-a50f-4c02-b914-3ce40d39fab6-utilities\") pod \"redhat-operators-6rhcl\" (UID: 
\"ced766d8-a50f-4c02-b914-3ce40d39fab6\") " pod="openshift-marketplace/redhat-operators-6rhcl" Mar 14 10:01:52 crc kubenswrapper[4886]: I0314 10:01:52.908515 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv7sd\" (UniqueName: \"kubernetes.io/projected/ced766d8-a50f-4c02-b914-3ce40d39fab6-kube-api-access-rv7sd\") pod \"redhat-operators-6rhcl\" (UID: \"ced766d8-a50f-4c02-b914-3ce40d39fab6\") " pod="openshift-marketplace/redhat-operators-6rhcl" Mar 14 10:01:52 crc kubenswrapper[4886]: I0314 10:01:52.908647 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ced766d8-a50f-4c02-b914-3ce40d39fab6-catalog-content\") pod \"redhat-operators-6rhcl\" (UID: \"ced766d8-a50f-4c02-b914-3ce40d39fab6\") " pod="openshift-marketplace/redhat-operators-6rhcl" Mar 14 10:01:52 crc kubenswrapper[4886]: I0314 10:01:52.908958 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ced766d8-a50f-4c02-b914-3ce40d39fab6-utilities\") pod \"redhat-operators-6rhcl\" (UID: \"ced766d8-a50f-4c02-b914-3ce40d39fab6\") " pod="openshift-marketplace/redhat-operators-6rhcl" Mar 14 10:01:52 crc kubenswrapper[4886]: I0314 10:01:52.930933 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv7sd\" (UniqueName: \"kubernetes.io/projected/ced766d8-a50f-4c02-b914-3ce40d39fab6-kube-api-access-rv7sd\") pod \"redhat-operators-6rhcl\" (UID: \"ced766d8-a50f-4c02-b914-3ce40d39fab6\") " pod="openshift-marketplace/redhat-operators-6rhcl" Mar 14 10:01:53 crc kubenswrapper[4886]: I0314 10:01:53.085201 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6rhcl" Mar 14 10:01:53 crc kubenswrapper[4886]: I0314 10:01:53.532007 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6rhcl"] Mar 14 10:01:53 crc kubenswrapper[4886]: I0314 10:01:53.824401 4886 generic.go:334] "Generic (PLEG): container finished" podID="ced766d8-a50f-4c02-b914-3ce40d39fab6" containerID="ba42b717c681c11ea9a44eac9f36d1a688c8ba1bc6013bcc899e775eb2754bda" exitCode=0 Mar 14 10:01:53 crc kubenswrapper[4886]: I0314 10:01:53.824516 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rhcl" event={"ID":"ced766d8-a50f-4c02-b914-3ce40d39fab6","Type":"ContainerDied","Data":"ba42b717c681c11ea9a44eac9f36d1a688c8ba1bc6013bcc899e775eb2754bda"} Mar 14 10:01:53 crc kubenswrapper[4886]: I0314 10:01:53.824775 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rhcl" event={"ID":"ced766d8-a50f-4c02-b914-3ce40d39fab6","Type":"ContainerStarted","Data":"022ffd4d890a85bcd490cc015d517049068195621ad4c10b06efaa4b22515481"} Mar 14 10:01:54 crc kubenswrapper[4886]: I0314 10:01:54.843153 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rhcl" event={"ID":"ced766d8-a50f-4c02-b914-3ce40d39fab6","Type":"ContainerStarted","Data":"bd1897d28a986de0d55211b1580f9ba43ef6ac6b1a0beaab2ff67599ef1bb0e6"} Mar 14 10:01:56 crc kubenswrapper[4886]: I0314 10:01:56.065956 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 14 10:01:56 crc kubenswrapper[4886]: I0314 10:01:56.066266 4886 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 14 10:01:59 crc kubenswrapper[4886]: I0314 10:01:59.898171 4886 generic.go:334] "Generic (PLEG): container finished" podID="ced766d8-a50f-4c02-b914-3ce40d39fab6" containerID="bd1897d28a986de0d55211b1580f9ba43ef6ac6b1a0beaab2ff67599ef1bb0e6" exitCode=0 Mar 14 10:01:59 crc kubenswrapper[4886]: I0314 10:01:59.898266 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rhcl" event={"ID":"ced766d8-a50f-4c02-b914-3ce40d39fab6","Type":"ContainerDied","Data":"bd1897d28a986de0d55211b1580f9ba43ef6ac6b1a0beaab2ff67599ef1bb0e6"} Mar 14 10:02:00 crc kubenswrapper[4886]: I0314 10:02:00.152675 4886 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29558042-j75bw"] Mar 14 10:02:00 crc kubenswrapper[4886]: I0314 10:02:00.154174 4886 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29558042-j75bw" Mar 14 10:02:00 crc kubenswrapper[4886]: I0314 10:02:00.156443 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 14 10:02:00 crc kubenswrapper[4886]: I0314 10:02:00.156638 4886 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 14 10:02:00 crc kubenswrapper[4886]: I0314 10:02:00.158338 4886 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fcqvp" Mar 14 10:02:00 crc kubenswrapper[4886]: I0314 10:02:00.163702 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558042-j75bw"] Mar 14 10:02:00 crc kubenswrapper[4886]: I0314 10:02:00.276521 4886 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp28d\" (UniqueName: \"kubernetes.io/projected/db948804-2551-4fc7-a2b0-2c5640a10dca-kube-api-access-jp28d\") pod \"auto-csr-approver-29558042-j75bw\" (UID: \"db948804-2551-4fc7-a2b0-2c5640a10dca\") " pod="openshift-infra/auto-csr-approver-29558042-j75bw" Mar 14 10:02:00 crc kubenswrapper[4886]: I0314 10:02:00.378074 4886 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp28d\" (UniqueName: \"kubernetes.io/projected/db948804-2551-4fc7-a2b0-2c5640a10dca-kube-api-access-jp28d\") pod \"auto-csr-approver-29558042-j75bw\" (UID: \"db948804-2551-4fc7-a2b0-2c5640a10dca\") " pod="openshift-infra/auto-csr-approver-29558042-j75bw" Mar 14 10:02:00 crc kubenswrapper[4886]: I0314 10:02:00.404323 4886 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp28d\" (UniqueName: \"kubernetes.io/projected/db948804-2551-4fc7-a2b0-2c5640a10dca-kube-api-access-jp28d\") pod \"auto-csr-approver-29558042-j75bw\" (UID: \"db948804-2551-4fc7-a2b0-2c5640a10dca\") " 
pod="openshift-infra/auto-csr-approver-29558042-j75bw" Mar 14 10:02:00 crc kubenswrapper[4886]: I0314 10:02:00.478081 4886 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558042-j75bw" Mar 14 10:02:00 crc kubenswrapper[4886]: I0314 10:02:00.980336 4886 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29558042-j75bw"] Mar 14 10:02:00 crc kubenswrapper[4886]: W0314 10:02:00.986433 4886 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb948804_2551_4fc7_a2b0_2c5640a10dca.slice/crio-524a2566510fbd88d9b4da76d5c9c75c3c75b4614669b67b51334a0a12f1e816 WatchSource:0}: Error finding container 524a2566510fbd88d9b4da76d5c9c75c3c75b4614669b67b51334a0a12f1e816: Status 404 returned error can't find the container with id 524a2566510fbd88d9b4da76d5c9c75c3c75b4614669b67b51334a0a12f1e816 Mar 14 10:02:01 crc kubenswrapper[4886]: I0314 10:02:01.924789 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rhcl" event={"ID":"ced766d8-a50f-4c02-b914-3ce40d39fab6","Type":"ContainerStarted","Data":"1abe596f12eacb46943223a511ad129a00c56c4b9f89eb5985b5d20450dcdbec"} Mar 14 10:02:01 crc kubenswrapper[4886]: I0314 10:02:01.927299 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558042-j75bw" event={"ID":"db948804-2551-4fc7-a2b0-2c5640a10dca","Type":"ContainerStarted","Data":"524a2566510fbd88d9b4da76d5c9c75c3c75b4614669b67b51334a0a12f1e816"} Mar 14 10:02:01 crc kubenswrapper[4886]: I0314 10:02:01.950949 4886 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6rhcl" podStartSLOduration=3.045585162 podStartE2EDuration="9.950932465s" podCreationTimestamp="2026-03-14 10:01:52 +0000 UTC" firstStartedPulling="2026-03-14 10:01:53.82700911 +0000 UTC m=+5649.075460757" 
lastFinishedPulling="2026-03-14 10:02:00.732356423 +0000 UTC m=+5655.980808060" observedRunningTime="2026-03-14 10:02:01.945584244 +0000 UTC m=+5657.194035881" watchObservedRunningTime="2026-03-14 10:02:01.950932465 +0000 UTC m=+5657.199384102"
Mar 14 10:02:02 crc kubenswrapper[4886]: I0314 10:02:02.937539 4886 generic.go:334] "Generic (PLEG): container finished" podID="db948804-2551-4fc7-a2b0-2c5640a10dca" containerID="f0fb676527724980fefa61de9f47c3b8c5702f8bc775d5824f75162de7b80eb2" exitCode=0
Mar 14 10:02:02 crc kubenswrapper[4886]: I0314 10:02:02.937677 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558042-j75bw" event={"ID":"db948804-2551-4fc7-a2b0-2c5640a10dca","Type":"ContainerDied","Data":"f0fb676527724980fefa61de9f47c3b8c5702f8bc775d5824f75162de7b80eb2"}
Mar 14 10:02:03 crc kubenswrapper[4886]: I0314 10:02:03.085998 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6rhcl"
Mar 14 10:02:03 crc kubenswrapper[4886]: I0314 10:02:03.086051 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6rhcl"
Mar 14 10:02:04 crc kubenswrapper[4886]: I0314 10:02:04.152691 4886 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6rhcl" podUID="ced766d8-a50f-4c02-b914-3ce40d39fab6" containerName="registry-server" probeResult="failure" output=<
Mar 14 10:02:04 crc kubenswrapper[4886]: timeout: failed to connect service ":50051" within 1s
Mar 14 10:02:04 crc kubenswrapper[4886]: >
Mar 14 10:02:04 crc kubenswrapper[4886]: I0314 10:02:04.289675 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558042-j75bw"
Mar 14 10:02:04 crc kubenswrapper[4886]: I0314 10:02:04.372930 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jp28d\" (UniqueName: \"kubernetes.io/projected/db948804-2551-4fc7-a2b0-2c5640a10dca-kube-api-access-jp28d\") pod \"db948804-2551-4fc7-a2b0-2c5640a10dca\" (UID: \"db948804-2551-4fc7-a2b0-2c5640a10dca\") "
Mar 14 10:02:04 crc kubenswrapper[4886]: I0314 10:02:04.387834 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db948804-2551-4fc7-a2b0-2c5640a10dca-kube-api-access-jp28d" (OuterVolumeSpecName: "kube-api-access-jp28d") pod "db948804-2551-4fc7-a2b0-2c5640a10dca" (UID: "db948804-2551-4fc7-a2b0-2c5640a10dca"). InnerVolumeSpecName "kube-api-access-jp28d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 10:02:04 crc kubenswrapper[4886]: I0314 10:02:04.475815 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jp28d\" (UniqueName: \"kubernetes.io/projected/db948804-2551-4fc7-a2b0-2c5640a10dca-kube-api-access-jp28d\") on node \"crc\" DevicePath \"\""
Mar 14 10:02:04 crc kubenswrapper[4886]: I0314 10:02:04.956068 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29558042-j75bw" event={"ID":"db948804-2551-4fc7-a2b0-2c5640a10dca","Type":"ContainerDied","Data":"524a2566510fbd88d9b4da76d5c9c75c3c75b4614669b67b51334a0a12f1e816"}
Mar 14 10:02:04 crc kubenswrapper[4886]: I0314 10:02:04.956539 4886 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="524a2566510fbd88d9b4da76d5c9c75c3c75b4614669b67b51334a0a12f1e816"
Mar 14 10:02:04 crc kubenswrapper[4886]: I0314 10:02:04.956217 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29558042-j75bw"
Mar 14 10:02:05 crc kubenswrapper[4886]: I0314 10:02:05.366716 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29558036-ds65w"]
Mar 14 10:02:05 crc kubenswrapper[4886]: I0314 10:02:05.376620 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29558036-ds65w"]
Mar 14 10:02:05 crc kubenswrapper[4886]: I0314 10:02:05.430946 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41f5d35e-101a-4e0d-81fa-3011d175b80a" path="/var/lib/kubelet/pods/41f5d35e-101a-4e0d-81fa-3011d175b80a/volumes"
Mar 14 10:02:13 crc kubenswrapper[4886]: I0314 10:02:13.139881 4886 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6rhcl"
Mar 14 10:02:13 crc kubenswrapper[4886]: I0314 10:02:13.204472 4886 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6rhcl"
Mar 14 10:02:16 crc kubenswrapper[4886]: I0314 10:02:16.632772 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6rhcl"]
Mar 14 10:02:16 crc kubenswrapper[4886]: I0314 10:02:16.633547 4886 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6rhcl" podUID="ced766d8-a50f-4c02-b914-3ce40d39fab6" containerName="registry-server" containerID="cri-o://1abe596f12eacb46943223a511ad129a00c56c4b9f89eb5985b5d20450dcdbec" gracePeriod=2
Mar 14 10:02:17 crc kubenswrapper[4886]: I0314 10:02:17.093921 4886 generic.go:334] "Generic (PLEG): container finished" podID="ced766d8-a50f-4c02-b914-3ce40d39fab6" containerID="1abe596f12eacb46943223a511ad129a00c56c4b9f89eb5985b5d20450dcdbec" exitCode=0
Mar 14 10:02:17 crc kubenswrapper[4886]: I0314 10:02:17.094017 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rhcl" event={"ID":"ced766d8-a50f-4c02-b914-3ce40d39fab6","Type":"ContainerDied","Data":"1abe596f12eacb46943223a511ad129a00c56c4b9f89eb5985b5d20450dcdbec"}
Mar 14 10:02:17 crc kubenswrapper[4886]: I0314 10:02:17.424372 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6rhcl"
Mar 14 10:02:17 crc kubenswrapper[4886]: I0314 10:02:17.564954 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv7sd\" (UniqueName: \"kubernetes.io/projected/ced766d8-a50f-4c02-b914-3ce40d39fab6-kube-api-access-rv7sd\") pod \"ced766d8-a50f-4c02-b914-3ce40d39fab6\" (UID: \"ced766d8-a50f-4c02-b914-3ce40d39fab6\") "
Mar 14 10:02:17 crc kubenswrapper[4886]: I0314 10:02:17.565084 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ced766d8-a50f-4c02-b914-3ce40d39fab6-catalog-content\") pod \"ced766d8-a50f-4c02-b914-3ce40d39fab6\" (UID: \"ced766d8-a50f-4c02-b914-3ce40d39fab6\") "
Mar 14 10:02:17 crc kubenswrapper[4886]: I0314 10:02:17.565507 4886 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ced766d8-a50f-4c02-b914-3ce40d39fab6-utilities\") pod \"ced766d8-a50f-4c02-b914-3ce40d39fab6\" (UID: \"ced766d8-a50f-4c02-b914-3ce40d39fab6\") "
Mar 14 10:02:17 crc kubenswrapper[4886]: I0314 10:02:17.566153 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ced766d8-a50f-4c02-b914-3ce40d39fab6-utilities" (OuterVolumeSpecName: "utilities") pod "ced766d8-a50f-4c02-b914-3ce40d39fab6" (UID: "ced766d8-a50f-4c02-b914-3ce40d39fab6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 10:02:17 crc kubenswrapper[4886]: I0314 10:02:17.577347 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ced766d8-a50f-4c02-b914-3ce40d39fab6-kube-api-access-rv7sd" (OuterVolumeSpecName: "kube-api-access-rv7sd") pod "ced766d8-a50f-4c02-b914-3ce40d39fab6" (UID: "ced766d8-a50f-4c02-b914-3ce40d39fab6"). InnerVolumeSpecName "kube-api-access-rv7sd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 14 10:02:17 crc kubenswrapper[4886]: I0314 10:02:17.668167 4886 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rv7sd\" (UniqueName: \"kubernetes.io/projected/ced766d8-a50f-4c02-b914-3ce40d39fab6-kube-api-access-rv7sd\") on node \"crc\" DevicePath \"\""
Mar 14 10:02:17 crc kubenswrapper[4886]: I0314 10:02:17.668226 4886 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ced766d8-a50f-4c02-b914-3ce40d39fab6-utilities\") on node \"crc\" DevicePath \"\""
Mar 14 10:02:17 crc kubenswrapper[4886]: I0314 10:02:17.685111 4886 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ced766d8-a50f-4c02-b914-3ce40d39fab6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ced766d8-a50f-4c02-b914-3ce40d39fab6" (UID: "ced766d8-a50f-4c02-b914-3ce40d39fab6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 14 10:02:17 crc kubenswrapper[4886]: I0314 10:02:17.772000 4886 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ced766d8-a50f-4c02-b914-3ce40d39fab6-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 14 10:02:18 crc kubenswrapper[4886]: I0314 10:02:18.105612 4886 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rhcl" event={"ID":"ced766d8-a50f-4c02-b914-3ce40d39fab6","Type":"ContainerDied","Data":"022ffd4d890a85bcd490cc015d517049068195621ad4c10b06efaa4b22515481"}
Mar 14 10:02:18 crc kubenswrapper[4886]: I0314 10:02:18.105665 4886 scope.go:117] "RemoveContainer" containerID="1abe596f12eacb46943223a511ad129a00c56c4b9f89eb5985b5d20450dcdbec"
Mar 14 10:02:18 crc kubenswrapper[4886]: I0314 10:02:18.105780 4886 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6rhcl"
Mar 14 10:02:18 crc kubenswrapper[4886]: I0314 10:02:18.136537 4886 scope.go:117] "RemoveContainer" containerID="bd1897d28a986de0d55211b1580f9ba43ef6ac6b1a0beaab2ff67599ef1bb0e6"
Mar 14 10:02:18 crc kubenswrapper[4886]: I0314 10:02:18.142870 4886 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6rhcl"]
Mar 14 10:02:18 crc kubenswrapper[4886]: I0314 10:02:18.155902 4886 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6rhcl"]
Mar 14 10:02:18 crc kubenswrapper[4886]: I0314 10:02:18.161324 4886 scope.go:117] "RemoveContainer" containerID="ba42b717c681c11ea9a44eac9f36d1a688c8ba1bc6013bcc899e775eb2754bda"
Mar 14 10:02:19 crc kubenswrapper[4886]: I0314 10:02:19.432367 4886 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ced766d8-a50f-4c02-b914-3ce40d39fab6" path="/var/lib/kubelet/pods/ced766d8-a50f-4c02-b914-3ce40d39fab6/volumes"
Mar 14 10:02:26 crc kubenswrapper[4886]: I0314 10:02:26.066164 4886 patch_prober.go:28] interesting pod/machine-config-daemon-ddctv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 14 10:02:26 crc kubenswrapper[4886]: I0314 10:02:26.067988 4886 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ddctv" podUID="64517238-bfef-43e1-b543-1eea5b7f9c79" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"